Making this here because Yas Forums is a shitfest.
In all seriousness, are these specs wildly different? As in, will there be any noticeable difference beyond the numbers, in actual gameplay?
>are these specs wildly different?
no. ps5 has better ram and ssd though
Specs don't matter at this point. We have massive diminishing returns in graphics already. Developers should focus on the quality of animation and gameplay, but they won't
No. Buy a pc.
Yeah, if the PS5 is not SMT and has 8c/8t, then the XBX is essentially a PS5 Pro at launch. The PS5's 10 TFLOPs are measured at max clocks, which will gladly drop during gameplay.
SSDs - don't mind them; the cheapest pci-e SSDs are already fast enough to make the CPU choke.
I'm genuinely surprised at these specs and for sure did not expect that level of perf in a $500 console.
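For reference, the TFLOPS figure is just shader count × 2 FLOPs per cycle (FMA) × clock. A quick back-of-the-envelope sketch with the announced CU counts and clocks (my reading of the reveals, so treat the exact numbers as approximate):

```python
# Theoretical FP32 throughput: shaders * 2 FLOPs/cycle (FMA) * clock (GHz).
# CU counts and clocks are the announced figures as I understand them.
def tflops(compute_units, clock_ghz, shaders_per_cu=64):
    return compute_units * shaders_per_cu * 2 * clock_ghz / 1000

print(f"XSX (fixed clock): {tflops(52, 1.825):.2f} TFLOPS")   # ~12.15
print(f"PS5 (max boost):   {tflops(36, 2.23):.2f} TFLOPS")    # ~10.28
print(f"PS5 (if it drops to 2.0 GHz): {tflops(36, 2.0):.2f}") # ~9.22
```

So the gap is ~18% at max boost and only grows if the PS5's variable clock dips under load.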
only thing that'll matter is launch price desu
microsoft can operate at a loss and still come out on top
this
Xbox has more graphical power and the PS5 has a much faster SSD, almost twice as fast.
The question comes down to which is more important for next gen. I don't know for sure, but I will say the massive SSD advantage will be more important than the slightly better graphics.
12 TFLOPs is literally a 2080 Ti or an OC'd 2080.
prove it without meme tflops
PC builders have been btfo and are scrambling. These $500 machines are going to give 2080ti/vega64 level performance while they have to spend AT LEAST a grand. It's over for PC.
even if that comparison is real, I doubt that the processing power will be used towards improving framerate
en.wikipedia.org
techpowerup.com
Of course, this is the absolute maximum performance, which these GPUs almost never reach. XBX will be optimised to the last instruction.
The XBX is literally an undervolted 3700 with an undervolted and slightly underclocked 2080 Ti. Look at the RAM bandwidth and bus width.
They already confirmed that 4k60 is the standard, at least for xsx.
at launch, consoles always have best price/perf
but they get BTFO just after a year
Same GPU arch, fewer compute units. If anything, worst case scenario is a PS5 exclusive being ported to XBX - just allocate more shaders and the XBX will compute more data faster.
we'll have to see if they actually deliver
>These $500 machines are going to give 2080ti/vega64 level performance
they said that like six years ago and the result was high-detail games that struggle to maintain 30 FPS
It took years for the GTX 1660 to come out and just match, not beat, the 780 Ti in benchmarks. There is nothing on the horizon on PC in the next 2-3 years, until the "Pro" consoles come out, that can match a 2080 Ti.
imagine being old enough to post on Yas Forums and still playing video games
The "variable frequency" part is very sketchy, in actuality the PS5 could be way slower than it seems.
I don't think the TWICE AS FAST SSD will matter at all. Let's say it takes 5 seconds for the Xbox to load all the assets in a level: assuming perfect scaling, that would be 2.5 seconds on the PS5, almost no difference at all. Up until now devs had to deal with 5400 RPM HDDs and had to implement ways to hide 3-minute-long loading screens; a 2.5 second difference won't matter at all.
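Back-of-the-envelope with the announced raw throughputs (2.4 GB/s XSX, 5.5 GB/s PS5; the level size is made up for illustration):

```python
# Naive load-time estimate: size / throughput. Real loads are also
# bottlenecked by decompression and CPU-side setup, so this is a best case.
level_assets_gb = 12                                            # hypothetical level size
drives = {"XSX SSD": 2.4, "PS5 SSD": 5.5, "5400 RPM HDD": 0.1}  # GB/s

for name, gbps in drives.items():
    print(f"{name}: {level_assets_gb / gbps:.1f} s")
# XSX SSD: 5.0 s, PS5 SSD: 2.2 s, HDD: 120.0 s
```

Both SSDs murder the HDD; the difference between the two of them is seconds at most.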
>if the PS5 is not SMT
They confirmed it has SMT.
No, PS4 and Xbox One were garbage from day 1
That 3.8ghz though. It's not even worthy to be memed by that 3.9ghz bruh memer
/thread
>he thinks posting on Yas Forums is a sign of maturity
>TWICE AS FAST
The PS5, being based on a FreeBSD distro, will easily support filesystems with LZO compression. So you get on average 40% reduced filesize at no noticeable performance loss. Meanwhile Xbox's Windows is closed source and can only use Microsoft's shitty proprietary zip that works like garbage in Windows.
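That 40% figure is easy to sanity-check yourself. There's no LZO binding in the Python stdlib, so here's zlib at a fast level as a rough stand-in (the exact ratio obviously depends on the asset):

```python
import zlib

# Measure the size reduction of a file under a fast LZ-family codec.
# zlib level 1 is a stand-in here; real LZO would need a third-party lib.
def size_reduction(path):
    data = open(path, "rb").read()
    return 1 - len(zlib.compress(data, 1)) / len(data)

print(f"reduction: {size_reduction('some_asset.bin'):.0%}")  # hypothetical file
```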
> are these specs wildly different?
No. ~15% difference. Xbox has a better GPU but a worse RAM setup, similar to the ESRAM fuckery, plus the PS5 features some NVMe RAM cache of sorts.
BTW Sony haven't announced a "Pro" version, unlike MS.
> The PS5, being based on a FreeBSD distro, will easily support filesystems with LZO compression.
NTFS has had compression since forever, and it worked on a PIII.
>Microsoft's shitty proprietary zip that works like garbage
>for sure did not expect that level of perf in a $500 console.
I did. Selling consoles at a loss, or at very best a minimal profit, has been a thing for years. It's in games and services that they make their money, because the console might as well be a doorstop without them.
>the cheapest pci-e SSDs are already fast enough to make the CPU choke.
Ryzen doesn't have mitigations that wreck I/O performance
>all of this garbage 4k and 8k talk
Will the ps5 finally be able to do 144 hz? Resolution after 1080 (arguably 1440) means jack shit.
Intel doesn't either - everyone turns them off. Show me a single game that utilises more than a burst of a few hundred megs a second. The CPU needs to decompress and process that data; it's not just CSV shit.
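And that's trivial to measure: time single-core decompression and compare it to the SSD line rate. A rough sketch (zlib as a stand-in codec, synthetic data, so ballpark numbers only):

```python
import time, zlib

# Benchmark single-core zlib inflate throughput vs. the consoles' raw
# SSD read speeds (2.4-5.5 GB/s). Synthetic data compresses unrealistically
# well, so treat the result as a ballpark.
raw = (b"some repetitive game asset data " * 64) * 4096  # ~8 MB
blob = zlib.compress(raw, 6)

t0 = time.perf_counter()
for _ in range(20):
    zlib.decompress(blob)
elapsed = time.perf_counter() - t0

print(f"one core inflates ~{len(raw) * 20 / elapsed / 1e9:.2f} GB/s")
```

If one core inflates slower than the drive reads, the CPU (or the dedicated decompression blocks both consoles claim to have) is the real bottleneck.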
I've seen rumours that Gran Turismo 7 is aiming for 120fps
No, 120 and 144Hz are niche refresh rates. The hardware is there for 1080p 144Hz, easy, but games are usually locked at 60.
What are these Zen CPUs? Are they x86 compatibles? Or something new?
Will these systems be backwards compatible?
They're amd64, sadly backwards-compatible with x86.
microsoft said they're doing it so there's a good chance sony will as well
both will offer freesync capabilities as a selling point as well
I can clearly see a difference between 1080p and 4k. 8k is indistinguishable from 4k, however.
And no, they're capable of 120fps, but the standard will be 60fps.
x86_64 with custom decompression blocks and probably a weird north/south-bridge setup (see ps4)
might grab a console (ps) if I feel like playing some exclusives that will literally never be ported to pc, or just shit that works better on consoles I guess
and depending on what they cost obviously
Also timers. The PS4 was essentially not a "PC".
Fantastic short talk on the subject
youtu.be
I have two freesync monitors and an nvidia GPU, how badly did I fuck up?
>giving a shit about any of this when the xbox will have no exclusives worth playing
Do you people never learn?
>hurrrrr I can clearly sea a deefrenese
Shut up, nerd. Enjoy your $1000 gpu and garbage frame rate.
>t. 1440p 144hz CHAD
all I want is a 2200MHz RDNA2 dGPU
>I don't have a machine capable of 4k so I cope by saying there's no difference
The post.
>Chad
Confirmed incel.
>scrambling
No, fucking relieved.
Hopefully this will shove a stick up the market's ass, fucking price gouging little shits.
Yeah, I can see the difference too, but frame rate matters more to me. I just haven't seen any confirmation on framerate specifically for the ps5, which is why I asked. Really wish even 120 was confirmed.
Didn't read a single word but keep crying as I'm certain you are
THE 10 TERAFLOP BARRIER TO ENTHUSIAST GAMING HAS FINALLY BEEN BROKEN
WE CRUSHED THE COMPETITION
Here's a 1080p screenshot of fallout new vegas.
Look at the details on the tree.
Do you like how finely rendered the tree is?
Pretty good huh?
It's downsampled from 5k.
That's a nice cope sweaty. Have fun with your garbage can of a game. Show me a real good game like rdr 2 at 5k. I'll wait. Show me fps too.
>good game like rdr 2
Yikes
Good point though it’s still more graphically intensive than a Fallout 3 mod
>Yas Forums is a shitfest so I will post my shitty console war thread here instead
kys, go back
This is a laptop GTX 1650 running that scene at 70-80FPS, mainly because the engine is trash. It was a visual test and you passed lol.
RDR 2 will gladly run at 5k, probably at a locked 30FPS on a 2080 Ti.
>using a superior 360hz 1080p monitor but setting renderscale to 4k+ is the true redpill?
native 4k displays btfo'd?
why did sony waste resources on making an ssd controller?
just stick on a generic pcie4 ssd and call it a day.
put money into the gpu instead.
blunder of the decade.
For comparison, a native 1080p screenshot. If the choice is between super nintendo at 60fps and a 2020 gaming experience, it's obvious.
This
+ games are optimized for consoles and run better than they would on PC hardware of similar spec
PC will be for kids with a lot of time or for those playing mmos or online shooters all day
all modern games have supersampling settings right?
haven't gamed anything recent in like the past 10 years
supersampling old games would demand some mods probably? or do nvidia/radeon settings have shit like that built in nowadays?
t. ancient gaming boomer
still looks pixelated
Rdr 2 is the greatest game ever made. Seethe, weeb.
I literally didn't open either image so how did I pass a visual test?
Supersampling is just one concrete algorithm. It's dog slow and smudges the image. What I'm using is GeDoSaTo with Lanczos scaling. Look in the /vg/ archives for /dsg/, the downsampling general.
>how did I pass
You suppressed the truth the ego knows, that higher res is better. Probably linked to childhood sexual trauma.
This is already taken into account in the specs they announced though, isn't it?
Radeon user here, it's built in: amd.com
Sure, but it's essentially a maxed-out 1080p. That is the best image a 1920x1080 monitor can display. Downsampling quality scales with the pixel count, which grows quadratically with resolution. 5k is 9 times the pixels of 1080p. 9 times 1440p is 18 times 1080p, since 1440p is roughly twice 1080p. You get the point.
GeDoSaTo is more configurable than VSR/DSR from the drivers.
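If you just want to try Lanczos downsampling on a screenshot without GeDoSaTo, Pillow does it in a few lines (filenames hypothetical):

```python
from PIL import Image  # pip install Pillow

# Downscale a high-res render to native 1080p with a Lanczos filter --
# the same idea GeDoSaTo applies in-engine.
shot = Image.open("render_5760x3240.png")  # 3x3 the pixels of 1080p
shot.resize((1920, 1080), Image.LANCZOS).save("downsampled_1080p.png")
```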
this is some serious projection user. Whomst hurt you? Show me where so I can put my dick there too.
No shit the higher res is better, but it's not worth the fps loss. You aren't getting 100+ fps at 4k in any decent new game like rdr 2. Especially not for a decent price. High end gpus are a meme for retards like you.
Now keep replying; from here on out the only response you shall receive from me is to stay btfo.
>That is the best image a 1920x1080 monitor can display.
but it's not - a supersampled, downscaled image looks even better at the same res lol
ok ty just wondering havent played any games in fucking ages
>I don't think the TWICE AS FAST SSD will matter at all.
This _might_ make a difference for games which rely on streaming content, and it _might_ make a difference when a game needs to go above 16 gigs of memory (or whatever is available to them), as faster storage means faster virtual-memory access/throughput.
In essence - what would be seamless on the PS5 might end up choppy on the XSex.
For some games the FPS isn't worth it. You don't take advantage of it while riding a god damn horse, and for firefights there is already auto-aim and slow motion.
>looks better
It doesn't. Once you pass 4 times your resolution, you get diminishing returns. It's simply the way Lanczos sampling works. SSAA is just a shitty downsample wannabe.
They are roughly equivalent to each other and to a mid-tier PC. So they are better than 90% of people's computers if legacy hardware is taken into account.
AMD is going to make a killing on those. I hope they release an APU with those specs.
mid-tier PCs (1660s and 570s) have half the teraflops and 6c6t or 6c12t CPUs at best.
>It doesn't
sorry I thought he was quoting the 1080p pic nvm im retarded
3700x + 2080s equivalent apu would literally eat the whole dedicated gpu market. literally no need to buy anything more for the majority of gaymers
Gaming-PC-wise, that is low end. Mid is 2070/2080 and an 8/16 Ryzen or a high-clocked 6/12 Intel.
>3700x + 2080s equivalent apu would literally eat the whole dedicated gpu market
3080ti at $699 confirmed
No fucking way. If you stack together all current and last gen GPUs and pick the median, that's mid-tier. Which happens to be 1660 and 570 going from rx 550 to the 5700xt.
still wouldn't be worth it
1 thread, 1 CPU will always be the ideal scenario
more than 1 thread is "clogging" potential
>last gen
Last gen is irrelevant. Current, non-legacy hardware is what matters - because that is what those consoles are to be compared to. Current mid-tier is 5700/5600, 2070 and so on.
A midrange PC is what you can buy with ~$800. A 2080 alone is 80% of that budget...
Stay btfo
Leave that one alone he has an ego complex and should just be ignored.
You can still buy an assortment of GPUs and CPUs on the market. If it's on shelves, it's not irrelevant. Not to mention that's unrealistic, as the whole world recommended OC'd 570s until a month or two ago. Consoles have great hardware and that's their advantage.
So mathematically speaking, budget is rx 550, gt 1030, and Vega 11 (the *400G APUs). Low-tier is 560, 1050 (and undervolted Ti); the 1650 non-Super is pushing it. An OC'd 570 or 1660 can already blast through most games on the market at 1080p 60fps. High tier is your RTXs and the new RX cards.
Memes like Quadros, TITANs, Vegas, and Radeon VIIs are not for gaming.
4th gen i7 and 970 here, still maxing out everything at 1080/60.
>imagine... post[ing] on Yas Forums
Fixed
Oh, and CPU-wise a 3400G plays everything, period. The 1600AF is for futureproofing. Everything above that is just prestige.
More like $500-1500, and by December (release of the new consoles) the 2080 is either going to be replaced with a newer model or a hard price cut is going to be involved as Ampere and big Navi are introduced.
Besides the point - the 5700XT already has the fabled ~10 TFLOPS and is mundanely mid-range. Likewise, midrange CPUs (3600), while having fewer cores, will boost to higher clocks. An $800 build with specs similar to those consoles is fully possible today.
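Roughly pricing it out (these are my guesses at early-2020 street prices, not quotes):

```python
# Hypothetical console-class build; all prices assumed, USD.
parts = {
    "Ryzen 5 3600": 170,
    "RX 5700 XT": 380,
    "16 GB DDR4-3200": 70,
    "1 TB NVMe SSD": 100,
    "B450 board + PSU + case": 160,
}
print(sum(parts.values()), "USD")  # ~880
```

Same ballpark as the consoles, give or take the usual sales.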
>1080p
>maxed out
Get with the times, boomer.
No you're not, 3.5.
Though I do have an i7 4790k with my 5700 xt, at least until Zen jumps again. With how mediocre and average the new consoles are, I could really just not upgrade.
Probably more, as the Nvidia 2080 Ti is an "all-round" card, so getting performance out of it depends on code optimization, DirectX efficiencies and whatnot. The trouble is that in the PC world a game engine has to work on tons of cards, and the GPU API has to scale across them all. And think about the "politics" in it: The Witcher 3 had 64x hair tessellation, which did little to nothing for the overall graphical result but was a bigger burden for AMD's cards. So even though Nvidia's cards could have delivered more FPS without it, they shipped it with lower FPS so that the margin between Nvidia's offerings and AMD's was bigger. All this leads to the fact that TFLOPS is rarely a good measure.
On a console, however, they only have one(!) hardware setup each. So in order to get the best press, they don't worry about anything other than getting the most performance from that system. So if the Radeon chip can do 12 TFLOPS, they'll be working on how to extract all of it. It might take some years to get there, and not all games will, but big AAA titles will probably aim for it. I have no(!) idea what the difference is, but I guess you could wildly estimate that with game engines highly optimized for a single console architecture, you could get some 10-20% more performance out of the TFLOPS. But this is all very much out of my reach; you'd need a couple of years of experience in this to judge it.
The 5700XT is the top AMD card right now, so hardly mid-range. When you've saved enough money for a PS5-equivalent build in a few years, the PS5 Pro will come out. Lol.
>1tb nvme
>8cores zen2
>2080s + rtx features equivalent
>$800 now
no
I actually think the SSD will matter a lot. They sure are focusing a shitton on it. And if it was just for storage, why are both internal and external storage options so readily marketed?
I'm expecting the SSD to be a major part of the system memory, as a cache. With the speed gen4 PCIe SSDs potentially offer, you could call it L5 cache. And I don't believe for a second they'll get slow, last-gen types. They can buy in bulk in ways nobody else can when it comes to SSDs, so they'll probably have a great price.
I agree, AMD and Nvidia are price gouging with their overpriced cards right now. The 5700XT should have been a $250 card just like the RX 480 8GB in 2016.
The best SSDs on the planet work in tens of microseconds. The best DDR4 RAM on the planet works at ~20 nanoseconds. No, you cannot and will not use an SSD as RAM. GDDR6 memory as RAM is a bullshit idea centred around cost savings as it is.
Even 5 terabytes per second RAM will be useless if your CPU has a dozen megs of L3. Access latency is what matters.
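To put numbers on that, convert each access latency into CPU cycles at a 3.5 GHz clock (latencies are the usual published ballparks, not measurements):

```python
# Typical ballpark access latencies expressed in 3.5 GHz CPU cycles.
clock_hz = 3.5e9
latencies = {
    "L1 cache":  1e-9,    # ~1 ns
    "L3 cache":  10e-9,   # ~10 ns
    "DDR4 DRAM": 60e-9,   # ~60 ns
    "NVMe SSD":  50e-6,   # ~50 us, best case
}
for tier, seconds in latencies.items():
    print(f"{tier:10s} ~{seconds * clock_hz:>9,.0f} cycles")
# The SSD sits roughly three orders of magnitude further away than DRAM,
# which is why calling it "L5 cache" doesn't work.
```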
I see the value at the desktop, but why should I care when I'm sitting 10ft away on the couch with a 65" tv?
there isn't much room for graphical improvement anymore and RT will change a lot, framerate truly matters now
I remain optimistic that consoles will look similar to PC games with almost no differences
TFLOPS is nearly meaningless for comparing graphics performance, GCN cards always had great TFLOPS figures but them being good GPGPUs didn't translate to being good GPUs
Until the consoles are available for testing by independent third-parties, we cannot tell. Going by those spec sheets alone, the Xbox Sex should be somewhere between 10-20% faster in games. However, going by spec sheets alone is as foolish as "car X has 8 wheels, car Y has 4 wheels, therefore..."
>mid-tier PCs (1660s and 570s)
you what, anything that isn't RTX, 57xx or their Pascal equivalent automatically falls in the low-end category, sorry.
so is keeping RT stuff inside CU a good thing or not?
>1080ti I bought 18 months ago is now low-end
Fuck
PC gaming in a nutshell.
A 2080 and an 8c CPU is definitely high end.
Unless you are trolling that you believe the other guy: 1080 Ti is still a very powerful card with high-end performance.
is it the same tensor hardware as nvidia's or not?
Not as RAM, no, but as a dedicated cache, sure. If you need to swap gigabytes of data in and out of RAM - say in an MS flight game where the open world is petabytes - it sure as hell would be handy to have 256 gb of super-high-speed cache instead of depending on the network.
>their Pascal equivalent
>the cheapest pci-e SSDs are already fast enough to make the CPU choke
Maybe if you have an Intel CPU.
The Xbox will have a feature where you can suspend a game's RAM to the SSD and resume another game the same way. That will greatly benefit from a fast SSD.
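The arithmetic on that checks out as a streaming job rather than random access (the RAM snapshot size and write speed are my assumptions; 2.4 GB/s is the announced raw read figure):

```python
# Quick-resume estimate: dump the game's RAM image to SSD, read it back.
game_ram_gb = 13.5     # assumed game-visible share of the 16 GB
write_gbps = 2.0       # assumed: writes somewhat slower than reads
read_gbps = 2.4        # announced raw read throughput

print(f"suspend: ~{game_ram_gb / write_gbps:.1f} s")  # ~6.8 s
print(f"resume:  ~{game_ram_gb / read_gbps:.1f} s")   # ~5.6 s
```

A handful of seconds either way, which is exactly the kind of workload where sequential throughput, not latency, is what you're paying for.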