Here's your PS5 bro.
A. FUCKING. VCR
Other urls found in this thread:
youtube.com
cpu-monkey.com
cpu-monkey.com
gamersnexus.net
techinsights.com
eurogamer.net
reddit.com
igorslab.de
community.amd.com
twitter.com
How fucked is Sony right now?
that reads like a retard's first venture on Yas Forums who thinks he knows what he's talking about
>Jeff Rickel
......who?
PROTIP: the xboner x will also have severe overheating and throttling problems. It's been confirmed it uses a 315W PSU which only gives them about 190W of usable power for the CPU, GPU, SSD, motherboard, and GDDR6 vRAM.
Isn't a console supposed to occupy the exact same physical space and location as a VCR did 20 years ago?
>proprietary "gaming" consoles
>they are literally black boxes
you cant make this shit up
Less volts = better temps.
The cooling solution in their render doesn't have a high speed blower fan required for 300W of cooling so the entire thing will constantly overheat and thermal throttle.
BUT because they're not using a 600W PSU they don't have 300W+ of usable electricity so they'll downclock significantly instead.
The TDP of a 5700XT which only has 40 CU with a base of 1605MHz and 8GB of GDDR6 is 225W which is why a PSU of at least 550W is recommended for it to make space for up to a 95W TDP CPU.
All specs point to the same 7nm DUV process being used for APU which is the exact one used by the 5700XT.
Go back to Yas Forums
> you penis.
how am I supposed to know about this vcr thing meme? fucking redditors, I swear
Right?!?
That is flagrant intellectual property theft from Apple.
>It's been confirmed it uses a 315W PSU which only gives them about 190W of usable power
What the fuck is this even supposed to mean? It's either 315W or 190W. There's no such thing as a "315W" power supply which is only capable of 190W output, that's literally just a 190W power supply.
It's going to be the same story as last year. Pre-launch hype is going to get millions of morons to buy them on day 1, actual tests weeks later are going to show things like pic related.
The best part is they're LITERALLY telling you it's going to suck on the spec sheet but no one is actually going to read it past "peak TFLOPS".
what's your prediction for the performance of both consoles?
what parts will a PC need to have equivalent performance?
>a fucking VCR
Based tho
>BUT because they're not using a 600W PSU they don't have 300W+ of usable electricity so they'll downclock significantly instead
What did this Yas Forumstard mean?
A leading expert:
>Jeff Rickel - Database Administrator - Autism Speaks | LinkedIn
the spec sheet is pretty good, do you not keep up with hardware
Components like GPUs and CPUs don't actually have fixed power consumption, they have power spikes which typically go twice their rated TDP. A PSU must be able to handle these power spikes otherwise it will shut off, as going past their rated wattage can blow capacitors, thus killing the PSU. The most you can use from a PSU is actually 60% of its rated capacity to account for these power spikes.
So with only 190W of usable power at 60% load on a 315W PSU, even if we were to assume the CPU somehow magically only used 45W like the 3.2GHz base 8-core laptop zen 2 chips, that leaves less than 150W for the GDDR6, motherboard, SSD, and the GPU itself.
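The arithmetic behind that claim, taken at face value: both the 60% derating rule and the 45W CPU figure are this poster's assumptions (and are disputed further down the thread), not anything Sony or Microsoft have published.

```python
# Sketch of the poster's own power-budget math. The 0.60 derating factor
# and the 45 W CPU figure are the poster's assumptions, not official specs.
def usable_budget(psu_watts, derate=0.60):
    """Usable sustained power if you only trust `derate` of the PSU rating."""
    return psu_watts * derate

def gpu_budget(psu_watts, cpu_watts, derate=0.60):
    """What's left for GPU, GDDR6, motherboard and SSD after the CPU."""
    return usable_budget(psu_watts, derate) - cpu_watts

print(usable_budget(315))   # 189.0 -> the "about 190W" figure
print(gpu_budget(315, 45))  # 144.0 -> the "less than 150W" figure
```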
>The most you can use from a PSU is actually 60% of its rated capacity to account for these power spikes
Wrong. PSUs are rated for their sustained power output. A 600W PSU is rated to maintain 600W of power to components, i.e. due to conversion losses it will actually draw >600W from the wall. At the same time, the peak power it can push is easily 800-900W for brief periods.
Video games are so boring now.
>the peak power it can push is easily 800-900W for brief periods.
*That's only assuming that that amount of power doesn't trip the OEM's configured OCP.
A bit more copper fins and pipes and all heat issues go away.
Abysmal. It's using a cut down renoir laptop zen 2 CPU that doesn't turbo to 4.2 to 4.5GHz like the desktop variants. 1.7GHz GDDR6 is being used to run infinity fabric with very high latency.
With 145W left after the CPU and a good chunk of that being used for everything else I'm not sure if a 52CU GPU with 16GB of GDDR6 would even be able to run at 1GHz.
Outside of really well optimized titles I don't see this console being able to run games any faster than a PC with a 2600 + Rx 5700.
Looks good to me but I'm no zoomer so what do I know about aesthetics.
it's an interesting thought
They're not, once you go past its rated wattage the PSU will automatically shut down. If it's a chink one that doesn't have this safety feature then the capacitors will pop and cables will melt if not start fires.
Go buy an evga 400W PSU and fit it on a rig with a 295W TDP vega 64 and 95W 3800X and run both on full blast. What could possibly go wrong?
No
>A 600W psu is rated to maintain 600W of power to components
I doubt the cheap PSUs they have to put into consoles have such capacity
>Go buy an evga 400W PSU and fit it on a rig with a 295W TDP vega 64 and 95W 3800X and run both on full blast. What could possibly go wrong?
That's a really, really bad example. Those 295W V64 and 105W 3800X TDP values DO NOT include power losses from VRMs. Their actual power draw is easily >1.1x of rated TDP.
>They're not, once you go past it's rated wattage the PSU will automatically shut down
Once again, wrong. Pic rel says corsair, but as far as I'm aware, all reputable brands rate PSUs based on continuous power. Furthermore, modern PSUs have numerous protection features unless you buy a dollar store PSU. There's overpower protection, overcurrent protection (usually per 12V rail) and there's overheating protection. They don't just explode when you exceed their rating lmao
I would buy that. VCRs are sublime.
You can buy that, it's some panasonic vcr
It's literally what happened to the one X and why performance drops to under 20 FPS as shown in
It has a 245W PSU, yet IRL it only pulls 175W from the wall, and only 80% of that gets converted to usable DC on the 12V rail, or 140 watts, which is close to the 60% load max on a PSU.
What corsair was testing was a FIXED 600W load on their PSU with no 1,200W peaks, which is what will happen if you put 600W of combined TDP in a PC. That HQ 600W PSU still can't be used for a 295W TDP vega 64 and one of intel's 300W TDP xeons.
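For what it's worth, the One X numbers in that post reduce to one multiplication. Both inputs (175W at the wall, ~80% conversion efficiency) are the poster's claims, not Microsoft's published figures:

```python
# The poster's One X figures: 175 W measured at the wall, an assumed
# ~80% AC-to-DC conversion efficiency. Neither number is official.
def dc_output(wall_watts, efficiency=0.80):
    """DC power actually delivered to components after conversion losses."""
    return wall_watts * efficiency

print(dc_output(175))  # 140.0 -> the "140 watts" on the 12V rail
print(245 * 0.60)      # 147.0 -> "close to the 60% load max" of the 245 W PSU
```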
even tho i dont give a fuck about consoles its still concerning how fucking stupid you all are. dont you guys notice how youre all doing free marketing for both consoles? it doesnt fucking matter what renders or supposed hardware specs they publish, what actually matters is the product thats being released at the end. you fucking redditors are spending hours on discussions about products that you cant even buy yet. sony and microsoft are playing all of you like puppets and you dont even notice. wait until they release the consoles for fucks sake, then go to Yas Forums and discuss that shit over there
pubg runs like trash regardless, not an incredible example
Let's hope that they don't fuck up the only good thing about the PlayStation - the controller
Why they dont use the more efficient Nvidia gpu's?
because nobody likes working with nVidia
nvidia GPUs are not more efficient than RDNA2
because nvidia doesn't make x86 apus retard
if they did, they won't be able to sell them at affordable price
youtube.com
It runs just fine on a dual-core processor and an Rx 570, both of which are vastly inferior to the 8-core processor and RX 580 level GPU in the one X.
What actually happens irl: Less than 100W actually go to the GPU causing the core frequency to tank way below 1GHz possibly down to 600-700MHz and thus you can't even get 20 FPS anymore. At best the Xbox one X is on par with a GTX 1050 PC and I'm being very generous.
I'm surprised at the performance honestly, pubg doesn't run well in my own experience with different hardware, though my ram is worse
That's nothing compared to a console using vRAM as system RAM. It SOUNDS like a good idea at first, the thing is bitching fast in transfer speeds. Until you look at random 4KB QD4 speeds and latency.
What RAM are you using? Have you tried tightening the timings? PLEASE don't say you're using only a single stick...
right now I'm using 2 8gb ddr3 sticks at like 1060mhz, don't know that I can get much faster with my CPU, using a xeon x5675 and r9 390 gpu
haven't tried pubg since putting the new cpu in, previously had i7 930 @ 3.7ghz and could only hit like 40-75 fps in pubg
>implying ps4 and xbox one don't already look like VCRs
Gddr6 would run infinity fabric at a very low latency because the bus scales with memory throughput
Lol yeah that RAM is really doing a number on pc gaming. It's probably better if you jump to a 1600 AF + Rx 5500XT system with at least 2666MHz RAM. Intel's first i-core series had IPC on par with FX processors from AMD so that xeon is operating closer to a 1600 locked at 2GHz.
It drops below 20 fps because PUBG was a single threaded game with 100 players in game you retard. It's hard capped by the slow jaguar based processors in the last gen consoles. They're literally 1/4 the speed of a zen 2 core at 3.6ghz. Less than half the frequency and about half the perf/hz. Jaguar/puma was designed for ultra mobile and tablets
yeah 1600AF looks really appealing cost/perf-wise, I bought this xeon like 2 years ago though
That's assuming that GDDR6 has the same random QD4 4KB throughput as actual system RAM. You ever thought about that m8, why instead of using "slow" RAM we don't just use GDDR5 modules? They cost about the same, faster is better, right?
see
The CPU is actually perfect for pubg since even the day 1 release had good performance gains going from a 2 logical-core CPU to a 4 logical-core one. Later this scaling even got bumped up to 8 logical cores. If the console had used high speed DDR3 RAM, a 500W PSU, and significantly more cooling it would be able to compete with $400 200GE + Rx 570 brand new PCs of today.
The /cake/-lite board telling a tech-related post to "get out"
You literally do not understand what you're talking about. The big number you read on the power supply box is the sustained load rating (at least for any reputable brand it is), not the rating for extremely short peaks or the maximum instantaneous power output (which is always higher than the sustained rating but never really specified for computer PSUs because it's largely irrelevant). Extremely short spikes will mostly be buffered by the output capacitors which will usually be sized appropriately in order to handle power spikes from computer components which have an average power draw equal to the unit's rated sustained output. You're acting like you've uncovered some big secret because you found out about power draw spikes, but everyone who actually designs any electronics is well aware of these things and they are taken into account in any good power supply design.
>vastly inferior to the 8-core processor
The CPU in the Xbone is absolute, bottom-tier garbage from 2013 that was initially designed for extremely low power mobile/tablet applications. It's extremely shit, which is why PUBG runs like complete crap on console, it was initially designed for PC CPUs which have vastly superior single-threaded performance and at the time PUBG released 8C CPUs were not popular at all. A single core in that G5400 probably performs as well as 3+ Jaguar cores at console-tier clocks too, Jaguar is just pure shit. No console game would run that poorly due to GPU constraints because easing up on GPU constraints is very easy in most cases: just render at a lower resolution and just like that you can maintain your 30FPS target. It's CPU constraints that are hard to deal with, because those can be caused by engine design limitations or game mechanics, so you can't just lower some number like resolution and increase frame rate.
Why would you go on the internet and lie about things you don't know shit about?
Westmere can easily do 2133MT/s+ with the right ram. Quick and dirty OC can do wonders.
>IPC on par with FX
Do you have a source for that?
cpu-monkey.com
>cinebench r15 mt 122
>4 core jaguar
cpu-monkey.com
>cinebench r15 mt 385
>2 core coffeelake
The dual core is literally over 3x faster
PUBG doesn't scale at all past 4 cores so it's plain to see why the 2 core coffeelake runs it better than the 8-core jaguar in the X1X
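Dividing the quoted R15 MT scores out makes the per-core gap even starker than the "over 3x" whole-chip figure:

```python
# Per-core throughput from the Cinebench R15 MT scores linked above:
# 122 for the 4-core Jaguar, 385 for the 2-core Coffee Lake.
def per_core(score, cores):
    return score / cores

jaguar = per_core(122, 4)   # 30.5 points per core
coffee = per_core(385, 2)   # 192.5 points per core
print(385 / 122)            # ~3.16x faster overall ("literally over 3x")
print(coffee / jaguar)      # ~6.3x faster per core
```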
AMD provides a package deal of CPU & GPU at a reasonable price & power output that no one can really touch right now. They also provide easy backwards compatibility. Sure Nvidia could make a moderately better GPU, but the best CPU they could offer would be some overclocked ARM units; the Ryzen chips going into these new consoles are as good as it gets at these power usage levels. Now, they could have paired a Ryzen CPU with Nvidia GPU, but that would have been more expensive.
Who cares, there weren't decent games since BB.
No shit, this is not my point
>They cost about the same, faster is better, right?
No, they don't
gamersnexus.net
>The next question is what GDDR5 costs. A recent DigiTimes report pegs GDDR5 at about $6.50 for an 8Gb module, though also shows pricing for August onward at $8.50 per module. With old pricing, that’s around $52 cost for an 8GB card, or $68 with new pricing
That's the raw cost of a module.
vs a 12GB, 12 gigaBYTE, package of LPDDR5-5500, which is literally so new only Samsung is manufacturing them right now and represents the cutting edge of mainstream DDR technology, and not only that it's higher end at 5500 MT/s while normal DDR5 is expected to start at 4800
>$44
techinsights.com
Not to mention GDDR5 doesn't scale down at all. You can buy less of it but there aren't low cost versions like with DDR4 going down to 2133. So it's expensive to scale by volume. And you would need a wide, power hungry memory controller to take full advantage of GDDR over its DDR equivalent in the first place, you're not going to use it on a 2-core zen APU. There's also no room for low power variations which are needed for mobile applications
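The module math in the DigiTimes quote above is straightforward (an 8Gb module is 1 gigabyte, so an 8GB card needs eight of them):

```python
# DigiTimes figures as quoted: GDDR5 at $6.50 per 8 Gb (= 1 GB) module,
# rising to $8.50 per module from August onward.
def card_memory_cost(capacity_gb, price_per_module, module_gb=1):
    """Raw module cost for a card carrying `capacity_gb` of GDDR5."""
    return (capacity_gb / module_gb) * price_per_module

print(card_memory_cost(8, 6.50))  # 52.0 -> "around $52 cost for an 8GB card"
print(card_memory_cost(8, 8.50))  # 68.0 -> "$68 with new pricing"
```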
>2080 super + 3700x for $400
Guess my computer is going to have to wait yet another year for its upgrade.
Literally who larping
>AMD hardware
>Throttling and overheating
Most iconic duo
You forgot the jet engine sound
>no source
kys OP
It's a special variant of the 1600 made on 12nm instead of 14nm which gave a lot of people stable 3.8-4.0 OCs and ability to use 3000/3200 RAM. Use this model number to keep an eye on it, I've seen it drop to $70 on newegg. YD1600BBAFBOX
>the spec sheet is pretty good
Yeah, it's as good as last year's hardware, and that's assuming their GPU will hold the 2.23GHz
>implying
This is dumb considering a switching power supply has peak efficiency at around 70% load
>Go buy an evga 400W PSU and fit it on a rig with a 295W TDP vega 64 and 95W 3800X and run both on full blast. What could possibly go wrong?
>CPU somehow magically only used 45W like the 3.2GHz base 8-core laptop zen 2 chips
Are you retarded? The 45W laptop chips have a built in GPU, a very powerhungry one.
it still looks better than the xbox series s
I'm actually hyped for the audio processing. Same with XSeX. If that means more devs put some effort to put good HRTF in games, I'm all for it. We need a rebirth of audio on PC.
[citation needed]
This is complete bullshit.
Panasonic a shit, Sony had peak vcr aesthetics
I have no idea how microsoft made a 300w machine the size of a kleenex box. I have no idea how sony will design theirs now.
>thinking you know more than the engineers at microsoft and sony
hahahahahahah im sorry youre so retarded but youre dead wrong, about all of it
wait how big is the series s? is there a size comparison if its tiny i might get one
Not him but my 930 scored about 87 on R15 single, I remember FX CPUs scoring ~100 on the same test. It still had higher IPC but looking back on it, it wasn't much, maybe 10% more. Shame AMD lied to people about the core count, it was an okay quad-core processor for its time, especially when it was typically only $150, which was half or less of what i7s cost at the time.
And yet, VHS was more popular and more widely used than Betamax.
my dick
IPC in layman's terms is score per mhz. The i7 930 boosted to 3.1ghz at most. The FX processors went up to like 5ghz
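Running "score per MHz" with the numbers from this exchange (87 points at a ~3.1GHz boost for the i7 930, ~100 points at up to ~5GHz for FX) is a one-liner:

```python
# Crude IPC proxy using the thread's own figures. Clocks and scores are
# the posters' rough recollections, not benchmark-database values.
def score_per_mhz(score, mhz):
    return score / mhz

nehalem = score_per_mhz(87, 3100)    # i7 930 at ~3.1 GHz boost
bulldozer = score_per_mhz(100, 5000) # FX at up to ~5 GHz
print(nehalem / bulldozer)  # ~1.40: Nehalem does ~40% more work per clock here
```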
>It's been confirmed it uses a 315W PSU
>The cooling solution in their render doesn't have a high speed blower fan required for 300W of cooling
That's not how system power works.
>The TDP of a 5700XT which only has 40 CU with a base of 1605MHz and 8GB of GDDR6 is 225W which is why a PSU of at least 550W is recommended for it to make space for up to a 95W TDP CPU.
Just stop already. TDP =/= system power.
The CPU portion of the console chips will likely be limited to 50W of power max, probably less considering how Renoir performance is looking at 35W. That leaves a minimum of, according to your confirmed 315W power supply, 265W to run the rest of the system, and consoles never use full spec desktop parts.
>The shape of the console matters
kek
Xbox will never get the japanese exclusives so it doesn't fucking matter in the slightest.
Nobody buys consoles because they are any good, the reason consoles exist is because they hold Bloodborne to ransom.
eurogamer.net
It has a nice chart/photos comparing it to current box
see
This isn't getting ANYWHERE near a 2080 super + 3700X.
Bloodborne only sold 2 million. People buy the playstation for sony movie games made in the US
I'll honestly be surprised if it can even hold 1GHz clocks. 52 CUs at 2.2GHz is like a 400W TDP, which requires a 1000W PSU. I don't see a 1000W PSU in any of the specs.
And how would Sony having a better looking VCR than Panasonic contradict that statement?
The TDP doesn't include the iGPU you mongrel, it's just a guarantee that if you run the CPU at base frequency on full blast it won't consume more than 45W. If you run the CPU and GPU at full blast they have to share that 45W which will cause the CPU to throttle down to 2GHz or less.
it literally doesn't matter what stats it has as long as it's okay, because ppl who buy them only care for the games that they support. others build pcs.
see
>others build pcs.
WITH better performance for the same money.
well, yes. you'll always pay premium for already assembled systems. normies don't care and if we're talking vidya neither do i. it's quite comfy if a little boring these days.
Who gives a flying fuck about console performance? I just want my chinese cartoon exclusives and a cozy "just werks" experience that keeps nonfree winshit far, far away from my actual computer.
most who build PCs dont give a shit
Sauce, OP?
i dont see Yas Forumstoddlers flipping their shit there
Quoting your own posts is not an argument.
>Shame AMD lied to people about the core count
Shame people still think FX processors with 8 cores aren't 8-core processors
lmao, no.
consoles are sold at a loss and retail PC parts consumers have always been overcharged
Present a legitimate argument against them.
Look, you all need to chill and stop hyping these pieces of shit up so much. The xbox one x only gets 10-20 FPS in pubg despite having the specs to get you 60-80 FPS like you do with a $400 Rx 570 PC. They're not meant to outperform computers, just do the bare minimum so poorfag retard timmy can play with his shitty online friends and do ebic trolling through voice chat.
LOOK AT THE SPECS. THE. SPECS. MEAN. JACK. SHIT.
>THE. SPECS. MEAN. JACK. SHIT.
the previous console generations have not even been close to high-end PC performance though, not even on paper
i dare you to build a PC with better performance for lower price when those things come out. i'd bet good money that you'd be unable to
>Look, you all need to chill and stop hyping these pieces of shit up so much.
Every post you quote in is either pointing out what a fucking retard you are for not understanding how power delivery works or is just talking about the case design.
At least read the posts you've deluded yourself into believing you've BTFO'd.
Here is what this console will most likely perform at taking all GPU power and CPU infinity fabric degradation into account.
5700XT specs: 40 CUs at 1.6GHz with 8GB of GDDR6 vRAM @ 225W (20W is used for the GDDR6 vRAM), ABOUT 5W/CU
52 of those CUs will (not might but will) consume about 300W at 1.8GHz + ~40 watts (maybe more) for the GDDR6 vRAM
8-core zen 2 renoir already consumes 45W at 3.3GHz, remember this is the highest binned renoir flagship from AMD. Cranking up that base to 3.6GHz puts this chip around 65W.
So you have a ~350W TDP GPU and a 65W TDP CPU which AT MINIMUM (literally) requires a 700W PSU.
They're throwing a 300W PSU on these consoles, what do you think is going to happen?
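That whole post reduces to a few multiplications. Every input is this poster's claim (5W per RDNA1 CU, linear scaling with clock, ~40W for GDDR6, a 65W CPU, and the 60% derating rule that other posters reject outright):

```python
# The poster's estimate chain, made explicit. All constants are the
# poster's disputed assumptions, not measured console figures.
def gpu_watts(cus, mhz, w_per_cu=5.0, ref_mhz=1600):
    """Scale the claimed 5 W/CU figure linearly with CU count and clock."""
    return cus * w_per_cu * (mhz / ref_mhz)

def min_psu(total_tdp, derate=0.60):
    """PSU size under the poster's '60% usable' rule."""
    return total_tdp / derate

g = gpu_watts(52, 1800)       # 292.5 -> the "about 300W" figure
print(g)
print(g + 40 + 65)            # 397.5 W total with GDDR6 and CPU added
print(min_psu(g + 40 + 65))   # ~662.5, which the poster rounds up to "700W"
```

Note that without the 60% derating, the same 397.5W total fits comfortably inside the 500W figure other posters suggest.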
This one won't either.
see If you think me and all of the engineers of AMD/Nvidia are 100% wrong then please go buy a vega 64, 3800X and 400W PSU and then run the CPU and GPU at full blast and tell us how everything was smooth fucking sailing after a few milliseconds.
>base of 1605MHz and 8GB of GDDR6 is 225W
No. Closer to around 100W.
>source: my ass
>If you think me and all of the engineers of AMD/Nvidia are 100% wrong
No thinking required, I know you don't understand what power limits are.
L5640 in daily rig, 4.1G does ~160 1T iirc. For context an FX8350 only boosts to 4.2 stock. Don't have my 8350 on hand but I'm sure if limited to the same core and mem clocks, nehalem/gulftown/westmere will perform a lot better. Afaik bulldozer had closer IPC to K10/Core 2.
Yes and no. 8 integer cores coupled to 4 fp cores and 4 front/back ends. Depends on how you define a core.
No you fucking don't. If you could actually run 400W of combined GPU+CPU TDP with a 400W PSU then nobody would even sell 700/1000W+ PSUs because they'd be a huge waste of capacitors, transformers, and inverters.
>which AT MINIMUM (literally) requires a 700W PSU.
But that's wrong you fucking retard.
Watch more PSU commercials from retarded YouTube, that makes you smart.
A 350W GPU and a 65W CPU require a 500W PSU, assuming the rest of the system is stressed, there are multiple drives, and you leave some safety margin.
>(not might but will) consume about 300W
TDP isn't actual consumption.
Vega 64 barely makes it past 300W on peak stress with a balls to the wall overclock. In real life Vega 64 is more of a 240W GPU.
Same goes for the CPUs AMD just released a mobile 8 core 4800H or something the console will come with that.
>They're throwing a 300W PSU on these consoles, what do you think is going to happen?
You're going to be embarrassed on the internet retarded zoomer.
>No thinking required
That should be sony and microsoft's new logo desu.
>5700XT specs: 40 CUs at 1.6GHz with 8GB of GDDR6 vRAM @ 225W (20W is used for the GDDR6 vRAM), ABOUT 5W/CU
>52 of those CUs
Not 52 RDNA1 CUs.
>8-core zen 2 renoir already consumes 45W at 3.3GHz, remember this is the highest binned renoir flagship from AMD. Cranking up that base to 3.6GHz puts this chip around 65W.
Total package including 11 hybrid vega CUs pulls 45W.
>So you have a ~350W TDP GPU and a 65W TDP CPU
And you still haven't figured out that POWER DRAW IS NOT TDP.
>A PSU must be able to handle these power spikes otherwise it will shut off as going past their rated wattage
No you retard it isn't. The PSU rating assumes such power behavior, BECAUSE THAT'S WHAT IT WAS MADE FOR.
A 600W PSU is meant to support a system that's spiking all over the place around the average value of 600W
>If you could actually run 400W of combined GPU+CPU TDP
You don't seem to understand that everybody is calling you a retard because you're making this retarded assumption that the APU TDP is going to be 400W with no evidence beyond LOOK AT WHAT OLDER DESKTOP PARTS THAT ARE EITHER COMPLETELY DIFFERENT SKUS OR A GENERATION BEHIND THE STUFF WE ARE TALKING ABOUT ARE RATED AT.
Renoir shows that a Zen2 chip can run the speeds they're touting at less than 50W of power draw. NOBODY but fucking AMD and Microshaft know what the power draw for RDNA2 based CUs is. You also are not taking into account the power savings in having a simplified IO section of the APU that doesn't have to support 32 PCIE lanes and an extra memory controller.
Best thing is to wait.
Remember the Xbox One was rushed and was a failure. Mandatory Kinect, no game sharing and terrible optimisation killed it from the start.
>8-core zen 2 renoir already consumes 45W at 3.3GHz, remember this is the highest binned renoir flagship from AMD. Cranking up that base to 3.6GHz puts this chip around 65W.
Not sure why anyone is arguing with this lying faggot
reddit.com
>3550mhz
>43W
And the monolithic console die will be 7nm EUV, 10-15% more efficient. We're talking about 37W and this retarded fag is talking about 65
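The "37W" is just the 43W Reddit measurement scaled by this poster's assumed node gain. The 10-15% EUV efficiency figure is their assumption, not an AMD spec:

```python
# Applying the poster's assumed 7nm EUV efficiency gain to the 43 W
# Renoir reading linked above. The 10-15% gain is the poster's guess.
def euv_power(duv_watts, gain):
    return duv_watts * (1 - gain)

print(euv_power(43, 0.15))  # 36.55 -> "about 37W"
print(euv_power(43, 0.10))  # 38.7 at the low end of the claimed gain
```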
"I know developers and they told me this" people are the worst.
Sony created a prototype to test the hardware and software while developers develop.
It's how I would do it too. Increase clocks to the higher limit and see how many people will report overheating problems.
Then they lower it down and test again.
But I'll be honest this gen of consoles looks like shit
PS5 is zen2 based, you posted zen3 architecture processor
No my peabrained friend. The 3700x is literally desktop zen 2. The PS5/XSX will have Zen 2+ (7nm EUV)
have you even considered that theyre using custom designs that are binned specifically for power saving? also 3.6 peak boost is going to be single core at best assumption, absolutely no reason to assume they need 45W for the cpu especially when you dont even know if this is on the OG 7nm node or if its 7nm+/refined node.
also they design the GPUs overkill with more CUs than production requirements for yields, then they disable the worst/highest power cores. the console chips are a lot more than a souped up 5700xt bolted to a 3700x
also you don't understand how power supplies work at all hahahaha
>. If you could actually run 400W of combined GPU+CPU TDP with a 400W PSU
You can and you should.
Just don't forget to account for the rest of the system, the chipset, memory, main SSD, and extra HDD.
That makes roughly 500W and a 500W PSU will work fine for it.
>then nobody would even sell 700/1000W+ PSUs because they'd be a huge waste
They are happy to sell you an overpriced brick marketed by a bullshit youtuber.
a 400W PSU is rated for a 400W load and it's supposed to handle a 400W load (and probably 800W split-second spikes).
If your entire system, including the torrent HDDs consumes 400W ON AVERAGE you're fine.
>PS5 is zen2 based, you posted zen3 architecture processor
zen 2 is ryzen 3000 and zen 3 isn't even out yet you braindead monkey
>they'd be a huge waste of capacitors, transformers, and inverters
They make better margins from "high-end" PSUs than they do from "crappy" ones. Those components are dirt cheap and the difference between the good and bad ones is a few cents. That's why the boutique/custom PC market exists at all despite being like 1/20th the sales of prebuilts or laptops. They make a ton of money per unit sold compared to what they sell in bulk to OEMs like Dell or HP
These are the words from igorslab, NOT MINE:
>"Power supply design and peak loads/currents"
>"As I already proved in detail in my basic article “The fight of graphics card against power supply – power consumption and load peaks demystified”, there are also short-term higher loads in the millisecond range, which can already lead to inexplicable shutdowns in unfavorably designed or not appropriately equipped power supplies. The TBP (Typical Board Power) measured by the graphics card manufacturer or the reviewers alone does not really help here for a stable design of the system."
>"Peaks with intervals between 1 and 10 ms can lead to shutdowns with very fast reacting protective circuits (OPP, OCP), especially with multi-rail power supplies, even though the average power consumption is still within the standard. For this card I would therefore calculate with almost 350 watts for the graphics card as such, in order to have sufficient reserves for the worst-case scenario."
Pic related is a gaming load NOT a torture test.
>"there are also short-term higher loads in the millisecond range, which can already lead to inexplicable shutdowns in unfavorably designed or not appropriately equipped power supplies."
I can't quote this enough.
>unfavorably designed
>not appropriately equipped
So if you buy garbage, it may not behave like a well-designed product and may not function as expected. Big surprise!
They literally said it's coming on holidays. Imagine believing in LARPers on youtube.
god damn this looks sick
that's not even how light works
moar = bettar mentality
>So if you buy a console, it will not behave like a well-designed PC of similar price and will not function as expected. Big surprise!*
ftfy
Let's be fucking real here, even if these consoles suddenly got upgraded with 700W PSUs and 360mm triple-fan AIOs, why would you pay $60 a year just to get the privilege to play online and on average $20-40 more per game? I have a steam library of over 100 games and I paid them all with cash and I don't think I've spent over $200 over the course of a decade. I recently got Doom eternal for $30, it's still like $60-80 for consoles. A $60-80 game that hasn't even been out for a month already dropped to $30 for PC users.
let that sink in.
The only evidence I can find of 5700XT shut downs points to people who already have high end PSUs over 600W, and it happens even playing games like WoW which barely use the GPU due to a CPU bottleneck.
community.amd.com
This guy sounds like he's full of shit
>MUH OVERCLOCKING YOUTUBER
Uh-huh, and JayzTwoCents is an authority on PCs
Consoledrones are just stupid, and let's be honest here, without consoles the stupid would infect the PC gaming world.
let that sink in.
>can already lead to
But does it or does it not? Did he actually crash the system or is it bullshit out of his ass?
I'd love to see an actual experiment on the 80+ certified PSUs like cheaptech or great wall.
>without consoles the stupid would infect the PC gaming world.
Not true. Consolefags who've shifted to PC Gaming become naturally cured from their peasantry and join the rest of the human race.
see I was in a forum thread recently where some jackass kept crying about how AMD is shit and how his computer kept shutting down with his brand new 5700XT. The thread went on for about an hour and eventually the little cocksucker admitted that he didn't even know what PSU he had (claimed some 700W at first) because he "wanted to save money" despite rocking a 9700K on a Z390 ATX. This little faggot had a chink 500W PSU that was known to shut off when you stressed it past 80% (400W) for more than a few minutes. The 5700XT already has 350W spikes, the 9700K is known to have 200W spikes even while gaming.
Anyway AFAIK consoles use cheap PSUs since most of the money goes to the CPU, GPU, and memory. While there are expensive PSUs with OPP that can handle going 10-20% above the rated wattage, they can't do so for very long and they'll still shut off after a while. Most PSUs advertise their OPP AS their rated wattage so exceeding this will cause an instant shut down. Those are the PSUs going into consoles.
we literally saw gears 5 running at 60 fps at 4k WITH RT ON
that's 2080 Super performance and much more if we remove RT
Read Random shutdowns are a common problem with the 5700XT, it's not power related. It's just garbage AMD hardware and software
Designing a PSU to deliver 300W and have a big enough capacitor at the output isn't such a hard, complicated and expensive task as you want to pretend it is. I'm sure consoles will end up disappointing in multiple ways, as they have constantly for a few generations now, but it won't be due to them not being able to design a fucking power supply for the components they chose to use.
Also DOOM Eternal was like 35 EUR on PC even before release, it didn't drop much, that's just what you could always find it at.
>this kills the itoddler
IMHO if you're planning to build a PC just assume that your PSU can only handle a 60% load at best (ie 300W max for a 500W PSU) and double the TDP of your GPU and CPU to account for those millisecond spikes of power. PSUs don't last very long if you run them at high loads, the higher the load the faster all the internal components start to degrade from all the heat (~20% of AC power will be converted directly into heat).
For example say you want to pair a 3700X (~150W power spikes) with a GTX 1660 (~200W power spikes), look for a 600W PSU. That way you know everything is safe and the PSU will probably last well beyond its 3-5 year warranty.
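That sizing rule, which is far more conservative than most other posters in this thread think necessary, works out like this (the 150W/200W spike figures are this poster's estimates):

```python
# One poster's conservative PSU sizing rule (disputed elsewhere in the
# thread): sum the doubled-TDP spike estimates, then keep that total
# under 60% of the PSU's rating.
def recommended_psu(cpu_spike_w, gpu_spike_w, derate=0.60):
    return (cpu_spike_w + gpu_spike_w) / derate

print(recommended_psu(150, 200))  # ~583.3 -> "look for a 600W PSU"
```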
I'm saying that it isn't, unless you go to someone's house and actually stress test their PSU for hours you don't know what they're actually able to handle. You can have a "700W" PSU fail on a 5700XT and a 3700X, hell chinks have probably infected the market with 800W PSU that will shut off after extended 60+% loads
>Here's your PS5 bro.
Reminds me of the show Silicon Valley
>It runs just fine on a dual-core processor
that's why it's shit.
it's un-optimized as fuck.
it makes you wonder why your shitty software runs like shit everywhere.
ppl ditched pubg because it has insufferable bugs and they jumped ship when an even shittier alternative came out. That will tell you how much of a great game this is and how much of a valid argument you make.
30 years ago when the conslows had calculator cpus and the gpu was just another calculator cpu doing math, it was the developer's fault for not making the game work great.
now faggots blame the hardware.
why hasn't this board merged with >>/l/g/bt/ ?
GOD BLESS THE MODS
>some literally who xbox shill
epic
>japanese exclusives
lmfao
as an electrical engineer i find these posts extremely hilarious. this is some fairy tale fantasy land nonsense.
Yas Forums native here, my xbox one x runs a lot of games with significant lag even in newer games like Assassin's Creed Odyssey, I sometimes even get 1-2 seconds where the game completely freezes up. I'm interested in playing doom but now realized I have to actually put in the effort to build my own pc to play it because consoles suck balls. What PC build would you recommend for a first time builder like me? I just want to play doom eternal without it lagging like a motherfucker on my xbox one x.
what happened to the remaining 125W?
amateur larp, easily spotted by any larping connoisseurs like myself
So what igor here is saying about the 5700XT using up to 350 watts of power is all bullshit? So I really can buy a 400W evga PSU and pair that up with a 2700X and everything will be okey-dokey? Because I'm seriously considering that right now desu senpai. I can get a 400W PSU at my local electronics store for $20 rn.
pc components =/= custom hardware components
i don't know enough about amd graphics cards and their power fluctuations to tell anything reliably. i know nvidia cards are power limited.
DF test showed having more CUs is better than fewer CUs overclocked
>sony doubles down on "its easier to get higher performance from less lol"
Doesnt matter.
Brand loyalty is all it needs to outsell at first.
>they launch early and end up tarnishing the ps5 name
or
>they launch late and lose all initial sales
Sony is for movies anyways
useless wanker