But I do recall Jaguar having poor cache performance, and AMD's architecture overall at that time suffering from poor cache hit-to-miss ratios and absolutely disgraceful branch-prediction performance (this might be one of the areas that was a regression from the old Thubans, but I'm not 100% on that). That's one of the major reasons they decided to chase resolution instead: when you have an unbalanced system like that, where the CPU just isn't capable of feeding the GPU rapidly but the GPU has plenty of horsepower on tap, cranking the resolution makes sense, shifting the bottleneck more to the GPU than the CPU.

If they go for 1080p or 1440p, then I think the PS5 and Xbox Series X will be able to hold decent frame rates and high graphics settings. 4K with checkerboard rendering might even be an option on lesser titles, but if they try to do 120 FPS at high resolution they will hit hard limits, and this is all assuming purely rasterized scenes. It's pretty clear at the moment that they don't have the horsepower for ray tracing on anywhere near the scale marketing has sold people on. It's a shame, really, because the PS5 and Xbox Series X are a legitimate leap over the previous consoles. But I think a large portion of people are going to be very disappointed with them regardless when they realize these aren't magic boxes that can do 4K ray tracing at Ultra settings with the frame rate locked at 60 FPS.

To everyone who does, how do you think KSP 2 would run on a PC with good specs and an Xbox Series X? How long would it take to port? Just want to see if we can relate the conversation somewhat to those questions.

A good PC will be head and shoulders above consoles in terms of CPU performance, so you'll always have a better experience in KSP or KSP 2 on a decent gaming PC than on consoles. Whether that gets in the way of you having a good time on consoles depends entirely on what kind of things you like to build and how far Intercept will be able to push optimizations.
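To put rough numbers on that bottleneck shift: raising the output resolution multiplies the GPU's per-frame pixel work while leaving the CPU's per-frame work (physics, draw-call submission) largely unchanged. A quick back-of-the-envelope sketch (the pixel counts are exact; treating GPU cost as proportional to pixels shaded, and checkerboard rendering as shading half the pixels, are simplifications):

```python
# Rough pixel-throughput comparison for common render targets.
# Checkerboard rendering shades roughly half the pixels each frame and
# reconstructs the rest, so it is modeled here as a 0.5 factor.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels_per_second(width, height, fps, checkerboard=False):
    """Pixels the GPU must shade each second at a given target."""
    per_frame = width * height * (0.5 if checkerboard else 1.0)
    return per_frame * fps

base = pixels_per_second(1920, 1080, 60)  # 1080p60 as the baseline
for name, (w, h) in RESOLUTIONS.items():
    for fps in (60, 120):
        load = pixels_per_second(w, h, fps)
        print(f"{name}@{fps}: {load / base:.1f}x the GPU pixel load of 1080p60")

# Checkerboarding halves the shading cost versus native 4K:
cb = pixels_per_second(3840, 2160, 60, checkerboard=True)
print(f"4K60 checkerboard: {cb / base:.1f}x the load of 1080p60")
```

The takeaway matches the point above: native 4K at 120 FPS is an 8x jump in raw pixel throughput over 1080p60, while the CPU-side work per frame only scales with the frame rate, which is why resolution is the cheap lever on a CPU-limited box.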
The PS5 and Xbox Series X are packing Ryzen; they're so much better than the last generation on the CPU front. They might not be clocked high, but it's still better than the last generation, which had garbage IPC and poor clocks.

Clocks aren't even half the story, though. The biggest bottleneck was cache performance. And, I mean, yeah, it's definitely an improvement over the Jaguars of the PS4/XBOne, but so was the evolved Jaguar in the PS4 Pro/XBOne X. Remember when those first came out and it looked like they'd solved all the problems, because the numbers were better across the board and games built for the original systems ran pretty well on them? That didn't last long, though, as everyone expected better graphics on the updated versions as well, and we quickly went back to the same problems we've always had. They also went (at least with the PS4 Pro) from a Radeon HD 7750-equivalent GPU core in the base PS4 to a GPU core resembling a Radeon RX 480, and kept the same CPU, just with slightly higher clocks. And the result was the same as if you'd built a similar PC: the maximum frame rates improved (though nowhere near as much as they could have with a better CPU), but the minimum frame rates were exceptionally poor, and rarely could the console hit a stable 60 FPS even at 1080p. Worst of all, cache performance again lagged dramatically compared to the compute capabilities, meaning the pipeline is constantly starved. And while I haven't had hands-on with the next gen yet, it sounds from the specs like it might be a similar kind of story: a CPU that's technically powerful enough to feed your GPU pipeline from a purely compute perspective, but starts to stutter the moment you try to do anything fancy.
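The distinction between average frame rate and minimum frame rate (i.e. worst-case frame time) is worth making concrete: hitting a "stable 60 FPS" means every frame must land inside a ~16.7 ms budget, and a handful of CPU stalls can wreck perceived smoothness while the average still looks respectable. A minimal sketch (the captured frame times below are hypothetical, purely for illustration):

```python
# Frame-time budgets: "60 FPS" means every frame finishes within ~16.7 ms.
# A few slow frames tank perceived smoothness even if the average looks fine.
def budget_ms(fps):
    """Per-frame time budget in milliseconds for a target frame rate."""
    return 1000.0 / fps

# Hypothetical frame-time capture (ms); the 33 ms and 31 ms spikes are the
# kind of stutter a cache-miss storm or branch mispredictions produce.
frame_times_ms = [14, 15, 16, 15, 33, 15, 14, 31, 15, 16]

avg = sum(frame_times_ms) / len(frame_times_ms)
worst = max(frame_times_ms)
missed = sum(t > budget_ms(60) for t in frame_times_ms)

print(f"60 FPS budget:       {budget_ms(60):.1f} ms per frame")
print(f"average frame time:  {avg:.1f} ms (~{1000 / avg:.0f} FPS)")   # ~54 FPS
print(f"worst frame time:    {worst} ms (~{1000 / worst:.0f} FPS)")   # ~30 FPS
print(f"frames over budget:  {missed}/{len(frame_times_ms)}")
```

In this made-up capture the average works out to roughly 54 FPS, yet two frames ran at an effective ~30 FPS. That is exactly the "poor minimum frame rates" pattern described above: a CPU stall shows up as a frame-time spike, not as a lower average.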