Seto Kaiba Posted Tuesday at 08:31 PM

2 hours ago, azrael said: The Core Ultra 9 285K is a mixed bag for gaming. It does about as well as the 14900K; anywhere from a 1-5% uplift over the 14900K.

So at least a modest improvement over the 13900K without the tendency to die messily? That sounds A-OK to me.

2 hours ago, azrael said: There aren't any major drawbacks (it's not going to self-destruct like the 13900K and 14900K) but it's not a performance topper either.

36 minutes ago, pengbuzz said: I think at this point, reliability might be a bigger draw than a top performer that chokes and burns itself out with dismal regularity.

"Not going to self-destruct" is definitely a strong point in its favor... I've gone through the tedious RMA process for the 13900K before, and when the second one died despite having the microcode fixes, I figured it was time to upgrade; losing a calendar quarter's productivity to the glacial Intel warranty process just wasn't in the cards.

8 minutes ago, mikeszekely said: Kinda depends on what you're running, I'd think, if it's more CPU-bound or not. My setup came with an Ultra 7 265KF, and 99% of the time I'm fine. I did notice playing Assassin's Creed Shadows, though, that I'd get some dips in towns where the CPU was driving a bunch of NPCs.

It's kind of a 50/50 split between gaming and work, with the latter now involving some software development, network performance simulation, multiple concurrent virtual machines, and some nonsense with LLMs because it's trendy, so management wants it involved somehow even if it makes zero practical sense. I'm expecting the LLM stuff and embedded network performance simulations to bottleneck the CPU as hard as or harder than any gaming I might be doing. Most of the games I play are pretty old... aside from Phantasy Star Online 2: New Genesis, Overwatch 2, and such, I've mainly been replaying much older games like the Dishonored trilogy, the BioShock trilogy, and the RTX upgrades of Quake and Quake II for nostalgia's sake... the latter of which are practically in pocket-calculator territory these days.
jenius Posted Tuesday at 08:39 PM

I alternate every other build between AMD and Intel; my most recent is a Core 285 and it's been very good for creative tasks. My encode speed is a huge improvement over my last computer... but that's to be expected. The build I gave my kids is a 7950 and it acts kinda funny... random heat spikes that kick all the fans on and then disappear as soon as they arrive... Haven't figured out what that's all about, but I found plenty of people describing it as normal, which is disconcerting.
TangledThorns Posted Tuesday at 08:40 PM

3 hours ago, Seto Kaiba said: Random Q for the performance/gaming aficionados here... have you any thoughts on the Intel Core Ultra 9 285K as a processor for a gaming rig? I know Intel's kind of on the outs with that community due to the issue with the 13900 and 14900, so I was wondering if that chip has any major drawbacks.

For gaming, go straight to the 9800X3D. On a budget, the 9600X. The 9600X3D is supposedly going to be released by the holidays too. https://www.tomshardware.com/pc-components/cpus/amd-ryzen-9-9950x-vs-intel-core-ultra-9-285k-faceoff-it-isnt-even-close
Test_Pilot_2 Posted Tuesday at 08:41 PM

6 minutes ago, Seto Kaiba said: So at least a modest improvement over the 13900K without the tendency to die messily? That sounds A-OK to me. "Not going to self-destruct" is definitely a strong point in its favor... I've gone through the tedious RMA process for the 13900K before, and when the second one died despite having the microcode fixes, I figured it was time to upgrade; losing a calendar quarter's productivity to the glacial Intel warranty process just wasn't in the cards. It's kind of a 50/50 split between gaming and work, with the latter now involving some software development, network performance simulation, multiple concurrent virtual machines, and some nonsense with LLMs because it's trendy, so management wants it involved somehow even if it makes zero practical sense. I'm expecting the LLM stuff and embedded network performance simulations to bottleneck the CPU as hard as or harder than any gaming I might be doing. Most of the games I play are pretty old... aside from Phantasy Star Online 2: New Genesis, Overwatch 2, and such, I've mainly been replaying much older games like the Dishonored trilogy, the BioShock trilogy, and the RTX upgrades of Quake and Quake II for nostalgia's sake... the latter of which are practically in pocket-calculator territory these days.

I think you've already solved the use case then... anything modern, even mid-grade, annihilates the frame rates on older titles. Under these conditions I would stick with reliable performance as opposed to beta-testing hardware and serving as a customer service incentive statistic.
Hikaru Ichijo SL Posted Tuesday at 08:50 PM (edited)

13 hours ago, azrael said: What's your budget? The only boards that seem to still have 5.1 audio are the Asus TUF Gaming B850 line. One way you might be able to get around the lack of 3.5mm audio jacks is to use a DAC and pipe the audio that way. I mentioned this way back ☝️ but most folks are no longer using 5.1 systems because more folks are using headsets, and a good pair of monitors (speakers) does much better than a 5.1 system. Yes, some people like the immersive sound, but in most current markets, folks are just not using multi-speaker sets anymore. 2.1 systems are more than enough. Whether you go with a 5070 Ti or wait for the Supers is up to you. The Supers, at least according to leaked specs, are the same GPUs but with more and higher-clocked memory. The Supers will get you farther, but there is no real spec bump beyond memory. Nvidia is lowering prices on current GPUs to make room on shelves for the Supers. Rumor has it they might be the same MSRP-ish (we all know MSRP doesn't exist with any GPU). So if you buy now, you might get it for a decent price (i.e. MSRP 😱). If you wait, you will get a slightly better card for roughly the same price... roughly.

My budget is about $1750, and that's just for the PC itself. The 5070 Ti has been available at $750 (MSRP) for a while now. I will hold off until I see what the Supers cost.

Edited Tuesday at 08:58 PM by Hikaru Ichijo SL
Hikaru Ichijo SL Posted Tuesday at 08:58 PM

10 hours ago, TangledThorns said: I recently built my first PC in over ten years too. Went with an 850M mATX as I don't think we need larger systems anymore, since almost everything is built into the mobo or can be installed on it, namely NVMe drives. Going 4K means more pixels, which means more GPU power; the minimum is an RTX 5090 these days, IMO. I recommend going 1440p, which is a nice medium with OLED displays. See my recent post in this thread on what I built. Also, take your time and do a ton of research before making your purchase. There is a wealth of YouTube videos that helped me in my decisions and build plans. I think I spent over two months researching before I settled on my build parts. EDIT: I found Christopher Flannigan's videos easy to watch and learn from for my new build. https://www.youtube.com/@ChristopherFlannigan/videos More edit: Recommend MSI mobos right now; they seem to be leading the pack when it comes to stability and features. Don't cheap out with no-name parts on important components like the PSU and fans. Check out ID-Cooling for cooling parts; they're way more affordable than the competition too. Last, skip the RGB crap. RGB does nothing for your system's performance and can add frustration if it doesn't work right. Learned that from my old system, and skipping RGB for my new build made it soooo much easier.

I agree 100% about RGB; it is just tacky. I am going with an ATX system. I still use an 8TB hard drive.

6 hours ago, Test_Pilot_2 said: I would recommend aiming for a nice 1440p monitor that runs at a high refresh rate. I'm also seeing better prices on packaged, pre-built PCs than on parting out a build. Just find a brand that is friendly to DIY upgrades. You'll get more bang for your buck and you don't get sucked into the 4K/8K BS. High-refresh 1440p with all of today's graphics tech is fantastic. Also much more affordable. I game on a 150" 4K/1440p projector that runs at 144Hz. Even at that size, the smoothness and additional bells and whistles I can run at 1440p make the experience much better than 4K freesynced to anything 60-90Hz.

Maybe I should go 1440p instead. But then do I need a 5070 Ti, or will a lower-end video card still be able to handle a 144Hz screen?

47 minutes ago, mikeszekely said: Pretty much this, but I think one of the biggest mistakes I made was buying a 4K 60Hz display back in the day. Granted, it depends a lot on whether you prefer higher frames or higher fidelity, but today I have an RTX 5080 and a 240Hz display that runs at 32:9 5120x1440, which is double the pixels of a standard 16:9 1440p display but still roughly a million pixels less than a 16:9 4K display. I play mostly single-player games and I like to turn the graphics up; I average between 60-90 fps in most games, and occasionally hit the 120 mark. Despite using different cables, both HDMI and DisplayPort, I had weird glitches with the desktop running at 240Hz and wound up setting the display to run in 120Hz mode. Before, with an RTX 2080, I had to use DLSS and turn down a lot of settings to stay in the 45-60 fps range at 4K, or if DLSS wasn't supported it was a choice between looking good at 1440p (or even 1080p) or looking like butt at 4K. Kinda depends on what you're running, I'd think, if it's more CPU-bound or not. My setup came with an Ultra 7 265KF, and 99% of the time I'm fine. I did notice playing Assassin's Creed Shadows, though, that I'd get some dips in towns where the CPU was driving a bunch of NPCs.

I might just go 1440p.
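[Editor's note: the pixel-count comparison in the quoted post above is easy to sanity-check with quick arithmetic; a minimal sketch, using the resolutions as named in the post:]

```python
# Pixel counts for the resolutions discussed above
res_1440p = 2560 * 1440      # standard 16:9 1440p
res_32_9  = 5120 * 1440      # 32:9 "super ultrawide" 5120x1440
res_4k    = 3840 * 2160      # 16:9 4K UHD

print(res_32_9 / res_1440p)  # exactly double a 16:9 1440p panel: 2.0
print(res_4k - res_32_9)     # still fewer pixels than 4K, by 921600 (~a million)
```

So a 32:9 1440p panel asks the GPU for roughly 89% of the pixels of a 16:9 4K one, which is consistent with the frame rates described.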
Seto Kaiba Posted Tuesday at 09:28 PM

44 minutes ago, Test_Pilot_2 said: I think you've already solved the use case then... anything modern, even mid-grade, annihilates the frame rates on older titles. Under these conditions I would stick with reliable performance as opposed to beta-testing hardware and serving as a customer service incentive statistic.

Yeah, my primary concern was whether the 285K was going to be another RMA queen... and my secondary whether it'd be inferior to the 13th-gen chip it's replacing. Nothing on the bleeding edge has really grabbed me, but in the event something does, I'm hoping the 285K will at least not be horrendously underpowered.
azrael Posted Tuesday at 09:33 PM (Author)

24 minutes ago, Seto Kaiba said: It's kind of a 50/50 split between gaming and work, with the latter now involving some software development, network performance simulation, multiple concurrent virtual machines, and some nonsense with LLMs because it's trendy, so management wants it involved somehow even if it makes zero practical sense. I'm expecting the LLM stuff and embedded network performance simulations to bottleneck the CPU as hard as or harder than any gaming I might be doing. Most of the games I play are pretty old... aside from Phantasy Star Online 2: New Genesis, Overwatch 2, and such, I've mainly been replaying much older games like the Dishonored trilogy, the BioShock trilogy, and the RTX upgrades of Quake and Quake II for nostalgia's sake... the latter of which are practically in pocket-calculator territory these days.

If AMD is in play, maybe look at the non-X3D CPUs. Because this scenario is a 50/50 work/play split, I'd probably lean Intel for the core count since you mention LLMs. I'd mentioned Threadripper, but that's probably outside of anyone's budget.

24 minutes ago, jenius said: The build I gave my kids is a 7950 and it acts kinda funny... Random heat spikes that kick all the fans on that disappear as soon as they arrive... Haven't figured out what that's all about but found plenty of people describing it as normal, which is disconcerting.

Starting with the Ryzen 7000 series, AMD changed their CPU boost behavior so that the CPU jumps to 95°C as it comes under load. Go watch reviews from when the 7000 series came out; it came up as a talking point because the CPUs would immediately spike to 95°C and stay there before dialing up or down. We're used to the traditional ramping up as load increases, not the immediate spike and hold.

12 minutes ago, Hikaru Ichijo SL said: Maybe I should go 1440p instead. But then do I need a 5070 Ti, or will a lower-end video card still be able to handle a 144Hz screen? I might just go 1440p.

If you are planning on 32", I'd do 4K. 32" 1440p might feel like you are looking at your 24" 1200p screen, but oversized: the PPI of a 32" 1440p panel is about 92 ppi, roughly the same as that 24" 1200p screen you have. The ideal sizes at 100% display scaling are:

24" @ 1080p
27" @ 1440p
32" @ 4K

When you go outside these ranges, things look... weird. I use 24" @ 1440p and 27" @ 4K for work, and while text is absolutely clear, it also looks eye-strainingly small (like I should not be looking at screens with this high a pixel density for as long as I do if I want to preserve my eyesight), especially 27" 4K.
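[Editor's note: the PPI figures in the post above follow from a one-line formula — pixels along the diagonal divided by diagonal inches; a minimal sketch:]

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The sizes discussed above
print(round(ppi(2560, 1440, 32)))  # 32" 1440p -> 92 ppi
print(round(ppi(1920, 1200, 24)))  # 24" 1200p -> 94 ppi (about the same density)
print(round(ppi(1920, 1080, 24)))  # 24" 1080p -> 92 ppi
print(round(ppi(2560, 1440, 27)))  # 27" 1440p -> 109 ppi
print(round(ppi(3840, 2160, 32)))  # 32" 4K    -> 138 ppi
```

The first two results show why 32" 1440p reads like an oversized 24" 1200p screen: the pixel densities are within a couple of ppi of each other.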
Test_Pilot_2 Posted Tuesday at 09:35 PM (edited)

13 minutes ago, Seto Kaiba said: Yeah, my primary concern was whether the 285K was going to be another RMA queen... and my secondary whether it'd be inferior to the 13th-gen chip it's replacing. Nothing on the bleeding edge has really grabbed me, but in the event something does, I'm hoping the 285K will at least not be horrendously underpowered.

I think the days of PC-master-race power are gone. Aside from some developers going full Crysis now and then, just about everything is tuned to where the consoles thrive... It is a sad, sad state of affairs, and it may only get worse going forward. Anything tuned at or a little above the current consoles should thrive for a while... go 1.25 times the performance over that and you're good for a long while. When does the PS6 release? It may still lack the performance of today's peak PC hardware...

Edited Tuesday at 09:41 PM by Test_Pilot_2
Test_Pilot_2 Posted Tuesday at 09:40 PM

2 minutes ago, azrael said: If AMD is in play, maybe look at the non-X3D CPUs. Because this scenario is a 50/50 work/play split, I'd probably lean Intel for the core count since you mention LLMs. I'd mentioned Threadripper, but that's probably outside of anyone's budget. Starting with the Ryzen 7000 series, AMD changed their CPU boost behavior so that the CPU jumps to 95°C as it comes under load. We're used to the traditional ramping up as load increases, not the immediate spike and hold. If you are planning on 32", I'd do 4K. 32" 1440p might feel like you are looking at your 24" 1200p screen, but oversized. The ideal sizes at 100% display scaling are 24" @ 1080p, 27" @ 1440p, and 32" @ 4K. When you go outside these ranges, things look... weird.

I'm at 150" 1440p from 12-13 feet away. Looks spectacular. My desktop is a 34" ultrawide @ 1440p; also looks spectacular. Maybe that's a little subjective, but 24-27" just seems small these days. X3D processors are fantastic.
azrael Posted Tuesday at 09:49 PM (Author)

4 minutes ago, Test_Pilot_2 said: I'm at 150" 1440p from 12-13 feet away. Looks spectacular. My desktop is a 34" ultrawide @ 1440p; also looks spectacular. Maybe that's a little subjective, but 24-27" just seems small these days. X3D processors are fantastic.

At relative distances, though. The average viewing distance for desk workers is ~2 feet from the screen.
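[Editor's note: the "relative distances" point can be made concrete with the horizontal angle each screen subtends; a rough sketch, assuming a 16:9 projector image and a 21:9 ultrawide (neither aspect ratio is stated in the thread), with distances taken from the posts above:]

```python
import math

def horizontal_fov_deg(diagonal_in: float, aspect_w: float, aspect_h: float,
                       distance_in: float) -> float:
    """Horizontal angle (degrees) a flat screen subtends at a given viewing distance."""
    width = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

# 150" projector image (16:9 assumed) from ~12.5 ft (150")
print(round(horizontal_fov_deg(150, 16, 9, 150)))  # ~47 degrees
# 34" ultrawide (21:9 assumed) from ~2 ft (24")
print(round(horizontal_fov_deg(34, 21, 9, 24)))    # ~66 degrees
```

Under these assumptions, the 34" ultrawide at desk distance actually fills more of your field of view than the 150" projector from the couch, which is azrael's point: screen size only means something relative to how far away you sit.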
TangledThorns Posted Tuesday at 10:06 PM

49 minutes ago, Hikaru Ichijo SL said: I agree 100% about RGB; it is just tacky. I am going with an ATX system. I still use an 8TB hard drive.

A single HDD or multiple HDDs? mATX cases can hold one or more HDDs.
Hikaru Ichijo SL Posted Tuesday at 10:32 PM

25 minutes ago, TangledThorns said: A single HDD or multiple HDDs? mATX cases can hold one or more HDDs.

Just one 8TB.
TangledThorns Posted Tuesday at 10:38 PM

3 minutes ago, Hikaru Ichijo SL said: Just one 8TB.

Oh, then you should be fine with most if not all mATX cases. If you'd like to add more capacity, there are mATX mobos like my MSI MAG Mortar WiFi 850M that have three NVMe M.2 slots.
Seto Kaiba Posted Tuesday at 11:31 PM

1 hour ago, Test_Pilot_2 said: I think the days of PC-master-race power are gone. Aside from some developers going full Crysis now and then, just about everything is tuned to where the consoles thrive... It is a sad, sad state of affairs, and it may only get worse going forward. Anything tuned at or a little above the current consoles should thrive for a while... go 1.25 times the performance over that and you're good for a long while. When does the PS6 release? It may still lack the performance of today's peak PC hardware...

To be fair, the distinction between a PC and a game console has been shrinking for a long, long time now. I remember when the PS3 dropped and half my friends were excited not for the games but because they heard you could load Linux on it and use it as an ad hoc PC. 😆 (We're engineers; it's just how we're wired.) Proprietary dev kits and licensing hassles aside, developing for a console at least offers some stability in terms of hardware limitations and hardware variations, so I can definitely see the appeal. Especially after years of working with embedded control systems that wish they had even a fraction of a current-gen console's oomph. From what I've heard, it sounds like there's a shakeup in the games industry itself that's driven some developers into corners. Mainly, the AAA envelope-pushing and live-service model isn't paying off like it used to, though since I haven't had a ton of time for gaming I've mainly heard it secondhand in the form of grousing games journalists.

1 hour ago, Test_Pilot_2 said: I'm at 150" 1440p from 12-13 feet away. Looks spectacular. My desktop is a 34" ultrawide @ 1440p; also looks spectacular. Maybe that's a little subjective, but 24-27" just seems small these days.

Maybe a bit... it really depends on your needs. As I understand it, a lot of the FPS "esports" types favor 240Hz monitors in the 24-27" size range because it's easier to see the whole screen without needing to turn your head. I know a lot of folks who still run monitors in that size range, though, either because of space constraints on the desk or because they're running a multi-monitor setup. (I'm running three 27" 1440p ROG Swifts in my home setup, for instance, with the outer two on gas shocks so I can spin 'em around as needed.)
Seto Kaiba Posted Tuesday at 11:33 PM

Another random question for anyone who might have flirted with this tech... Has anyone out there tried, or had any success with, using a wireless HDMI adapter to plumb a TV into your PC for conference-room-type stuff, streaming, or gaming?
TangledThorns Posted Wednesday at 06:09 PM (edited)

Pixel 10 announced today. Preordered a Pixel 10 for free by trading in my three-year-old Pixel 7 with an aging battery. Couldn't pass it up! EDIT: This was through Verizon.

Edited Wednesday at 08:52 PM by TangledThorns
TangledThorns Posted Wednesday at 08:55 PM

2 hours ago, jenius said: Where is that deal available?

Sorry, I should have mentioned it was through my carrier, Verizon. I had to trade in my Pixel 7 and upgrade to an Unlimited plan, which has about the same price and features as my current plan.
JB0 Posted Wednesday at 09:50 PM

On 8/19/2025 at 11:47 AM, Seto Kaiba said: Random Q for the performance/gaming aficionados here... have you any thoughts on the Intel Core Ultra 9 285K as a processor for a gaming rig? I know Intel's kind of on the outs with that community due to the issue with the 13900 and 14900, so I was wondering if that chip has any major drawbacks.

Personally, I do not trust Intel right now. They have so overtly lied about the issues with their previous processors that I have no faith in their undocumented assurances that the oxidation problems have been addressed, and their microcode fixes clearly aren't solving the overvolting problems (you can't fix hardware with software). Right now, I wouldn't pay for ANY Intel part. Stats-wise, the 200 series has proven underwhelming. Even UserBenchmark's notoriously wild Intel bias can't generate anything good to say in their 285K writeup, or even find room to throw in an unrelated paragraph about how much AMD sucks. I'm genuinely worried about the poor guy.