
19 hours ago, Knight26 said:

So the network in my house is starting to show some serious issues: the wifi extenders are not working and we cannot easily relocate the modem.

So I am looking at buying my own Cable Modem and getting a wifi-mesh, does anyone have any recommendations that won't break the bank?

If you're running an Asus router, you can try an AiMesh wi-fi setup with a compatible Asus router and reuse the old router. I tried it but found it to be unstable and went with a conventional extender/repeater config. Your mileage may vary. Another issue was that because I was using wi-fi backhaul instead of an Ethernet backhaul, my AiMesh wi-fi speeds were on par with extender/repeater speeds (which means my wi-fi speeds were cut in half when connected to the node). My repeater/AiMesh speeds are fine for WFH and even 4K streaming, but stability is more important, especially on meeting calls.

As for cable modems, the Motorola MB8611, Arris Surfboard S33, or the Netgear Nighthawk CM1200 should all work. However, if you are not planning on breaking 1Gbps speeds, I'd save money and drop down to a Motorola MB8600 or Arris SB8200.
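On the wireless-backhaul point above, here's a rough back-of-the-envelope sketch (Python, with a made-up link speed) of why a single-radio repeater or wireless-backhaul node halves throughput: every frame has to cross the same radio twice, once in and once out.

# Illustration only; the link speed is a hypothetical number, not a measurement.
link_speed_mbps = 400                       # client <-> node wi-fi rate

ethernet_backhaul = link_speed_mbps         # node forwards traffic over the wire
wireless_backhaul = link_speed_mbps / 2     # same radio must receive and re-send each frame

print(f"Ethernet backhaul: ~{ethernet_backhaul:.0f} Mbps to the client")
print(f"Wireless backhaul: ~{wireless_backhaul:.0f} Mbps to the client")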


  • 3 months later...

Nvidia’s Ada Lovelace GPU generation: $1,599 for RTX 4090, $899 and up for 4080

I think whatever generational performance gains there are will get drowned out by price increases and power requirements (especially with Intel turning that knob up on power requirements and AMD inching it up as well).


Yeah, if these power draw figures are anywhere near what the rumors claim, tomorrow's high-end PCs will be demanding more wattage than a frickin' microwave... running for hours on end... every single day...

From what I've heard, AMD's Radeon 7000 cards will be more focused on power efficiency, but even then their "7900 XT" or whatever it will be called will be drawing up to 400-450 watts as well, so... I mean, I guess if it can get 85%-ish of the 4090's performance at 70% of the power draw, that's certainly better than nothing, but still. Yikes.
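For what it's worth, a quick perf-per-watt check on those rough numbers (85% of the performance at 70% of the power, both placeholders until reviews land), with the 4090 as the 1.0 baseline:

perf_4090, power_4090 = 1.00, 1.00      # baseline
perf_rdna3, power_rdna3 = 0.85, 0.70    # rumored/assumed figures from the paragraph above

relative_perf_per_watt = (perf_rdna3 / power_rdna3) / (perf_4090 / power_4090)
print(f"Relative perf/watt vs the 4090: {relative_perf_per_watt:.2f}x")   # ~1.21x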

Did Nvidia have anything actually interesting on show?


9 hours ago, kajnrig said:

Yeah, if these power draw figures are anywhere near what the rumors claim, tomorrow's high-end PCs will be demanding more wattage than a frickin' microwave... running for hours on end... every single day...

From what I've heard, AMD's Radeon 7000 cards will be more focused on power efficiency, but even then their "7900 XT" or whatever it will be called will be drawing up to 400-450 watts as well, so... I mean, I guess if it can get 85%-ish of the 4090's performance at 70% of the power draw, that's certainly better than nothing, but still. Yikes.

Did Nvidia have anything actually interesting on show?

DLSS 3.0 (exclusive to RTX 4000-series), 2x boost in rasterization, 4x performance boost with ray-tracing compared to RTX 3000-series.

Most people are looking at the RTX 4080 12GB as an overpriced 4070 wannabe. Interestingly, the RTX 3080, 3070, and 3060 are still considered part of the current lineup (giving more weight to the idea that there is still an overstock of those cards on the market). AIBs will have to add ~$200+ on top of Nvidia's prices, making a non-Nvidia-made 4090 close to $2,000.

But yeah, a lot of comments point to people re-thinking whether or not this is worth it. The 4090 uses as much power as a 3090 Ti (with power spikes jumping to ~2-2.5x that amount) at a $200+ higher MSRP than a current 3090 Ti AIB board. Plus the cost of a beefier 1000W+ power supply, unless you want to be an early adopter of the ATX 3.0 PSUs coming out next month, which have the required 12VHPWR connector as standard.

AMD announced just before today's Nvidia livestream that RDNA3's announcement will be in November.


5 hours ago, azrael said:

DLSS 3.0 (exclusive to RTX 4000-series)

Aside from artificial segmentation reasons, I have to wonder if there are actual hardware reasons for the exclusivity. The "frame generation" part, which appears to be basically all that sets it apart from DLSS 2.X, seems... iffy. Frame interpolation has always been a double-edged sword; I'll be interested in seeing how it compares to other, non-AI methods, and how configurable it is (i.e., whether it can be turned off entirely).

5 hours ago, azrael said:

2x boost in rasterization, 4x performance boost with ray-tracing compared to RTX 3000-series.

Looking at some of the slides/graphs, I noticed there was a lot of typical Nvidia number-fudging to get those results. Basically, there were rarely, if ever, any straight-up head-to-head performance comparisons. The "4x performance" claim in Cyberpunk 2077 especially and immediately flipped my skepticism switches. And sure enough, when you look at the fine print, you see that the base 30-series data uses DLSS 2.X performance mode versus the 40-series using DLSS 3.0 "frame generation" interpolation.

This sort of stuff is exactly why I don't pay attention to Nvidia press releases, announcements, etc.


1 hour ago, kajnrig said:

This sort of stuff is exactly why I don't pay attention to Nvidia press releases, announcements, etc.

Or any company's presentation slides, for that matter. Best to wait until reviewers get their hands on it.

My hot take on the RTX 4000 series is, unless your wallet is burning a hole through your pants, it's probably not worth it.


I usually upgrade my GPU every other generation, and since I've got an RTX 2080 I skipped the 3000-series and had been waiting for the inevitable 4000-series. But, hot dang, those prices. Does Nvidia think they're still cashing in on the crypto boom or something? Seriously, before chip shortages and crypto made everything screwy, the 80-level card was around the price of a new console, not my entire PC budget. I'll wait until there are some actual in-game benchmarks and see how much of an upgrade a 4070 or 4070 Ti might be over the 2080, but I have a feeling I might just sit and wait for the RTX 5000 series (or, by then, replace my whole PC).


14 hours ago, kajnrig said:

Yeah, if these power draw figures are anywhere near what the rumors claim, tomorrow's high-end PCs will be demanding more wattage than a frickin' microwave... running for hours on end... every single day...

There's a light at the end of the tunnel here. A standard US power socket is only rated for 15 amps at 120 VAC, and with the mandatory safety margins for continuous loads, that's a sustained power draw of about 1,440 watts.

Without rewiring homes for 20-amp or 240V outlets, they're banging up against the limits of what they have access to, and performance gains will have to be earned with efficiency.
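For reference, the circuit math behind that ceiling (US residential assumptions; the GPU/CPU figures in the comment are just placeholders):

breaker_amps = 15
volts = 120
continuous_derating = 0.80                                  # continuous loads limited to ~80% of rating

circuit_limit_w = breaker_amps * volts                      # 1800 W absolute
sustained_limit_w = circuit_limit_w * continuous_derating   # ~1440 W sustained

print(f"Sustained budget for the whole circuit: ~{sustained_limit_w:.0f} W")
# A 600 W GPU plus a 300 W CPU, monitor, and whatever else shares that
# circuit eats into that budget very quickly.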


10 minutes ago, JB0 said:

There's a light at the end of the tunnel here. A standard US power socket is only rated for 15 amps at 120 VAC, and with the mandatory safety margins for continuous loads, that's a sustained power draw of about 1,440 watts.

Without rewiring homes for 20-amp or 240V outlets, they're banging up against the limits of what they have access to, and performance gains will have to be earned with efficiency.

Not to mention they'll end up microwaving the users at the rate they're going!


Lowering overall power requirements, not just improving power efficiency, is already the next puzzle to crack. I'm sure nVidia will lure enough day-one buyers with their introductory 4000-series cards to make some bank, but the ask on price and wattage seems tone deaf. The 3000 series was already hitting the proverbial wall with its high-end parts.

Add to that what happened between them and their long-standing board partner EVGA, and I'm just disappointed. The GPU space doesn't need another Apple-like company; having Apple, Alphabet, Amazon, and Microsoft is enough of that kind of tech shenanigans. Alas, nVidia seems quite keen on heading in that direction.


Digging into DLSS 3 some more, it seems it IS basically the same type of interpolated frame-generation tech that already exists: it renders two frames, generates one or more frames to insert between them, then outputs the result. So while strictly speaking there ARE more frames being sent to the screen, in practice the game will actually be LESS responsive, as you'll always be playing one or two frames behind the real game time. I suppose that's where Nvidia Reflex comes in, as it'll presumably minimize this delay. Still, input latency and responsiveness won't actually be any better than if you played with frame generation turned off.

I imagine in fast-paced competitive shooters players will eschew the tech, but it'll probably be nice for leisurely games like Flight Sim.
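To make the latency point concrete, here's a toy model of interpolation-style frame generation (illustration only, not Nvidia's actual pipeline): the newest rendered frame has to be held back until its successor exists, so what's on screen always trails the simulation.

def frame_generation(rendered):
    """Yield (displayed_value, frames_behind_newest_render) with one
    interpolated frame inserted between each rendered pair."""
    for older, newer in zip(rendered, rendered[1:]):
        yield older, 1                    # shown only after 'newer' was already rendered
        yield (older + newer) / 2, 0.5    # stand-in for the generated in-between frame
    yield rendered[-1], 0

for shown, lag in frame_generation([0, 1, 2, 3]):
    print(f"display {shown:>4} -> {lag} frame(s) behind the newest rendered frame")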


It's not so much continuous use that's the problem; computer parts do know how to dial back power usage when idle. It's during sustained loads and/or transient spikes that you're gonna see the light-flickering effect (e.g., running the microwave or turning on a hair dryer). And it's not just the GPU, but the CPU and every other component sucking power. Nvidia, Intel, and AMD all need to be thrown under the bus for the increases in power usage.

8 hours ago, pengbuzz said:

Not to mention they'll end up microwaving the users at the rate they're going!

We're not there yet, but you won't be needing a space heater anymore with these new GPUs.


5 hours ago, kajnrig said:

Digging into DLSS 3 some more, it seems it IS basically the same type of interpolated frame-generation tech that already exists: it renders two frames, generates one or more frames to insert between them, then outputs the result. So while strictly speaking there ARE more frames being sent to the screen, in practice the game will actually be LESS responsive, as you'll always be playing one or two frames behind the real game time. I suppose that's where Nvidia Reflex comes in, as it'll presumably minimize this delay. Still, input latency and responsiveness won't actually be any better than if you played with frame generation turned off.

I imagine in fast-paced competitive shooters players will eschew the tech, but it'll probably be nice for leisurely games like Flight Sim.

That's really disappointing, especially since the problem is more or less solved for VR. They extrapolate a new frame from the current one and the previous one to keep frame rates high when the system is struggling to deliver the headset's native frame rate.

Since a stable, high frame rate and low latency are both very important for reducing motion sickness, there are no games played there.

Oculus calls it Asynchronous Spacewarp; I forget what Valve's equivalent is called.
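The distinction matters: spacewarp-style reprojection extrapolates forward from frames it already has, so nothing waits, while DLSS 3-style generation interpolates between a finished frame and the next one, which means holding the finished frame back. A tiny sketch (made-up values, illustration only):

prev_frame, curr_frame = 10.0, 12.0                        # stand-ins for rendered content

# Extrapolation (VR spacewarp style): predict forward, show immediately.
extrapolated = curr_frame + (curr_frame - prev_frame)      # -> 14.0, no added wait, but can mispredict

# Interpolation (DLSS 3 style): must wait for the real next frame first.
real_next = 14.5
interpolated = (curr_frame + real_next) / 2                # smoother, but shown at least a frame late

print("extrapolated:", extrapolated)
print("interpolated:", interpolated)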


28 minutes ago, Hikaru Ichijo SL said:

I was planning on building a new system after the i7-13000K came out. I now wonder if getting a 4000-series is worth it. Should I get a 3080 Ti instead? I will be playing games in 4K soon enough.

I mean, going with a 4000-series is like future-proofing, but from what I'm hearing most of the "2-4x" performance gains over the 3080 are actually just in ray tracing, so...

I dunno. I have a stock 2080 and I can get 4K 60fps with HDR on high or ultra in most games already. I think it depends on what you're paying for a 3080 Ti. I'm still waiting for actual, real-world benchmarks, but the 12GB 4080 might be the sweet spot right now.


23 minutes ago, Hikaru Ichijo SL said:

I was planning on building a new system after the i7-13000K came out. I now wonder if getting a 4000-series is worth it. Should I get a 3080 Ti instead? I will be playing games in 4K soon enough.

That will depend on how much you're willing to spend. GPU + CPU will already be in the $1300+ range (if 13th-gen Intel holds pricing with current 12th-gen, plus an RTX 4080 12GB). Without any reviews of the RTX 4000-series cards, we don't know what kind of true performance gains you'll see over an RTX 3000-series. Adding a PSU capable of the load you'll be drawing will put you into the $1500+ range. And we haven't even gotten to RAM and the motherboard.
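A rough way to sanity-check that running total (every price below is a hypothetical placeholder, not a quote):

parts = {
    "CPU (13th-gen, assuming 12th-gen-like pricing)": 450,
    "GPU (RTX 4080 12GB at MSRP)": 900,
    "1000 W ATX 3.0 PSU": 200,
}
print(f"CPU + GPU + PSU: ~${sum(parts.values())}")   # ~$1550 before RAM, board, storage, case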


30 minutes ago, azrael said:

That will depend on how much you're willing to spend. GPU + CPU will already be in the $1300+ range (if 13th-gen Intel holds pricing with current 12th-gen, plus an RTX 4080 12GB). Without any reviews of the RTX 4000-series cards, we don't know what kind of true performance gains you'll see over an RTX 3000-series. Adding a PSU capable of the load you'll be drawing will put you into the $1500+ range. And we haven't even gotten to RAM and the motherboard.

I have a healthy budget of $2200, so it would not be much of a problem.

42 minutes ago, mikeszekely said:

I mean, going with a 4000-series is like future-proofing, but from what I'm hearing most of the "2-4x" performance gains over the 3080 are actually just in ray tracing, so...

I dunno. I have a stock 2080 and I can get 4K 60fps with HDR on high or ultra in most games already. I think it depends on what you're paying for a 3080 Ti. I'm still waiting for actual, real-world benchmarks, but the 12GB 4080 might be the sweet spot right now.

That is true.  I would have to be blown away to get a 4000 series.


2 hours ago, mikeszekely said:

I mean, going with a 4000-series is like future-proofing, but from what I'm hearing most of the "2-4x" performance gains over the 3080 are actually just in ray tracing, so...

Also in using DLSS for frame interpolation on CPU-bound games. Yeah, that WILL double your frame rate, but it doesn't mean the card's performing twice as well. It just means you have frame interpolation.
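Rough math on that point (hypothetical numbers): generated frames double what the display shows, but the game still simulates and samples input at the rendered rate.

rendered_fps = 60                      # CPU-bound game, placeholder figure
displayed_fps = rendered_fps * 2       # one generated frame per rendered frame

print(f"displayed: {displayed_fps} fps, input still tied to {rendered_fps} fps")
print(f"simulation step stays {1000 / rendered_fps:.1f} ms, plus the hold-back for interpolation")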


Reviews for AMD's Ryzen 7000-series are out. Overall, yes, they smoke Intel's 12th gen and do so using less power (which may also hold true against 13th gen). But yeah, they're smokin' hot. AMD is recommending at least a 240mm AIO (or a chunky tower cooler, though it's better with liquid cooling) for the 7900X and above, and a good tower cooler for the 7700X and 7600X (mostly due to a change in how they boost: they will hit the thermal limit before the frequency limit). Other downsides are DDR5-only support and, oddly, the 5800X3D. Apparently that 3D V-cache is holding up versus the new chips (which makes one want to wait for the 7800X3D). 🤔 So if the 5800X3D is a viable upgrade for your aging system...get that instead of the new chips.
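A toy model of that boost behavior (made-up numbers, not AMD's actual algorithm): clocks keep climbing until the chip parks itself at TJ max, so cooling headroom converts directly into sustained clocks.

TJ_MAX_C = 95.0
clock_mhz, temp_c = 4500.0, 60.0

while temp_c < TJ_MAX_C:
    clock_mhz += 10      # keep boosting...
    temp_c += 0.5        # ...which keeps pushing the temperature up

print(f"settled at ~{clock_mhz:.0f} MHz, pinned at {TJ_MAX_C:.0f} C")
# A better cooler just means more headroom before hitting TJ max,
# i.e. higher sustained clocks at the same 95 C.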


57 minutes ago, azrael said:

Apparently that 3D V-cache is holding up versus the new chips (which makes one want to wait for the 7800X3D). 🤔 So if the 5800X3D is a viable upgrade for your aging system...get that instead of the new chips.

I'm debating whether to do exactly that for my nephew's 1600AF build. Maybe wait until the new 3D V-Cache chips come out, see it get another good price drop, and swap it in alongside a BIOS update. It makes me slightly upset that his is the only Ryzen build in the house; the other kids' are recycled Intel computers. :lol: One of them has even greater need of an upgrade, but he's of the age now where he wants to work and pay for his own stuff, and he's dedicating more of his money to fixing up a car. Maybe I'll get him some parts for a Christmas gift...


Intel's 13th-gen announced. On sale on October 20.
Intel’s 13th-gen “Raptor Lake” CPUs are official, launch October 20

The 13600K and 13700K are seeing price hikes, while the 13900K is seeing no price change. The expectation is that they will be faster than Ryzen 7000 and, judging by the power specs, probably just as warm as 12th gen.


Intel Arc GPU A770 & A750 reviews... a day late and a dollar too much. Had they come out 12-18 months ago, before supply issues leveled out and crypto crashed, they would have been a better value proposition versus the RTX 3060 and RX 6600/6650. Immature drivers also did not help. They're good, but they need more time to mature.


Reviews of the RTX 4090 Founders Edition are out. The general consensus is that it's definitely fast and powerful and brings its A-game while being neither as power-hungry (the expectation was spikes of 700-800W+, but it actually only spiked into the 600W range) nor as much of a space heater as thought. It's a thick boy and will probably rule out SFF builds. AIB partner cards are just as big if not bigger (but those reviews come out later).

Where the jury is still out is on value: $1,600 USD. Combined with the prices of the 4080s, that's a big paywall for 3rd-gen DLSS, RTX, and AV1 encoder support.


600W is still demanding and the price stinks, IMO. The 4090 is nVidia serving up a Titan-class GPU without the Ti branding. Good on them for pushing the envelope on those high-end raster performance numbers, though.

Making two 4080 SKUs instead of one 4080 and one 4070 is curious. I may wait for nVidia to announce the actual 4070 before taking a closer look at the rest of the lineup.

I’m also keen on what RDNA 3 is all about and those GPUs release right around the corner in November.


1 hour ago, technoblue said:

Making two 4080 SKUs instead of one 4080 and one 4070 is curious.

Scuttlebutt around the industry is that the lower-spec 4080 was supposed to be the 4070, but when they realized how expensive it would be they decided to save the 4070 badge for something they could market closer to the 3070's MSRP.


1 hour ago, technoblue said:

600W is still demanding and the price stinks, IMO.

Keep in mind, that's just the spikes. Average power use under load was in the 400-500W range, so an 850W PSU can still handle it, but barely. And no breakers tripping, for now. AIB card reviews go up tomorrow, so we'll see what the silicon lottery and different cooling designs give us.

No argument about the price. As I mentioned earlier, the price will be the main talking point of the RTX 4000-series. I also saw listings for the 4090 AIB cards, and those are $100-200 above the $1,600 starting price.
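A quick headroom check against the power figures above (the non-GPU number is a placeholder):

gpu_avg_w, gpu_spike_w = 450, 600      # average load and transient spikes from the reviews above
rest_of_system_w = 250                 # hypothetical CPU + board + drives + fans
psu_w = 850

print(f"sustained: {gpu_avg_w + rest_of_system_w} W, spikes: {gpu_spike_w + rest_of_system_w} W, PSU: {psu_w} W")
# Workable, but with essentially no slack once the GPU spikes.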


On 10/11/2022 at 2:31 PM, mikeszekely said:

Scuttlebutt around the industry is that the lower-spec 4080 was supposed to be the 4070, but when they realized how expensive it would be they decided to save the 4070 badge for something they could market closer to the 3070's MSRP.

Yeah, I heard that, but I guess it remains to be seen how nVidia will fill out the mid and low range. Curiously, benchmarks for the 4060 and 4060 Ti are already starting to leak, and those leaked numbers make me wonder all the more how a 4070 might fit in.

On 10/11/2022 at 3:01 PM, azrael said:

Keep in mind, that's just the spikes. Average power use under load was in the 400-500W range, so an 850W PSU can still handle it, but barely. And no breakers tripping, for now. AIB card reviews go up tomorrow, so we'll see what the silicon lottery and different cooling designs give us.

Fair point. I watched a couple of the Asus Strix reviews yesterday. That is one monster card! I fear I'm really just not committed to the high end this generation. I will be waiting for more information on the 4070 and 4060 parts. I'm also hoping we get to see 4000-series GPUs that can still fit in small-form-factor systems.


Right? You knew it was a 4070. We knew it was a 4070. You didn't believe it when you said it, and we weren't fooled when you said it. Just change the name, charge what you want to charge for the obvious reasons, and just let it be what it be. You're already dead set on this strategy; you don't get to claim any sort of value crown either. Stop trying to have your cake and eat it too.


11 hours ago, mikeszekely said:

How much you want to bet that's exactly what they're going to do?

No doubt. Seeing that the RTX 4070 is probably already in production, those 4080 12GB units may become the RTX 4070 Ti.


Intel 13900K & 13600K reviews are out. They're definitely faster than the competing AMD chips, or at least trade blows, and the 13600K is likely the better value proposition vs. the 7600X. Stupidly enough, the 5800X3D is still the better value.
And now the depressing news: the 13900K runs hot. Much hotter. And consumes more power. AMD's design allows Zen 4 to reach TJ max and maintain all-core clocks without thermal throttling; Raptor Lake still follows the traditional model and will hit its 100°C TJ max and thermal-throttle. At its worst, the 13900K was consuming double the wattage of the 7950X, at 300W+. Most reviews had trouble keeping a 13900K cool with a 360mm AIO. A 13900K + RTX 4090 could potentially replace the space heater this winter. And this is worst-case-scenario usage (which most folks will never reach). The more depressing news is that this will likely be the last LGA-1700 CPU generation, so there is no upgrade path after this; the next gen will be on a completely new socket.

(Note: these are stock settings, without touching any power limits. It will be interesting to see what happens when people undervolt or play with power settings.)


Hardware Unboxed did a quick power-scaling comparison (the only one I've seen so far, but I'm hoping to see more in the future). The astronomically high TDPs these new generations command made it seem like nothing short of exotic cooling could keep them under control, but the power scaling puts the generational improvements into perspective. The 13900K manages to match the 12900K's Cinebench score at roughly 170W (compared to, I'm guessing, the 12900K's full 250W), and the extra 130-150W on top of that buys it roughly a 40% performance increase. (On the AMD side, the 7950X maintains its efficiency edge by getting the same performance at a kind-of-crazy 75W, at least according to the one HU graph.)

Power-limiting these CPUs seems like it'll still get you really good performance without crazy cooling requirements. But I dunno... that seems almost anathema to their very design, no? :lol:
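Putting those numbers side by side (all approximate, normalized to the 12900K's full-power Cinebench score):

chips = {
    "12900K @ ~250 W": (1.00, 250),
    "13900K @ ~170 W": (1.00, 170),   # matches the 12900K score at much lower power
    "13900K @ ~320 W": (1.40, 320),   # the extra ~150 W buys roughly +40%
    "7950X  @ ~75 W":  (1.00, 75),    # per the one HU graph
}
for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts * 1000:.1f} score-points per kW")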


On the electronics front, I upgraded to the Google Pixel 7 (non-Pro) from my trusty 3.5-year-old Pixel 2 last week. My Pixel 2 was still working great (including the battery), but no security support in over two years had me concerned, so I made the upgrade.

The Pixel 7 is an amazing value and so nice; it matches the excellent reviews it's been receiving online. I hope to keep my Pixel 7 operating in good condition for at least five years.

When it comes to extending battery life on my phones, it's best to keep the charge level between 20% and 80%. I use the AccuBattery app to send me a notification to unplug my phone from its charger when it hits 80%.

