
I used to be a fan of OnePlus, but their rising prices and merger with Oppo put me off them. I got a Galaxy S21 Ultra a little over a year ago, and it's been a mixed bag. The cameras are great and the hardware itself is fine, but I hate that Samsung loaded the thing with their dialer/calendar/mail/contacts/browser/app store/etc instead of (or in addition to) the stock Android apps. And unlike most other pre-loaded apps you can't remove or disable them. I'm really thinking about trading it in for a Pixel 7 Pro.

Link to comment
Share on other sites

48 minutes ago, azrael said:

Might wanna go back to the drawing board with the new PCI-E 600W power connector design.

Another GeForce RTX 4090 16-pin Adapter Bites The Dust

Oh boy, I wonder 1) how many more of these will show up, 2) how many it will take before Nvidia takes action, and 3) how many it would take for Nvidia to issue a recall and/or redesign the board entirely?

I doubt it'll ever get to that third option, but my god imagine the cost of that...

(EDIT: I wonder how fragile those connectors actually are. Is it really taking only minor bends to warp them into fire hazards? Or were these users being particularly aggressive with them? It doesn't seem so from the pictures shared.)

48 minutes ago, azrael said:

It's bad enough that CableMod is putting out a specific 12VHPWR right-angle adapter plug just to deal with the issue.

The way I hear it, the right angle adapter was made for aesthetic reasons first; the practical benefits were a fortunate byproduct. That said, no doubt they'll be marketing the heck out of it now.

 

I was watching a JayzTwoCents video on the issue where he said that AMD was and would be using the same 12VHPWR connector, which appears to have been in error. Scott Herkelman (Radeon senior VP) confirmed:

 

Edited by kajnrig

6 minutes ago, kajnrig said:

Oh boy, I wonder 1) how many more of these will show up, 2) how many it will take before Nvidia takes action, and 3) how many it would take for Nvidia to issue a recall and/or redesign the board entirely?

I doubt it'll ever get to that third option, but my god imagine the cost of that...

The way I hear it, the right angle adapter was made for aesthetic reasons first; the practical benefits were a fortunate byproduct. That said, no doubt they'll be marketing the heck out of it now.

I was watching a JayzTwoCents video on the issue where he said that AMD was and would be using the same 12VHPWR connector, which appears to have been in error. Scott Herkelman (Radeon senior VP) confirmed

2) & 3): The best Nvidia can do is refund/exchange the cards with damaged plugs and re-issue new adapters. They'll have to eat the cost, but that should have been factored in. This isn't really Nvidia's fault, per se, since adapters from Asus, Gigabyte (LOL..PSU drama), and Be Quiet! all seem to be having this issue with melting plugs. People are going to bend those adapters. Nvidia and the AIBs all designed their adapters within spec. The spec needs to be re-evaluated now, or everyone needs to put out right-angle plugs to mitigate the problem.


23 minutes ago, azrael said:

This isn't really Nvidia's fault, per se, since adapters from Asus, Gigabyte (LOL..PSU drama), and Be Quiet! all seem to be having this issue with melting plugs. 

Oh right, I don't mean to imply it is their fault, but they're the ones having to run point on it regardless.

I haven't paid especially close attention; I want to say they were the ones who submitted concerns to the PCI-SIG about the connector in the first place.


I take it back...it is Nvidia's fault...kinda...sorta.

The horror has a face – NVIDIA’s hot 12VHPWR adapter for the GeForce RTX 4090 with a built-in breaking point (Igor's Lab)

Quote

The overall build quality of the included adapter for the GeForce RTX 4090, which is distributed by NVIDIA itself, is extremely poor and the internal construction should never have been approved like this. NVIDIA has to take its own supplier to task here, and replacing the adapters in circulation would actually be the least they could do. I will therefore summarize once again what has struck those involved (myself included) so far:

  • The problem is not the 12VHPWR connection as such, nor the repeated plugging or unplugging.
  • Standard compliant power supply cables from brand manufacturers are NOT affected by this so far.
  • The current trigger is NVIDIA’s own adapter to 4x 8-pin in the accessories, whose inferior quality can lead to failures and has already caused damage in single cases.
  • Splitting each of the four 14AWG leads onto each of the 6 pins in the 12VHPWR connector of the adapter by soldering them onto bridges that are much too thin is dangerous because the ends of the leads can break off at the solder joint (e.g., when kinked or bent several times).
  • Bending or kinking the wires directly at the connector of the adapter puts too much pressure on the solder joints and bridges, so that they can break off.
  • The inner bridge between the pins is too thin (resulting cross section) to compensate the current flow on two or three instead of four connected 12V lines.
  • NVIDIA has already been informed in advance and the data and pictures were also provided by be quiet! directly to the R&D department.
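Igor's point about the thin bridges is easy to put in numbers. A rough sketch in Python, using the connector's nominal spec values; the reduced-pin scenarios are hypothetical, just to show why a broken solder bridge or two matters:

```python
# Rough per-pin current for the 12VHPWR connector, using nominal spec
# values. The reduced-pin scenarios are hypothetical, meant to show why
# losing a bridge or two concentrates current on the survivors.
RATED_WATTS = 600.0  # connector's rated power
VOLTS = 12.0         # supply rail voltage

total_amps = RATED_WATTS / VOLTS  # 50 A across the whole connector
for live_pins in (6, 4, 3, 2):
    per_pin = total_amps / live_pins
    print(f"{live_pins} pins carrying the load -> {per_pin:.1f} A per pin")
```

With all six 12V pins sharing the load each carries a bit over 8 A; lose half of them and the rest are suddenly asked to carry 16-25 A each, which is exactly the "resulting cross section" problem the article describes.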

 


Specifically, the wire gauge was... lower? higher?... better, the wire gauge on the GN samples was better, rated for 300W each at 105C, whereas the one Igor's Lab dissected indicated only 150W. Maybe a manufacturing issue, then?

Buildzoid theorized that the issue may be related to Nvidia using cheap double-slit terminals, and that even slightly misaligning those would result in partial contact, which would result in greater resistance, which would result in heat buildup.
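That chain of reasoning is just Joule heating. A back-of-the-envelope sketch; both contact-resistance values below are made-up guesses for illustration, not measurements:

```python
# Joule heating at a single connector pin: the power dissipated in the
# contact is I^2 * R, so extra contact resistance from a partial contact
# is multiplied by the square of the current.
# Both resistance values are illustrative assumptions.
current_a = 8.3  # ~per-pin current at a full 600 W load
contacts = {
    "good contact":    0.002,  # ohms, assumed healthy terminal
    "partial contact": 0.020,  # ohms, assumed misaligned terminal
}
for label, ohms in contacts.items():
    watts = current_a ** 2 * ohms
    print(f"{label}: {watts:.2f} W dissipated at the pin")
```

A tenfold jump in contact resistance means a tenfold jump in heat dumped into a tiny piece of metal and plastic, which is the heat-buildup scenario Buildzoid describes.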

I dunno. At the very least, it seems to have been narrowed down to Nvidia's adapter specifically. Though @azrael said adapters included with AIB cards are also affected? (Has it been confirmed they source their adapters from Nvidia perhaps?)

Anyway, AMD's RDNA3 announcement tomorrow. In addition to AMD confirming they won't use the 12VHPWR connector, a leaked image of a purported RX 7900 XTX showed 2x 8-pin connectors, indicating no more than 375W of reference power draw (150W per 8-pin, plus 75W from the PCIe slot). I wonder if that holds true, and if so, how that affects performance relative to the 4090, and how much higher AIBs might be able to push it.
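That 375W ceiling falls straight out of the PCIe power budget. A quick sketch using only the spec limits:

```python
# Max board power from connector loadout, per PCIe spec limits:
# 150 W per 8-pin auxiliary connector, plus up to 75 W from the slot.
def max_board_power(eight_pin_count: int) -> int:
    SLOT_W = 75
    EIGHT_PIN_W = 150
    return eight_pin_count * EIGHT_PIN_W + SLOT_W

print(max_board_power(2))  # -> 375 W, matching the leaked 2x 8-pin board
print(max_board_power(3))  # -> 525 W, what a 3x 8-pin AIB card could pull
```

So an AIB card that wants meaningfully more headroom would need a third 8-pin, which is how the bigger partner boards have traditionally done it.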

With GPU prices falling right now - I saw a 6700 XT for less than $350 on Newegg, though there's still very little movement on Nvidia's end - it's been such a good time for PC building. We're spoiled for choice at the moment.


No company has said where they sourced the adapters from, so we don't know which adapters are prone to issues. Gamers Nexus' cables were also soldered differently than Igor's Lab's, and there's the 300V- vs. 150V-marked wiring on top of that. Without knowing the sources, it's hard to give a conclusive answer. Two suppliers could be making their adapters the same way while only one has issues. But no one knows which supplier's cables are bad.


2 hours ago, kajnrig said:

Specifically, the wire gauge was... lower? higher?... better, the wire gauge on the GN samples was better, rated for 300W each at 105C, whereas the one Igor's Lab dissected indicated only 150W. Maybe a manufacturing issue, then?

The voltage rating of the wires was different and the construction of the connector was different too.

There seem to be (at least) two different revisions of the adapter "in the wild". GN is trying to get adapters from other people, especially people who have had failures or have one with 150V-rated wiring. Because Steve knows we won't be happy until there's a fire! 


Well, AMD fired a big warning shot across Nvidia's bow.

[image: AMD-RDNA-3-Radeon-RX-7900-XTX-render.jpg]

[image: AMD-RDNA-3-Radeon-RX-7900-XT-render.jpg]

Available December 13. And the big kicker, $999 for 7900 XTX and $899 for the 7900 XT. That pretty much ruins Nvidia's value proposition. 


32 minutes ago, azrael said:

Well, AMD fired a big warning shot across Nvidia's bow.

Available December 13. And the big kicker, $999 for 7900 XTX and $899 for the 7900 XT. That pretty much ruins Nvidia's value proposition. 

Like, if you're in for $900 already I don't know why you wouldn't spend the extra $100 for the extra 12 compute units, slight clock speed bump, and 4 extra GB of VRAM and get the XTX. I'm really looking forward to seeing real-world performance, but I'm guessing a bit better than a 4080 for quite a bit less money. Of course, I'm guessing that'll just be in raw performance. Nvidia's still got a pretty big edge with DLSS 3. Not necessarily a $200-$700 edge, though.
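Putting that $100 question in per-unit terms, using only the MSRPs and headline specs mentioned above:

```python
# Per-unit pricing of the two announced cards, using only the MSRPs
# and headline specs from the announcement.
cards = {
    "7900 XT":  {"usd": 899, "compute_units": 84, "vram_gb": 20},
    "7900 XTX": {"usd": 999, "compute_units": 96, "vram_gb": 24},
}
for name, c in cards.items():
    per_cu = c["usd"] / c["compute_units"]
    per_gb = c["usd"] / c["vram_gb"]
    print(f"{name}: ${per_cu:.2f}/CU, ${per_gb:.2f}/GB VRAM")
```

The XTX actually works out cheaper per compute unit and per gigabyte of VRAM than the XT, which is presumably exactly the upsell AMD intended.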


14 hours ago, JB0 said:

The voltage rating of the wires was different and the construction of the connector was different too.

Gah. Voltage, wattage... 150W, 150V... How to tell you I'm not an electrician without telling you I'm not an electrician... :lol:

1 hour ago, azrael said:

Well, AMD fired a big warning shot across Nvidia's bow.

Available December 13. And the big kicker, $999 for 7900 XTX and $899 for the 7900 XT. That pretty much ruins Nvidia's value proposition. 

I dunno, I think the relative performance and price-to-performance value is a bit vague given how... uncharacteristically loose they were with their numbers. Honestly, the moment they started going on about "8K gaming" my eyes glazed over and my default assumption became "they're comparing non-FSR numbers to FSR numbers, or using a non-standard definition of 8K, or otherwise futzing with the numbers." It's a shame, because they haven't tended to be so transparently fishy in the past with performance claims.

Prices do seem decent - well, "decent" - for now, though.

34 minutes ago, mikeszekely said:

Like, if you're in for $900 already I don't know why you wouldn't spend the extra $100 for the extra 12 compute units, slight clock speed bump, and 4 extra GB of VRAM and get the XTX.

I'm sure that's entirely as intended. I'd have liked to see the 7900XT at something closer to $799 to really take the screws to Nvidia, but even so, they're still relatively VERY well-positioned in comparison.

 

All in all, I'm more interested in the tech behind the cards than the cards themselves. I wonder how long before they get dual GCDs working (and whether such a design will ever make it to a gaming card), when they might release 3D V-Cache versions of these cards, what FSR 3.0 and their other software suites entail, etc. FSR 3.0 especially. I hope it's not just real-time frame interpolation like Nvidia's, though I wonder what else it could be; surely they can't get "up to 2x" the framerate of FSR 2.x by tuning the algorithm alone?

Also, maybe I just missed it but I didn't see anything that might suggest the inclusion of Xilinx. Were they rumored to be including Xilinx AI/matrix hardware in their CPUs and/or GPUs? Perhaps there is dedicated hardware and that's what's accounting for the performance boost of FSR 3.0 over 2.0? I dunno, I'm just rambling now.


17 hours ago, kajnrig said:

I dunno, I think the relative performance and price-to-performance value is a bit vague given how... uncharacteristically loose they were with their numbers. Honestly, the moment they started going on about "8K gaming" my eyes glazed over and my default assumption became "they're comparing non-FSR numbers to FSR numbers, or using a non-standard definition of 8K, or otherwise futzing with the numbers." It's a shame, because they haven't tended to be so transparently fishy in the past with performance claims.

Prices do seem decent - well, "decent" - for now, though.

I'm sure that's entirely as intended. I'd have liked to see the 7900XT at something closer to $799 to really take the screws to Nvidia, but even so, they're still relatively VERY well-positioned in comparison.

I suspect AMD wants the 7900 XTX/7900 XT to compete with the RTX 4080. The RTX 4090 seems to be in a league of its own on price and performance. But yes, where it actually lands on performance will be seen when the reviews come.


And back we go to the 12VHPWR issue, with what seems to be the first report of native 12VHPWR cables also burning: https://www.reddit.com/r/nvidia/comments/yltzbt/maybe_the_first_burnt_connector_with_native_atx30/

Crazy stuff. Maybe it IS a spec thing. Maybe it's a transient spike thing. Maybe wire gauge is/contributes to the problem. What an unexpected ongoing mystery.

(r/Nvidia megathread with a convenient list of confirmed/unconfirmed cases: https://www.reddit.com/r/nvidia/comments/ydh1mh/16_pins_adapter_megathread/)

Edited by kajnrig

11 hours ago, azrael said:

I suspect AMD wants the 7900XTX/7900XT to compete with the RTX 4080. The RTX 4090 seems way out of league for the price and performance. But yes, where it actually lands on performance will be seen when the reviews come.

So, the guys over at Linus Tech Tips did some estimates based on their benchmarks with the 6950 XT and AMD's "up to 1.5x/1.7x/whatever" claims.  Of course we should take these sorts of estimates with a big ol' grain of salt until real benchmarks start coming, but the gist is that in terms of raw rendered frames the 7900 XTX could actually be very close to the RTX 4090, and unlike Nvidia, AMD is supporting DisplayPort 2.1.  Plus, the reference cards are close in size to the 6950 XT and use the same pair of 8-pin connectors, so gamers looking to upgrade are less likely to need a new case and/or PSU, too.

The catch seems to be that there's more to it than just drawing frames these days, and again based on AMD's own performance claims over the 6950 XT, the 7900 XTX will lag significantly behind the RTX 4090 with Ray Tracing (they expect around half the framerate you'd get with a 4090).  It also sounds like they're not super enthusiastic about FSR 3 (although, to be fair, their concerns seem to be how FSR 3 will generate extra frames, and they have similar issues with DLSS 3).  Finally, while the 7900 XTX looks like it's going to be great for gaming, a lot of non-gaming GPU-intensive stuff is designed specifically for CUDA, so for stuff like Blender the 7900 XTX might still underperform even compared to a RTX 3090.

Still, unless you're a money-is-no-object type or a professional creative, it looks like the 7900 XTX is going to be a better bang-for-your-buck choice than Nvidia's RTX 4000-series, benchmarks pending.  And this is coming from someone who's used Nvidia exclusively since the newest Windows was XP SP 3.
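For what it's worth, the LTT-style estimate is trivial to reproduce yourself: take a measured 6950 XT result and scale it by AMD's "up to" multipliers. The baseline fps figures below are invented purely for illustration:

```python
# LTT-style back-of-the-envelope estimate: multiply measured 6950 XT
# frame rates by AMD's claimed "up to 1.5x/1.7x" uplift to guess where
# the 7900 XTX might land. Baseline numbers are made up for illustration.
baseline_6950xt = {"Game A, 4K": 90.0, "Game B, 4K": 72.0}  # assumed fps
uplift_range = (1.5, 1.7)  # AMD's claimed "up to" multipliers

for game, fps in baseline_6950xt.items():
    low, high = (fps * u for u in uplift_range)
    print(f"{game}: est. 7900 XTX {low:.0f}-{high:.0f} fps (6950 XT: {fps:.0f})")
```

The obvious caveat being that "up to" figures are best-case marketing numbers, so the low end of the range is the safer bet until independent benchmarks land.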


32 minutes ago, mikeszekely said:

the gist is that in terms of raw rendered frames the 7900 XTX could actually be very close to the RTX 4090

based on AMD's own performance claims over the 6950 XT, the 7900 XTX will lag significantly behind the RTX 4090 with Ray Tracing (they expect around half the framerate you'd get with a 4090).

In other words, it would be basically repeating what the 6900 XT did for its generation. I do find it slightly amusing that while AMD is basically just repeating the same strategy, the move wasn't met with much fanfare last gen but has gotten A LOT of positive press this gen. I guess the combination of Nvidia again increasing its prices and the 7900XT/X seeming to reach a minimum bar of acceptable raytracing performance is just hitting everyone in the right way.

32 minutes ago, mikeszekely said:

Finally, while the 7900 XTX looks like it's going to be great for gaming, a lot of non-gaming GPU-intensive stuff is designed specifically for CUDA, so for stuff like Blender the 7900 XTX might still underperform even compared to a RTX 3090.

Still, unless you're a money-is-no-object type or a professional creative, it looks like the 7900 XTX is going to be a better bang-for-your-buck choice than Nvidia's RTX 4000-series, benchmarks pending.  And this is coming from someone who's used Nvidia exclusively since the newest Windows was XP SP 3.

Given AMD's steady software maturation over the last few years, I imagine it won't be much longer before they satisfactorily address professional use cases as well. It's certainly looking more likely than it did during the dog years of GCN and Vega.


11 hours ago, kajnrig said:

And back we go to the 12vhpwr issue with what seems to be the first report of native 12vhpwr cables also burning: https://www.reddit.com/r/nvidia/comments/yltzbt/maybe_the_first_burnt_connector_with_native_atx30/

Crazy stuff. Maybe it IS a spec thing. Maybe it's a transient spike thing. Maybe wire gauge is/contributes to the problem. What an unexpected ongoing mystery.

(r/Nvidia megathread with a convenient list of confirmed/unconfirmed cases: https://www.reddit.com/r/nvidia/comments/ydh1mh/16_pins_adapter_megathread/)

Unfortunately, there are not enough ATX 3.0 PSUs in the wild and until there are, we won't be able to confirm what is going on.

11 hours ago, mikeszekely said:

... the 7900 XTX will lag significantly behind the RTX 4090 with Ray Tracing (they expect around half the framerate you'd get with a 4090).  It also sounds like they're not super enthusiastic about FSR 3 (although, to be fair, their concerns seem to be how FSR 3 will generate extra frames, and they have similar issues with DLSS 3).  Finally, while the 7900 XTX looks like it's going to be great for gaming, a lot of non-gaming GPU-intensive stuff is designed specifically for CUDA, so for stuff like Blender the 7900 XTX might still underperform even compared to a RTX 3090.

While ray tracing is picking up steam, it's still not widely adopted. If the RX 7000-series does well, hopefully that will give R&D a little more money to ramp up ray-tracing development.


  • 2 weeks later...

Reviews for the RTX 4080 are out. Definitely better than the 3080, with around the same power usage, but the consensus seems to be "Is it worth the $1200 price tag? 🤷‍♂️" Yeah, the price of the RTX 4000-series seems to be a big sticking point for this generation.


  • 4 weeks later...

Reviews for the RX 7900 XT and XTX are out. Generally, the XTX performs slightly better than the 4080, and the XT slightly worse. In heavily raytraced games, the XTX performs at around 3090 Ti level.

There seem to be some odd game-specific quirks. Some games have even the XT outperforming the RTX 4090 (that's nine-zero). Others have the XTX scaling negatively with resolution. Some graphs had it at or below 6900 XT performance. Not sure if it's just a driver optimization issue or what, but odd outliers aside, drivers do seem alright overall.

AV1 encoding seems good. Production capability still lags behind Nvidia, though seemingly less so than in the past.

All in all, seems like a decent launch.


Average power usage appears in line with AMD's specs, averaging 350-400+W, with transient spikes reaching into the high 600W range (though this may be down to the specific card tested). It's not as power-efficient as the 4080, but it improves over last gen's 6950 XT.

We all know ray tracing is not AMD's strong suit, and it shows, but rasterization is much better on the XTX vs. the 4080.

Pricing vs. the 4080 is obviously better. If all you care about is frames per dollar, get the 7900 XTX. SFF build? AMD is the winner here on size and good ol' 8-pin power connectors.
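The frames-per-dollar framing is easy to sketch; the average-fps figures below are placeholders, so plug in real benchmark averages to make it meaningful:

```python
# Frames-per-dollar at MSRP. The avg_fps numbers are placeholder
# assumptions, not benchmark results; substitute your own measurements.
cards = {
    "RX 7900 XTX": {"usd": 999,  "avg_fps": 100.0},  # assumed fps
    "RTX 4080":    {"usd": 1199, "avg_fps": 97.0},   # assumed fps
}
for name, c in cards.items():
    fpd = c["avg_fps"] / c["usd"]
    print(f"{name}: {fpd * 1000:.1f} fps per $1000")
```

With anything like parity in raster performance, the $200 MSRP gap alone hands the value crown to the XTX.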

I'm still going over all the reviews that came out today but if you don't want to pay Nvidia's ridiculous premium, the 7900 XTX might be the way to go if you don't care about RT performance.


5 hours ago, technoblue said:

I think I’m going to wait to see what AMD’s board partners bring to the table for the 7000 series, but the reviews do seem hopeful. I just wish there was more representation in the sub $600US pricing block.

Yeah, I hope AMD brings price-performance value back to the mid- and lower-range segments. Their price drops during the tail end of this year have been great, and in line with how prices "should" behave in a regular market, but all the same it would be nice to have cards not priced sky-high for a market that was red hot with crypto demand to begin with. Nvidia is still milking those crypto-era prices, which I really hope backfires on them so they get back to some semblance of reason.


If you don't care about ray tracing and just want the most frames for your buck, the RX 7900 XTX looks like a good choice.  I guess I was hoping for a more decisive win over the RTX 4080, and the RTX 4090 is still the money-is-no-object absolute best.  The 7900 XT, though, should probably be cheaper; as it stands, for the performance difference you're really better off spending the extra Benjamin on the XTX.

I don't know that you can or should buy a card based on what it might do in the future, but I'll note that the 7000-series reviews mention weird performance dips and buggy drivers.  I seem to recall the 6000-series had similar issues at launch, but eventually got serious performance boosts through driver updates.  Maybe driver updates will give the 7900 XTX that decisive win over the 4080 I was hoping for.


I'm guessing some driver optimizations are needed, given the inconsistent gaming benchmarks. Some apps will also need updates to accommodate the RX 7000-series, seeing that some professional benchmarks did not run on the cards at all. I recall Nvidia pulled the same thing a few months ago with the 522.25 driver release, which magically brought ~10% performance gains in some games. A vBIOS update might also be needed to address the power draw. Of the benchmarks I saw, the 7900 XTX seemed... consistently flat on average power. While the 4080/4090 spike according to usage, the 7900 XTX didn't (i.e., power draw in games varies much more on Nvidia cards, while the 7900 XTX's average draw sat in the 300-350W range in every game).

The 7900XT does seem like it should have been cheaper for the performance.


There were also rumors of manufacturing problems plaguing N31 (i.e., the 7900 XTX and XT) that resulted in missing the target core clock by an average of ~700MHz. Coupled with similar rumors of low initial stock and of AMD re-taping N31, but also reports of GPUs hitting 2.5-3.0GHz, I'm going to guess they had to discard a lot of the most faulty silicon and rely on binned dies from the faulty lot for the launch batch of cards, which might be adding to the bugs and glitches (the power spikes, inconsistent scaling, etc.).

That's just speculation, of course. Here's hoping it is mainly driver issues and can be improved in due time.


  • 3 weeks later...

Man, I was so eagerly awaiting the 7900 XTX, thinking the performance would be where AMD claimed it would be, along with the performance-per-watt improvement they claimed. AMD was pretty truthful about the RDNA 2 GPUs and the Ryzen CPUs, so I had no reason to doubt them this time around. Too bad it was nowhere near the claimed 50-70% improvement over the 6950 XT, and the Lovelace chips actually turned out to be more power-efficient than the RDNA 3 chips.

After finding out what a disappointment the 7900 XTX was, I ended up getting an RTX 4090. No melting cable issues here, and I undervolted it, so it's drawing much less power while not really losing much performance.

Now I'm doubly glad I ended up getting the 4090 as it seems the reference design for the 7900 XTX is having heat issues, which der8auer believes is a problem with the vapor chamber design. Apparently there are far more reports of this issue than the melty connectors for the 4090 ever had. 


I really wanted to build a new computer this year but I'm still finishing up my house... gotta have priorities.

My son did have an issue with his (actually my wife's) ITX computer that made me go buy a graphics card. He loves Trailmakers, but it would freeze all the time while the music kept playing. The error log showed a DirectX issue where it was failing to generate a buffer. I ran FurMark and, surprisingly, no issues after 30 minutes. I ran a benchmark test, and usually around 10 minutes in I would get a "device hung" error. I ran a bunch of diagnostics and couldn't find any other obvious culprits. Tomorrow I'm yanking the 1060 and swapping in a 3050 with fingers crossed. Gotta track down an 8-pin power cable though...


1 hour ago, jenius said:

I really wanted to build a new computer this year but I'm still finishing up my house... gotta have priorities.

There's still time. 😁 All the big announcements have come already, and there should be no big ones until Summer 2023, so there are still lots of chances for sales between now and summer. I'm not hopeful, though. Most companies are eyeing lower revenue in 2023, so I don't see many massive price drops this year.

CES starts on Thursday (1/5/2023). We're expecting Nvidia to announce the 4080 12GB-turned-4070 Ti and AMD to release non-X Ryzen 7000-series CPUs. I'm more interested in the other tech that will show up at this year's show.

Quote

The error log showed a directx issue where it was failing to generate a buffer. I ran Furmark and surprisingly, no issues after 30 minutes. I ran a benchmark test and usually around 10 minutes of I would get a "device hung" error. I ran a bunch of diagnostics and couldn't find any other obvious culprits. Tomorrow, I'm yanking the 1060 and swapping in a 3050 with fingers crossed. Gotta track down an 8 pin power cable though...

I assume you tried updating or rolling back drivers and/or the vBIOS? Though, given that it is a 1060, I won't be shocked if the hardware was finally giving out.


10 hours ago, azrael said:

There's still time. 😁 All the big announcements came already and there should be no big announcements until Summer 2023 so there's still lots of chances for sales between now and summer. I'm not hopeful though. Most companies are eying lower revenue in 2023 so I don't see many massive price drops this year.

Thankfully, at the very least, the moratorium on the import tariffs - which, had it expired, would have made certain PC parts like GPUs and motherboards even more expensive - has been extended further.

Edited by MacrossJunkie

As expected, Nvidia has announced the 4080 12GB-turned-4070 Ti at $799, available January 5th. And it's actually a reasonable girth (a 2-slot card), unlike its beefier siblings. It's doubtful this will push the prices of 3070s/3070 Tis down, as most are still selling above "MSRP". 🙄


12 hours ago, azrael said:

There's still time. 😁 All the big announcements came already and there should be no big announcements until Summer 2023 so there's still lots of chances for sales between now and summer. I'm not hopeful though. Most companies are eying lower revenue in 2023 so I don't see many massive price drops this year.

CES starts on Thursday (1/5/2023). We're expecting Nvidia to announce the 4080 12GB  4070Ti and AMD to release non-X Ryzen 7000-series CPUs. I'm more interested in the other tech that will show up at this year's show.

I assume you tried updating or rolling back drivers and/or the vBIOS? Though, given that it is a 1060, I won't be shocked if the hardware was finally giving out.

I certainly did everything I could with drivers, but I did not update the vBIOS... huh, I'll give that a go. This will be my first time doing that.

Edited by jenius

So Murphy's law got me. The RTX 3050 arrived today, but I decided I would try to update the vBIOS first. Before that I checked for a BIOS update, and there was one. So, BIOS updated (not vBIOS), I decided to run one last test, also upped the voltage to the card, and wouldn't you know it, everything was stable. I played Trailmakers for 20 minutes without a single hang. I ran the benchmark that always used to fail on the second pass four times. Not true stress testing yet, but it seems I may have extended the life of my 1060 a bit longer. Now to decide if I should return the 3050...

