
I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.  If not, I found an alternative, but I was very used to/fond of the one I had been using.     (my main photo viewer---I have thousands of pics of planes, trains, etc, and reference/load them CONSTANTLY for model references etc, so having a super-fast photo viewer is key---and Windows' sucks!) 

2 hours ago, David Hingtgen said:

I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.  If not, I found an alternative, but I was very used to/fond of the one I had been using.     (my main photo viewer---I have thousands of pics of planes, trains, etc, and reference/load them CONSTANTLY for model references etc, so having a super-fast photo viewer is key---and Windows' sucks!) 

What photo viewer were you using? I reinstalled/re-enabled the old Windows Photo Viewer from Windows 7 on all my Win10 PCs.
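
For anyone who wants to do the same: PhotoViewer.dll still ships with Win10; it just has no file associations registered. The commonly shared registry tweak looks roughly like this (the exact key names and the CLSID are my recollection of the widely circulated Win7-restore snippet, not gospel, so verify against your own build and back up your registry first):

```reg
Windows Registry Editor Version 5.00

; Re-register Windows Photo Viewer as an "Open with" target for images.
; Key names/CLSID assumed from the common Win7-restore tweak; verify
; in regedit before importing, and export the keys first for an easy undo.
[HKEY_CLASSES_ROOT\Applications\photoviewer.dll\shell\open\command]
@="C:\\Windows\\System32\\rundll32.exe \"C:\\Program Files\\Windows Photo Viewer\\PhotoViewer.dll\", ImageView_Fullscreen %1"

[HKEY_CLASSES_ROOT\Applications\photoviewer.dll\shell\open\DropTarget]
"Clsid"="{FFE2A43C-56B9-4bf5-9A79-CC6D4285608A}"
```

After importing, Photo Viewer shows up under "Open with", and you can set it as the default per extension from there.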

4 hours ago, David Hingtgen said:

I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.

Yeah, if you updated to the Windows 10 May 2020 Update (version 2004), expect a bunch of issues. There was a big patch released in July and another big one released in August, which resolved a nasty constant-SSD-defrag issue.


Re-installed---immediately got the exact same issue. Maybe a future update will fix it (unlikely), but for now I'm re-assigning all my context commands etc. to the substitute...

(My god, does Windows try to instantly add a zillion things: "edit with Paint", "edit with Photos", "print", "set as desktop background", "upload to Cloud" right-click options...) It's so hard to just have the two things I want at the top of the list. (I use, like, 3 different context-menu-editing programs just to get things set how I want, which is even more work than it took to get Win7 how I wanted.)


I do wish Microsoft was a bit better about letting us turn things off. I understand that most computer users are idiots, and the OS needs to be idiot-proof, but god-damn is it annoying sometimes how little control you have even under the "advanced" options.

I really don't want to have to edit the registry to do things, but if they make me I will.
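
Case in point, from the context-menu complaints above: a lot of those right-click entries can be killed with a couple of registry deletions instead of a third-party tool. A sketch (key names are my assumption from common Win10 builds, and they do move around between updates, so check them in regedit first):

```reg
Windows Registry Editor Version 5.00

; The leading "-" deletes the key on import.
; Removes "Edit with Paint 3D" from the JPEG/PNG right-click menus;
; key names vary by build, so verify them in regedit before importing.
[-HKEY_CLASSES_ROOT\SystemFileAssociations\.jpg\Shell\3D Edit]
[-HKEY_CLASSES_ROOT\SystemFileAssociations\.png\Shell\3D Edit]
```

Export the keys before deleting them if you want an easy undo.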


Was anyone else underwhelmed by the Nvidia 3000-series launch? They were hyping it through the roof as this huge game changer, and I dunno, it seemed like just your standard generational improvement. Maybe it seems more impressive simply because Turing (the 2000-series) was such a disappointment. In retrospect, you can really see how much of a transitional generation those cards were.

RTX continuously fails to impress, both in terms of the games using it and the implementation itself. I'm continuously put off by the proprietary nature of it, and the continued massive hit to performance that it entails.

DLSS continues to be the most impressive technology Nvidia are working on. It's legitimately a great thing to have, and it's frustrating that their marketing doesn't focus on it more, instead using it simply to pad their ray-tracing and 4K performance numbers.

Regardless, I'm hoping AMD's event next month won't be as full of "pie in the sky" half-truths and cherry-picked numbers. At the very least, I hope the technology they're working on is more interesting. I forget where I saw it, maybe the Mark Cerny PS5 deep dive, but IIRC their raytracing solution seemed... ultimately less performant, but also more elegant than Nvidia's dedicated hardware solution, and more open to continued development. I believe it involves some/all GPU stream processors "switching mode" from traditional rasterization to ray calculations? I dunno. Either way, I hope this generation will be good competition.

44 minutes ago, kajnrig said:

DLSS continues to be the most impressive technology Nvidia are working on. It's legitimately a great thing to have, and it's frustrating that their marketing doesn't focus on it more, instead using it simply to pad their ray-tracing and 4K performance numbers.

See, DLSS bugs me because it is a feature created to justify otherwise-useless silicon.

Most people don't need hardware-accelerated AI, but Nvidia wants to sell the same chip to AI researchers and game players (albeit at very different prices). So Nvidia had to create a graphics feature that would run on the AI hardware accelerators, and chose fuzzy-logic anti-aliasing.

2 hours ago, kajnrig said:

Was anyone else underwhelmed by the Nvidia 3000-series launch? They were hyping it through the roof as this huge game changer, and I dunno, it seemed like just your standard generational improvement. Maybe it seems more impressive simply because Turing (the 2000-series) was such a disappointment. In retrospect, you can really see how much of a transitional generation those cards were.

They're better... if you compare them to the 1st-generation RTX cards, not the gen-1.5 (Super) cards. :D I haven't seen all the big reviews yet, but they seem to point to "the card to finally woo all the non-early-adopters".

2 hours ago, kajnrig said:

Was anyone else underwhelmed by the Nvidia 3000-series launch?

Still waiting to see benchmarks, but my opinion is basically that if I had a 10-series card I'd upgrade, but I have an RTX 2080 so I'm going to wait for whatever comes next.  40-series, I guess.

I'm actually pretty happy with my current setup.  I can play most games at or near max settings near 60fps and 4k on my desktop, while the 2060 Max-Q does the same (or better) at 1080p (which is fine, because it only has a 14" 1080p display).

2 hours ago, JB0 said:

See, DLSS bugs me because it is a feature created to justify otherwise-useless silicon.

Most people don't need hardware-accelerated AI, but Nvidia wants to sell the same chip to AI researchers and game players (albeit at very different prices). So Nvidia had to create a graphics feature that would run on the AI hardware accelerators, and chose fuzzy-logic anti-aliasing.

Oh it also involves dedicated hardware? Well, never mind, then, there goes all of my interest in that. I thought it was a software thing, but that would explain why it's missing from 10-series cards.

10 hours ago, mikeszekely said:

Still waiting to see benchmarks, but my opinion is basically that if I had a 10-series card I'd upgrade, but I have an RTX 2080 so I'm going to wait for whatever comes next.  40-series, I guess.

I'm actually pretty happy with my current setup.  I can play most games at or near max settings near 60fps and 4k on my desktop, while the 2060 Max-Q does the same (or better) at 1080p (which is fine, because it only has a 14" 1080p display).

Yeah, after almost 10 years I picked up a new CPU in February, and the 2080 Super is taking everything I throw at it, so I'm in no rush to change anything.

Not to mention, you just know that any new generation of hardware is going to force you onto whatever the newest version of Windows is, and I have gone to extraordinary lengths to bind and gag Win10 1909 until you can't even hear it whimper.  I have the computer running smoothly the way I want, and I have no intention of changing that for at least a couple years.


I am planning to get an RTX 3080. I stopped serious gaming for a while; I don't know if it's because I lost interest or because my current PC can't run anything new. I want to go 4K. I need a new computer and monitor one of these days. I am running an i7-2600 CPU. The computer is slowing down a bit, and I need more RAM and some USB 3.0 ports.

I am definitely building it myself this time.

2 weeks later...

So... the 3080 launch was a launch, then crash, then burn. Nvidia used higher-end capacitors on its own boards, while some board partners used cheaper ones that are causing crashes to desktop when the GPU boosts above 2.0 GHz.

And the 3090 is falling into the "not-really-worth-the-price-tag-unless-you-really-need-it" category. :unsure:

I'm moving to "cautiously-optimistic" on that 3070 release in a couple of weeks.

18 minutes ago, azrael said:

I'm moving to "cautiously-optimistic" on that 3070 release in a couple of weeks.

The 3090 seems to have features that are good for content creators, but the gaming performance over the 3080 was super disappointing. And the 3080, while an improvement over the 2080, was hardly the leap Nvidia was suggesting.

At this rate I'm thinking the 3070 will be the sweet spot in terms of cost vs performance if you're building a new machine or upgrading from a 10-series or older. But they're really not giving me a compelling reason to replace a 2080.


I have a newer Ryzen system with a 10-series card and an older Ivy Bridge system (which I'm upgrading soon).

The upgraded Ivy Bridge system will probably get a new video card, but I'm willing to wait it out and see how things settle down after Big Navi.

