
I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.  If not, I found an alternative, but I was very used to/fond of the one I had been using.     (my main photo viewer---I have thousands of pics of planes, trains, etc, and reference/load them CONSTANTLY for model references etc, so having a super-fast photo viewer is key---and Windows' sucks!) 


2 hours ago, David Hingtgen said:

I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.  If not, I found an alternative, but I was very used to/fond of the one I had been using.     (my main photo viewer---I have thousands of pics of planes, trains, etc, and reference/load them CONSTANTLY for model references etc, so having a super-fast photo viewer is key---and Windows' sucks!) 

What photo viewer were you using? I reinstalled/re-enabled the old Windows Photo Viewer from Windows 7 on all my 10 PCs.


4 hours ago, David Hingtgen said:

I'm pretty sure I pinpointed it to a "program that Win10 no longer likes due to Win10's last update".   Uninstalled it, will re-install fresh and see if it behaves any better.

Yeah, if you updated to Windows 10 May update (ver 2004), expect a bunch of issues. There was a big patch released in July and another big one released in August which resolved a nasty SSD constant defrag issue.


Re-installed it; immediately got the exact same issue.  Maybe a future update will fix it (unlikely), but for now I'm re-assigning all my context commands etc. to the substitute... 

(My god does Windows try to instantly add a zillion right-click options: "edit with Paint," "edit with Photos," "print," "set as desktop background," "upload to Cloud"...)   So hard to just have the two things I want at the top of the list.   (I use, like, 3 different context-menu-editing programs just to get things set how I want, which is even more work than it took to get Win7 how I wanted.)   
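For what it's worth, some of the stock entries can be pruned with a plain .reg file instead of a third-party context-menu editor. A minimal sketch, using the commonly cited "Edit with Paint 3D" key for JPEGs (verify the key exists on your build, and export it as a backup before deleting):

```reg
Windows Registry Editor Version 5.00

; Removes "Edit with Paint 3D" from the right-click menu for .jpg files.
; Matching keys exist under .jpeg, .png, .bmp, etc.
[-HKEY_CLASSES_ROOT\SystemFileAssociations\.jpg\Shell\3D Edit]
```

The leading minus inside the brackets tells Registry Editor to delete the key rather than create it.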


I do wish Microsoft was a bit better about letting us turn things off. I understand that most computer users are idiots, and the OS needs to be idiot-proof, but god-damn is it annoying sometimes how little control you have even under the "advanced" options.

I really don't want to have to edit the registry to do things, but if they make me I will.


Was anyone else underwhelmed by the Nvidia 3000-series launch? They were hyping it through the roof as this huge game changer, and I dunno, it seemed like just your standard generational improvement. Maybe it seems more impressive simply because... Turing? Ampere? Whatever 2000-series was called... was such a disappointment. In retrospect, you can really see how much of a transitional generation those cards are.

RTX continuously fails to impress, both in terms of the games using it and the implementation itself. I'm continuously put off by the proprietary nature of it, and the continued massive hit to performance that it entails.

DLSS continues to be the most impressive technology Nvidia are working on. It's legitimately a great thing to have, and it's frustrating that their marketing doesn't focus on it more, instead using it simply to pad their ray-tracing and 4K performance numbers.

Regardless, I'm hoping AMD's event next month won't be as full of "pie in the sky" half-truths and cherry-picked numbers. At the very least, I hope the technology they're working on is more interesting. I forget where I saw it, maybe the Mark Cerny PS5 deep dive, but IIRC their raytracing solution seemed... ultimately less performant, but also more elegant than Nvidia's dedicated hardware solution, and more open to continued development. I believe it involves some/all GPU stream processors "switching mode" from traditional rasterization to ray calculations? I dunno. Either way, I hope this generation will be good competition.


44 minutes ago, kajnrig said:

DLSS continues to be the most impressive technology Nvidia are working on. It's legitimately a great thing to have, and it's frustrating that their marketing doesn't focus on it more, instead using it simply to pad their ray-tracing and 4K performance numbers.

See, DLSS bugs me because it is a feature created to justify otherwise-useless silicon.

Most people don't need hardware-accelerated AI, but nVidia wants to sell the same chip to AI researchers and game players (albeit at very different prices). So nVidia had to create a graphics feature that would run on AI hardware accelerators, and chose fuzzy-logic anti-aliasing.
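Whatever one thinks of the motivation, the arithmetic behind the feature is simple: shade fewer pixels, then let a reconstruction network fill in the target resolution. A back-of-the-envelope sketch (the internal resolution here is illustrative; DLSS "Quality" mode renders at roughly 67% of the target per axis):

```python
# Rough sketch of why DLSS-style upscaling saves GPU work: the expensive
# per-pixel shading happens at a lower internal resolution, and the
# upscaler reconstructs the output resolution afterward.

def shaded_pixels(width: int, height: int) -> int:
    """Pixels the GPU must shade per frame at a given render resolution."""
    return width * height

native_4k = shaded_pixels(3840, 2160)      # shading at full 4K
dlss_internal = shaded_pixels(2560, 1440)  # ~67% scale per axis (illustrative)

savings = native_4k / dlss_internal
print(f"Native 4K shades {native_4k:,} pixels per frame")
print(f"Upscaled render shades {dlss_internal:,} (~{savings:.2f}x fewer)")
```

At a 67%-per-axis internal resolution that works out to shading 2.25x fewer pixels per frame, which is where most of the advertised "free" performance comes from.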


2 hours ago, kajnrig said:

Was anyone else underwhelmed by the Nvidia 3000-series launch? They were hyping it through the roof as this huge game changer, and I dunno, it seemed like just your standard generational improvement. Maybe it seems more impressive simply because... Turing? Ampere? Whatever 2000-series was called... was such a disappointment. In retrospect, you can really see how much of a transitional generation those cards are.

They're better...if you compare them to the 1st-generation RTX cards, not the gen-1.5 cards. :D I haven't seen all the big reviews yet, but they seem to point to "the card to finally woo all the non-first-adopters".


2 hours ago, kajnrig said:

Was anyone else underwhelmed by the Nvidia 3000-series launch?

Still waiting to see benchmarks, but my opinion is basically that if I had a 10-series card I'd upgrade, but I have an RTX 2080 so I'm going to wait for whatever comes next.  40-series, I guess.

I'm actually pretty happy with my current setup.  I can play most games at or near max settings at close to 60fps in 4K on my desktop, while the 2060 Max-Q does the same (or better) at 1080p (which is fine, because it only has a 14" 1080p display).


2 hours ago, JB0 said:

See, DLSS bugs me because it is a feature created to justify otherwise-useless silicon.

Most people don't need hardware-accelerated AI, but nVidia wants to sell the same chip to AI researchers and game players (albeit at very different prices). So nVidia had to create a graphics feature that would run on AI hardware accelerators, and chose fuzzy-logic anti-aliasing.

Oh it also involves dedicated hardware? Well, never mind, then, there goes all of my interest in that. I thought it was a software thing, but that would explain why it's missing from 10-series cards.


10 hours ago, mikeszekely said:

Still waiting to see benchmarks, but my opinion is basically that if I had a 10-series card I'd upgrade, but I have an RTX 2080 so I'm going to wait for whatever comes next.  40-series, I guess.

I'm actually pretty happy with my current setup.  I can play most games at or near max settings at close to 60fps in 4K on my desktop, while the 2060 Max-Q does the same (or better) at 1080p (which is fine, because it only has a 14" 1080p display).

Yeah, after almost 10 years, I picked up a new CPU in February, and the 2080 super is taking everything I throw at it, so I'm in no rush to change anything.

Not to mention, you just know that any new generation of hardware is going to force you onto whatever the newest version of Windows is, and I have gone to extraordinary lengths to bind and gag Win10 1909 until you can't even hear it whimper.  I have the computer running smoothly the way I want, and I have no intention of changing that for at least a couple years.


I am planning to get an RTX 3080. I stopped serious gaming for a while; I don't know if it's because I lost interest or because my current PC can't run anything new. I want to go 4K.  I need a new computer and monitor one of these days.  I am running an i7-2600 CPU.  The computer is slowing down a bit, and I need more RAM and some USB 3.0 ports. 

I am definitely building it myself this time.


  • 2 weeks later...

So...the 3080 launch was a launch-then-crash-and-burn. Nvidia is using higher-end capacitors, while some board partners are using cheaper ones that are causing crash-to-desktop errors when the GPU exceeds 2.0 GHz.

And the 3090...falls into the "not-really-worth-the-price-tag-unless-you-really-need-it" category. :unsure:

I'm moving to "cautiously-optimistic" on that 3070 release in a couple of weeks.


18 minutes ago, azrael said:

I'm moving to "cautiously-optimistic" on that 3070 release in a couple of weeks.

The 3090 seems to have features that are good for content creators, but the gaming performance over the 3080 was super disappointing. And the 3080, while an improvement over the 2080, was hardly the leap Nvidia was suggesting.

At this rate I'm thinking the 3070 will be the sweet spot in terms of cost vs performance if you're building a new machine or upgrading from a 10-series or older. But they're really not giving me a compelling reason to replace a 2080.


Apparently the 3000-series crash-to-desktop issues are a combination of AIBs using lower-quality capacitors (that are still within Nvidia's spec, so don't blame them) and Nvidia not providing proper drivers for AIBs to validate their card designs pre-launch and catch any such potential problems.

There have been a bunch of videos, write-ups, AIB statements, etc., regarding the matter. All in all, it seems like Nvidia really mucked things up for themselves and their board partners, marring what is an otherwise pretty good product.

They've released updated drivers now, which IIRC EVGA has said tentatively fix the stability issues on their lower-end cards.


On 9/28/2020 at 12:35 PM, mikeszekely said:

The 3090 seems to have features that are good for content creators, but the gaming performance over the 3080 was super disappointing.

Which is how they were pitching it until immediately before launch. Then marketing suddenly decided "This isn't a new Titan, it is the most powerful gaming card ever wrought by the hands of mortals!", and... well, it isn't THAT good at games.

 

On 9/28/2020 at 12:40 PM, technoblue said:

I have a newer Ryzen system with a 10-series card and an older Ivy Bridge system (which I'm upgrading soon).

The upgraded Ivy Bridge system will probably get a new video card, but I'm willing to wait it out and see how things settle down after Big Navi.

Haswell and a Radeon 580 here.

I'm starting to feel the urge to rebuild, but it isn't necessary. The loss of my VR space greatly reduced my power needs.


4 hours ago, JB0 said:

Haswell and a Radeon 580 here.

I'm starting to feel the urge to rebuild, but it isn't necessary. The loss of my VR space greatly reduced my power needs.

My Ryzen system is an R5 3600 mini-ITX build that replaced a larger Haswell HTPC. I’m looking to repurpose the Haswell system as a DIY NAS but haven’t had time to dive into the options.

The Ivy Bridge system is my daily driver. The upgraded parts should be able to handle a little bit of everything, while also running games well at UWHD resolutions. I’m thinking of using the R9 3900X for this build. 


I can't keep track of all the names... Haswell was 4th-gen, and Ivy Bridge was 3rd-gen, right?  All I know is that in January I went from a 3rd-gen i7 and a GTX 970 to a 9th-gen i7 and an RTX 2080.  If I'd held off a few more months then sure, I'd have probably gone with a Ryzen 9 3900 and an RTX 3080, but in my experience there's always something better around the corner, and I don't really regret it.  I'll probably stick with the CPU for a few years, and the GPU until there's an RTX 4080.

I retired my laptop this year, too.  At home I strongly prefer my desktop, but we have family abroad, and prior to COVID I had to travel often enough that I like to keep a thin-and-light gaming laptop.  My old one was a Razer Blade with a 4th-gen i7 and a GTX 970m.  My new laptop is an Asus Zephyrus with a Ryzen 9 4900HS and an RTX 2060 Max-Q.  I find that it complements my desktop setup pretty well; I have a 4K display for my desktop, and I've found that if I can get close to 60fps at 4K with given settings in a game on my desktop, then I'll get the same or better framerate with the same settings at 1080p (the laptop's display resolution).

Although, I wonder how that laptop handles ray tracing?  I just started playing Control, and it's the first game I've tried that supports it.  Seems to run fairly well on my desktop, but I haven't tried it on my laptop.  Maybe I'll do that tomorrow.


Right. Haswell motherboards would take fourth generation i7-4###, i5-4###, and i3-4### CPUs. Ivy Bridge was an update to Sandy Bridge and supported third generation CPUs. My desktop is currently running an i5-3570k. My old HTPC was running an i3-4170t before the Ryzen upgrade. I like to keep that system cool and quiet. Moving to the R5 3600 was a nice boost.

I don’t have my own personal laptop at the moment, so I’m not really familiar with what’s available in that space. The job gave me a Dell laptop, but it’s outfitted for business use.

Edited by technoblue

4 hours ago, technoblue said:

I don’t have my own personal laptop at the moment, so I’m not really familiar with what’s available in that space. The job gave me a Dell laptop, but it’s outfitted for business use.

It depends a lot on what you're going for.  It boils down to cheaper, general-use stuff; super-thin, high-end laptops with powerful CPUs but still using integrated GPUs; gaming laptops with some sacrifices to stay thin and light; and behemoth gaming laptops with the same GPU chipsets as desktop cards.  I fall into the third group: the RTX 2060 Max-Q is clocked lower than the desktop 2060 for thermal reasons, so they can stuff it into a 14" laptop that's just a tad thicker and heavier than a MacBook.  But since I do most of my gaming on a desktop and just need something to play on when I travel, that's a fair sacrifice.

I would say that anyone shopping for a laptop should look at ones with Ryzen.  The Ryzen 9 4900HS in my laptop scored better on most benchmarks than the i7-9700K in my desktop.


1 hour ago, kajnrig said:

What's the Core generation that everyone seems to agree was a sort of paradigm shift in processor tech? The one that overclocks really well and is seen as still being viable even today? Was it the 2000 chips? 3000? 4000?

Not sure about the older generations. Haswell/4th-gen (the 4000 series) was probably the pinnacle back in 2013. My ITX box still uses a Haswell chip. The 6th-gen Skylake chips were likely the next jump, but that's also where Intel goofed and started trending down. 2018 was when people started saying "WTF" about Intel. Intel traditionally overclocks well, but that's all they had going for them after Skylake. When I last upgraded, I had the option to do 8th- or 9th-gen Coffee Lake. At the time, the difference between the two generations was negligible, so I opted for 8th-gen. 


2 hours ago, kajnrig said:

What's the Core generation that everyone seems to agree was a sort of paradigm shift in processor tech? The one that overclocks really well and is seen as still being viable even today? Was it the 2000 chips? 3000? 4000?

Hm. I would say the Intel chipset that first got the attention of enthusiasts was Sandy Bridge, so that's the 2000-series CPUs. Ivy Bridge was the same 1155 socket with minor updates. Haswell/Broadwell added quality-of-life improvements (additional USB ports and SATA lanes) and support for 1150-socket CPUs. Intel also had motherboards which supported their Xeon processors, in case you wanted to build your own in-house server. After Broadwell, we're getting into the recent 1151- and 1200-socket "Lake" CPUs. I agree with @azrael about these later generations. The lines started to blur.

AMD had a rougher road. I'm trying to think back. The Athlon XP and the AMD 760 chipset were solid. The Opteron was a good update, but I remember the AMD 8000 chipset being a little flaky back in the day. AMD made its next big splash with the Athlon64, Phenom, and Turion processors, all of which received incremental updates and chipsets to match. AMD faltered with Bulldozer. Now we have the AM4 socket with Zen CPU support, though, so things are looking up.

2 hours ago, mikeszekely said:

It depends a lot on what you're going for.  It boils down to cheaper, general-use stuff; super-thin, high-end laptops with powerful CPUs but still using integrated GPUs; gaming laptops with some sacrifices to stay thin and light; and behemoth gaming laptops with the same GPU chipsets as desktop cards.  I fall into the third group: the RTX 2060 Max-Q is clocked lower than the desktop 2060 for thermal reasons, so they can stuff it into a 14" laptop that's just a tad thicker and heavier than a MacBook.  But since I do most of my gaming on a desktop and just need something to play on when I travel, that's a fair sacrifice.

I would say that anyone shopping for a laptop should look at ones with Ryzen.  The Ryzen 9 4900HS in my laptop scored better on most benchmarks than the i7-9700K in my desktop.

Thanks! I may start looking again down the road, but I'm not in a rush now. I do like the idea of having a thin client to write on when I'm traveling, especially one that sips power when it's on battery. The Zen CPUs do offer a lot of promise in portable units. I'm also curious to see how the Zen 3 5000 series could change the landscape with integrated graphics.

Edited by technoblue

33 minutes ago, technoblue said:

Thanks! I may start looking again down the road, but I'm not in a rush now. I do like the idea of having a thin client to write on when I'm traveling, especially one that sips power when it's on battery. The Zen CPUs do offer a lot of promise in portable units. I'm also curious to see how the Zen 3 5000 series could change the landscape with integrated graphics.

In my travels with a laptop, lightweight is definitely preferable. I don't normally game while traveling so a dedicated GPU is not on my radar. I use a 13" Macbook Pro for home and as backup to my desktop (It's basically a desktop lite). For my work laptop, I use a Macbook Air.

That being said, get a Ryzen laptop, if you can find a good one. Performance-wise, it's a better value right now.
 


  • 2 weeks later...
  • 3 weeks later...

RTX 3070 reviews are showing up, and Founders Edition cards are now on sale. Yeah, it holds up to Nvidia's claims, for the most part.

Now for the other question...can anyone actually get a card? Given the 3rd-party card fiasco, I'll wait until things settle before grabbing one.


The full event, in case you wanted to watch it:

All in all, it all looks compelling enough, but I was hoping to get a deeper dive not just into the hardware but the software as well. I suppose that's what Gamers Nexus is for...

I'm most curious about the Smart Access Memory feature, and whether it can be implemented on earlier Ryzen generations or even Intel CPUs, or if one could do the same with Nvidia cards, etc. It's a very console-like feature.
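On the hardware side, Smart Access Memory is AMD's branding for PCIe Resizable BAR: instead of the traditional fixed 256 MB CPU-visible window into VRAM, the CPU can map the whole framebuffer at once. A toy sketch of what that buys (the transfer and VRAM sizes are illustrative, not measured):

```python
import math

APERTURE_MB = 256    # classic fixed BAR window into VRAM
VRAM_MB = 16 * 1024  # e.g. a hypothetical 16 GB card

def window_remaps(transfer_mb: int, aperture_mb: int) -> int:
    """Number of aperture-sized windows the CPU must step through
    to directly touch transfer_mb of VRAM."""
    return math.ceil(transfer_mb / aperture_mb)

# Uploading a 2 GB asset pool: legacy 256 MB window vs. full-size BAR.
legacy = window_remaps(2048, APERTURE_MB)  # remap the window repeatedly
resizable = window_remaps(2048, VRAM_MB)   # whole VRAM is CPU-visible
print(legacy, resizable)                   # prints: 8 1
```

The remap count is the console-like part: with the full BAR exposed, the driver skips the windowing bookkeeping entirely, which is why the feature depends on platform (BIOS/chipset) support rather than the GPU alone.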

