Pureoverclock: PC Hardware reviews and news for overclockers!

 
 
 
 
 

Posts Tagged ‘nvidia’

16-Phase VRM!?!?!?! Galax Unveils GeForce GTX 1080 Ti HoF Edition

I don't publish much on individual GPUs these days. It's not because I don't want to; it's just that there are a ton of them and my time is pretty limited. That doesn't mean I won't speak up about something that blows my mind a bit. Galax just unveiled the Hall of Fame Edition of their GTX 1080 Ti and the beast has a 16-phase VRM! That doesn't include the additional 3 phases for the memory. It takes a lot of power to feed 19 phases, so it should come as no surprise that the card has three 8-pin power connectors. While the bracket looks to take only two slots, the fans stick out far enough to make it a 3-slot card. On the other hand, can you imagine what kind of overclock you could get if you put the GPU under a water block? As long as you don't hit the silicon ceiling first, I imagine liquid cooling would give you a lot of headroom. Check out the article link for more details, as well as the gallery for a bit of a closer look. I didn't see a retail price for this beast, but I'm pretty sure it will cost your next unborn child if you want to purchase one. Still, Galax looks like they're doing a great job here and it's nice to cover a name that we don't see in the headlines all the time! http://wccftech.com/galax-geforce-gtx-1080-ti-hof-unveiled/ (Caution: Galax has creepy clown faces on their site. Wha….Hu….WHY!?!) http://www.galax.com/en/
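For context, and assuming the standard PCI Express power limits rather than anything Galax has actually stated: each 8-pin connector is rated for 150 W and the slot itself supplies up to 75 W, so the theoretical power budget for this board works out to

\[ 3 \times 150\,\mathrm{W} + 75\,\mathrm{W} = 525\,\mathrm{W}, \]

which is far more than a stock 1080 Ti's roughly 250 W TDP calls for. That kind of headroom only really matters for extreme overclocking, which is exactly the crowd a Hall of Fame card is aimed at.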




Nvidia Does it Again! GTX 1080 Ti for Ultimate Gaming Performance

People at this site might think I'm a fanboy of a certain company if they follow me long enough. While I understand that entirely, you have to realize I have a huge sense of awe at what Nvidia has done over the years. They have graphics cards that cost a bit more, but they also have the performance and quality to merit it. Even the GTX 970 memory controversy seemed to me more like one of those occasional oversights than a company trying to pull one over on consumers. The GTX 1080 Ti is the pinnacle of Pascal technology and if I had $700 free and clear, you know I'd own one! By now, we understand the basic concepts of Pascal: excellent power efficiency, great gaming power, and the same kind of overclocking potential Maxwell had. A couple of standout features of the 1080 Ti are its 11 GB of GDDR5X memory, 3584 CUDA cores and a 7-phase VRM design that I'm sure some OEMs will improve upon. It's a beast of a card and, in what's becoming the usual trend for Nvidia now, it offers some really good value compared to its Titan predecessor. Check out Nvidia's site at the link below for more details and remember, I'm a fanboy of great hardware first and foremost! https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/?nvid=nv-int-g7-8270




CES 2017: RBT, Samsung, Nvidia, Intel

This is it, the last bit of our coverage of CES 2017. We’ll have one more piece that will go through our top picks, but this is the last of the show floors we were able to get to. We had a mouse that’s trying to put an end to carpal tunnel, some great storage and monitor solutions from Samsung, the HDR display I’ve been dreaming of and some better explanations of what is going on with the architecture in Kaby Lake. CES 2017 was a great show and it sets us up for a great looking year for PC hardware. As always, feel free to leave your comments about what you loved in the forums and keep checking PureOC for more upcoming news and reviews!




October News Round Up: SWIFTECH Apogee XL2, GAMER STORM Genome II, GTX 1050 Ti and More!

Welcome back to our second roundup of the hardware news that happened this month. The CPU and GPU side of things may have slowed down a tad, but a lot happened with cooling and cases. Let's dive in and get excited about what's coming down the road!

CPUs

Let's start out with an update on Kaby Lake. Geekbench came out with some impressive results and the story made it look like INTEL has merely been holding back for a while. Recently, however, an i5-7600K sample was benchmarked and only showed about a 10% improvement over Skylake. This is actually pretty disappointing, especially considering that the 4790K is still a top-tier single-thread performer. On the plus side, it looks like all 100 series motherboards will be able to support Kaby Lake with a BIOS update, but I doubt many will see the need to upgrade from Skylake on such a short timetable. For the time being, ASUS is the only brand this is confirmed for, but that's a good indication that other manufacturers will follow suit. We're still a ways out from release, so expect reported performance to fluctuate a bit until we get closer. http://wccftech.com/intel-core-i5-7600k-kaby-lake-cpu-benchmarks/ http://wccftech.com/kaby-lake-support-asus-100-series-motherboards-confirmed/

GPUs

The big news this month was the release of the GTX 1050 Ti and the 1050. Each GPU offers a great budget option for 1080p gaming while being extremely energy efficient. Rather than reinvent ...




September News Round Up: Zen, Kaby Lake, GTX 1050 and More

How does a freelance writer keep up with everything happening in the industry when he has a full-time job, a family, reviews to keep up with and a puppy in training? It's worth pointing out that a lot has happened in the month of September this year. AMD has been making announcements about Zen, INTEL has shared some details about Kaby Lake and NVIDIA is talking about the GTX 1050. Rather than try to keep up with every piece of news as it releases, we're going to try something new: a once-a-month update to catch you up on what's happening in the industry. Welcome to the PureOC Monthly News! On the last Friday of each month, we'll pull together all the news that feels relevant and give you a one-stop location to get your info from. So without further ado, let's get into what's happened this month.

CPUs, Motherboards and Chipsets

AMD Zen, Grey Hawk, X370 Chipset and A12-9800

We'll start with AMD this time around since much of the anticipation about what will happen in the CPU sector rests on them. We got some more detailed shots of the new Zen chip and the 1331-pin design is confirmed. As far as new info is concerned, though, we didn't get much this month. The A12-9800, on the other hand, had some very interesting leaks. Some performance scores showed up and even though the new AM4 chip is still using Excavator cores, it has some reasonable improvements. What was really interesting was an overclocking result. So...




The New GPUs are Great, but the Opinions are Terrible

There's been so little time and so much going on that it's killing me not to cover all the great things happening in the GPU market lately. The GTX 1080 easily conquered the most-powerful-graphics-card spot, the GTX 1070 fits perfectly into the affordable high-performance spot, and now the RX 480 is a great mid-range mainstream card. I want to focus on what NVIDIA and AMD have done for the market, but I also want to bring up how the fanboy wars seem to be destroying the PC building communities as we know them. The GTX 1070 is easily the unsung hero of the 2016 release cycle so far. While it's ideally positioned as a 1440p card, 4K isn't a stretch with some settings tweaks. The Pascal architecture not only destroys DirectX 11 titles with ease but, much like Maxwell, it overclocks extremely well. Releasing so close after the 1080 overshadowed the card, although it will likely be more popular than the 1080 in the long run. The only serious drawback is that because supply isn't high for the time being, retailers are selling the cards at prices well over the suggested retail value. Asynchronous compute support has come up a few times, but async isn't used widely enough to be an issue for the time being. My hope is that if async becomes a boon for game development, NVIDIA doesn't keep muddying the waters with their "support." Of course, this week was the big week for the RX 480 release. I think many of us in th...




ASUS ROG Strix GTX 1080: GPU Fan Headers Say What!?!

The GTX 1080 has been making waves ever since its release last week. FinFET is finally here and the wait has been entirely too long! NVIDIA released their own Founders Edition that left most of us thinking we'll just wait until the manufacturers bring out their own variants, but sometimes it can feel like we already know what we're going to see. However, ASUS surprised me with a feature that sounds like the perfect addition for a high-end GPU: a fan header on the board. You're probably thinking I've lost it at this point. "You're seriously getting excited about a fan header!?!" Just bear with me for a sec! Admittedly, this isn't the kind of feature the majority of users should be particularly concerned about. The ROG Strix has plenty of other things going for it anyway, so I'm sure it's going to be a much-coveted card. Where this excites me is that I've been an SLI/CrossFire user for many years now. I understand all the impracticality of it, but what will never change is the fact that I think a computer looks 500 times better with two cards. Unfortunately, these configurations also generate a lot more heat. Under an intensive load, the GPU fans can get revved up pretty high, and even though your computer can stay quiet at idle, loads can bring more fan noise than desired. Most of the time, my case fan speeds are set in stone at a level that doesn't bot...




NVIDIA Keeps saying Async Support…

"You keep using that term. I do not think it means what you think it means." – Inigo Montoya. Before anyone gets upset about my rant, I want to mention that AMD has been just as guilty of marketing hype. That's what Pascal's async support is looking like it amounts to, and I'm getting tired of companies using the politician's tactic of repeating a lie to try and convince "dumb" consumers to believe it. A.) We're smarter than that. B.) The GTX 1080 is such a great release that it doesn't need false hype to brag it up! Admittedly, on a technicality the 1080 could be said to have async support, but it's beginning to look like NVIDIA is using some marketing tricks to smokescreen what's really going on. A benchmark came out showcasing various tests with Ashes of the Singularity and a GTX 1080. While Pascal fared better than Maxwell, the results aren't particularly good. In some cases we see performance drop again from DirectX 11, and the gains that were made are practically unnoticeable. The key quote from WCCF for understanding how Pascal supports async is here: "Dynamic load balancing and improved pre-emption both improve the performance of async compute code considerably on Pascal compared to Maxwell. Although principally this is not exactly the same as Asynchronous Shading or Computing. Because Pascal still can't execute async code concurrently without pre-emption. ...
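To pin down what the term actually means at the API level, here's a minimal, hypothetical DirectX 12 sketch (not taken from any shipping engine): "async compute" is just the application creating a compute queue next to the graphics queue and feeding both. Whether the GPU truly runs the two concurrently, as GCN's ACEs do, or falls back to pre-emption and time-slicing, which is the Pascal question above, is entirely a hardware and driver decision the API can't express.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter (illustrative; no swap chain).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Graphics ("direct") queue: where the frame's draw calls go.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Compute queue: independent dispatches (post-processing, particles,
    // lighting) submitted alongside the graphics queue.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // "Async compute" is simply work living on computeQueue while gfxQueue
    // is busy. Whether the GPU overlaps them concurrently or time-slices
    // them via pre-emption is invisible to the application.
    return 0;
}
```

So when a benchmark like the Ashes run above shows little or no gain with the compute queue active, that's the tell: the work is being serialized rather than overlapped.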




NVIDIA GTX 1080 Review Round Up

You might be wondering why there was so little news on the site about the upcoming Pascal GPUs from NVIDIA. That's mostly because the info was pretty iffy most of the time. The great thing about the hardware industry is that new releases can be very exciting, but they can also be over-hyped. I sat back for some time trying to determine whether, and what, I should be covering about this release, ultimately deciding to wait until the reviews came in. Well, they're in, and the GTX 1080 is a monster of a GPU! Overall, NVIDIA looks like they really outdid themselves on this one, which is impressive considering how well they outdid themselves with Maxwell. The 1080 achieves performance that's almost impossible to believe and is certainly a worthy successor to the previous 900 series. The only head-scratcher that has some reviewers questioning it is the high price of the Founders Edition. Outside of that, NVIDIA has knocked it out of the park. Here's a list of reviews that have already been released so that you can easily browse through them at your leisure.

GTX 1080 Reviews
http://www.pcgamer.com/gtx-1080-review/
http://www.techspot.com/review/1174-nvidia-geforce-gtx-1080/
http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,1.html
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572.html
http://arstechnica.com/gadgets/2016/05/nvidia-gtx-1080-review/
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080...




Is the New AMD Right Around the Corner?

Rumors are flying everywhere about Polaris, Pascal, Zen and even Kaby Lake. While I love following the rumors, not much is being leaked that gives a concrete idea of performance, price, etc. What is catching my eye is what AMD is doing right now. Last week gave us the release of the 16.3 drivers. The release schedule on these drivers is certainly improving, but what really caught my eye were some of the fixes. The bug list is getting smaller and that's the kind of improvement enthusiasts like to see in drivers. We have to wait for Polaris and Zen to know if AMD's financial future will look brighter in the years to come, but what I'm seeing now indicates that they are already turning things around for the better. I already touched on the 16.3 Crimson driver, but I'll elaborate further on that. The bug fix I've been watching closely is one that involved AMD GPUs losing their clock speed settings during use. This is kind of a big deal and I was fairly certain it was affecting those who were overclocking their video cards. March's driver fixed that issue and it's off the list of known issues. In fact, the known issues seem pretty minor now, with most related to new game releases. There is the issue of the Gaming Evolved app causing games to crash that I'm hoping gets resolved soon, but that's mostly because it keeps crashing my WoW. At least it's easy to close as a temporary fix, but for those who use...



Hitman Taking a Contract on Asynchronous Shaders

AMD has been talking up their Asynchronous Compute Engines pretty much since DirectX 12 was announced. In short, these are hardware components in AMD GPUs that can hopefully be leveraged to add significant performance in games. We've been waiting for the final say for some time, and while certain beta releases have shown some promise, it's only official releases that will not only prove the benefit of Asynchronous Shaders but also help determine whether DirectX 12 really is the next big thing for gaming. AMD just shared some info that Hitman has been working specifically with them to take advantage of their Asynchronous Shaders, and it looks like we have about a month before the official release date. Whether or not Hitman is your kind of game, this will certainly be a big moment in the PC gaming industry. So keep your eyes peeled, because March 11th is the official release day for Hitman and I'm sure tech sites will be looking into the performance with DirectX 12 and various AMD and NVIDIA GPUs. Below is the full statement from AMD. AMD is once again partnering with IO Interactive to bring an incredible Hitman gaming experience to the PC. As the newest member to the AMD Gaming Evolved program, Hitman will feature top-flight effects and performance optimizations for PC gamers. Hitman will leverage unique DX12 hardware found in only AMD Radeon GPUs—called asynchronous compute engines—to handle heavier workloads and better image quality with...




AMD Has a Response to GameWorks with GPUOpen

We all want better graphics in games, but we'd like those improvements not to utterly destroy our framerates. We're beginning a new era of PC gaming that is bringing better performance from the software side, rather than completely relying on GPU manufacturers to make beefier chips. NVIDIA GameWorks is a developer toolkit released in 2014 that was not only supposed to give developers better hardware control, but also offer some rich features for better in-game graphics. Unfortunately, the results led to performance hits in various situations, causing varied opinions of its benefit. Now AMD is finally responding with their own developer tool called GPUOpen. AMD seems to be stepping up their game big time and I think GPUOpen will end up being a great thing for PC gamers. As the name implies, GPUOpen is open source, which means there will be a lot more minds trying to optimize the features for performance, as well as the ability to share the code without repercussion. This is a good way to bring developers on board with AMD, and it should help with some of the disparity we've seen in performance between NVIDIA and AMD GPUs in certain game titles. The other interesting thing about GPUOpen is what it could mean for console ports. Radeon technology is in just about every console right now, particularly the Xbox One and PS4. If the developer tool makes porting to PC that much easier, we may see more console exclusives make their way to PC, since development costs can be a major detractor. S...




What does FinFET mean for Gamers?

There are a ton of leaks and rumors circulating about the AMD Arctic Islands and NVIDIA Pascal GPUs slated to release next year. Right now, it's hard to get too excited about anything. Don't get me wrong, I believe the new releases are going to be phenomenal, but we're so far ahead of the release of these products that any actual performance numbers are still a long ways off. However, there are still a couple of pieces of leaked information that are intriguing and, quite frankly, should be getting gamers very excited about the games of 2017 and beyond. FinFET is the new transistor technology behind the 14nm and 16nm processes. Both TSMC and GlobalFoundries have managed to get their processes mature enough that they can begin mass production shortly, but we still have to see how good the yields are before we know what the new GPUs will cost. Since graphics cards have been stuck on the 28nm process for quite some time now, it makes sense that efficiency and performance are going to improve significantly. While AMD is claiming double the performance per watt, that can mean little until actual gaming performance is measured. Many times, that double-per-watt slogan translates into something quite a bit less than what it sounds like. However, the other factor is the massive number of transistors that can be squeezed into one die as a result of the shrink. The flagship chip should contain up to 18 billion transistors, over do...
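A quick bit of illustrative arithmetic (made-up numbers, not leaked specs) shows why "double the performance per watt" rarely means double the framerate. Take a 28nm card that delivers performance $P$ at 275 W; double the efficiency and spend only part of it on speed:

\[ \frac{P}{275\,\mathrm{W}} \;\xrightarrow{\times 2}\; \frac{2P}{275\,\mathrm{W}}, \qquad \text{so a card tuned for } 1.5P \text{ needs only } \frac{1.5P}{\,2P/275\,\mathrm{W}\,} \approx 206\,\mathrm{W}. \]

The vendor can bank the efficiency gain as lower power, lower clocks or a smaller, cheaper die instead of raw speed, which is why a perf-per-watt slogan tells you very little about actual gaming performance.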




DirectX 12 is a Great Thing for AMD

And everyone for that matter! But let's not harsh on my sacrifice of proper grammar for stylistic writing when we can focus on good things from DirectX 12. I'm a firm believer that by the end of next year, DirectX 12 and Vulkan are going to be taking the gaming world by storm. I got a chance to play around with Fable Legends and the graphics were downright amazing! Techspot recently did a comparison of DirectX 11 and 12 to show the FPS gain from a couple of different configurations in Ashes of the Singularity. While the improvements were nice overall, there were some particular gains with FX CPUs that I've been waiting to see for quite some time now. Let's start with the bad news. DirectX 12 is not Bulldozer's salvation. If anything, DirectX 12 is the final nail in that coffin, in the sense that new architecture is long overdue. Shifting away from single-thread performance was never a good move. I still believe AMD was on the right track in that we needed to move toward more utilization of multi-threading, especially in gaming, but that shouldn't have come at the sacrifice of single-thread performance. Zen is looking to solve these issues, but these initial results show an 8350 struggling to keep up with an i3. Even though the i3 is a later generation, we're still talking about a budget CPU beating out an enthusiast one for gaming. Now that we've got that out of the way though, let's get on to the good news for AMD. We still have only o...
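On the multi-threading point specifically: the biggest structural change in DirectX 12 is that command recording no longer funnels through a single immediate context the way DirectX 11 did, so a game can spread draw-call preparation across every core it has. Here's a rough, hypothetical sketch of that pattern (empty command lists, purely to show the shape, not production code):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // One allocator and one command list per worker thread.
    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each thread records its slice of the frame independently -- the part
    // DX11's single immediate context could not parallelize.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&lists, i] {
            // (Real code would record draws/dispatches here.)
            lists[i]->Close();
        });
    for (auto& t : workers) t.join();

    // Submit all recorded lists in one batch.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    return 0;
}
```

That redistribution of CPU work is exactly what a weak-per-core, many-core chip like the FX-8350 needs, which is why these early DirectX 12 numbers matter so much for AMD.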




What if I told you an R9 290X was competing with a GTX 980 Ti?

So, I'm browsing the internet trying to find out whether I should save some cash and go with the i5-6600K, or go all out with an i7-6700K. I already know that there won't be much of a discernible difference in current games, but DirectX 12 games are on the horizon, I just got a beta invite to one such game, and there could be some performance to gain from hyper-threading. I didn't find the info I wanted, but I did find something astonishing. The GTX 980 Ti seems almost untouchable, so you can imagine my shock when I saw an R9 290X tying, and even beating, the Maxwell behemoth in several benchmarks. Yesterday, I did a pretty heavy write-up on some Ashes of the Singularity benchmarks that surfaced about a week ago. It turns out those weren't the only ones done. Ars Technica decided to do a very comprehensive set of tests comparing an old-school R9 290X with a very state-of-the-art GTX 980 Ti, and they pretty much showed the card matching the NVIDIA flagship at every turn. The 980 Ti still destroys AMD's part in DirectX 11, but once we get to 12 we see a super competitive landscape. Here are a couple of patterns I noticed. The GTX card benefits slightly more from 6 cores and hyper-threading than the R9 card does at higher resolutions. The NVIDIA card also has a slight advantage in average framerates during the heavy scenes. An interesting thing that was happening was that once hyper-threading was disabled and the CPU was reduce...





