Posts Tagged ‘nvidia’
In My Humble Opinion: I see a fair share of collective groans whenever Nvidia announces a proprietary feature. The RTX series of cards introduced DLSS as well as ray tracing. Even though the company has been incredibly excited about the new features, many consumers and reviewers claim they aren't a big deal. I get it. PhysX was meant to bring immersive physics effects to the action, yet I noticed heavy gaming lag whenever tons of particles were flying. Why should ray tracing be any different?
Nvidia is expected to release 4K, 144 Hz monitors with G-Sync and HDR, sold under the Asus and Acer brands. The combination of these specs sounds like a major boon for competitive online gaming, but it will be interesting to see whether reviewers confirm that.
The Nvidia GeForce Partner Program makes a good point about fair branding for companies that work hard on designing products, in this case their GPUs. On the flip side, it can really lock manufacturers out of the freedom to use AMD GPUs in their gaming brands, limiting consumer choice.
Most gamers, and even developers, avoid CrossFireX or SLI like the plague. That's why it's a huge surprise to see a game even adopt multi-GPU configurations, let alone recommend them. Far Cry 5 will likely shape up to be an intensive game at 4K resolution.
With Raja starting to work at Intel, it may sound like the company is going to improve its graphics exponentially. On the other hand, juggling both iGPUs and dGPUs can add more work to game development, driver integration, and component compatibility. Time will tell what benefit this brings to the industry.
We all want to see a new graphics card, but the announcements Nvidia made should not be overlooked. The BFGDs, or Big Format Gaming Displays, support 4K at 120 Hz, a big new step. The other huge news is being able to play games at high settings on low-end computers using GeForce Now.
If I didn't believe in an alternate reality before, I certainly do now...and we're living in it! The real world would never allow Intel and AMD Radeon to join forces and develop a CPU and dGPU solution. Yet here we are. The good news is that this is a great step for low power gaming.
With great performance gains yet again and the adoption of HBM2, the Titan V is set to be the fastest GPU in the graphics industry. Nvidia continues their longstanding history of great gaming and compute performance, as well as amazing energy efficiency.
PureOC will always recommend building your own computer, but we understand that isn't for everyone. iBuyPower does an excellent job selling pre-built custom PCs and we really enjoyed getting a chance to review the Element. Besides, we couldn't say no to the chance to try an Intel i7-7900X and Nvidia GTX 1080 Ti!
AMD's Threadripper CPU has done some crazy things. For the first time in a very long time, an AMD chip showed up at the top of the PassMark CPU chart. The other big first? Nvidia complimented AMD on their release! What is happening in the world!?!
Bringing easy liquid cooling to a graphics card is a great idea. The NZXT G12 bracket isn't quite as easy as installing a CLC on a CPU, but it's certainly easier than building a custom cooling loop. The long, flat design of GPU air coolers hinders their performance, which is why the NZXT Kraken G12 can deliver such a dramatic improvement.
Nvidia started by making an excellent-performing GPU in the GTX 1060. MSI further improves the graphics card by adding the excellent cooling performance of the Twin Frozr. If you want a great gaming card that can handle 1080p, the Gaming X is a great choice. No SLI though! qq
It's easy to point the finger, but I doubt Nvidia was deliberately trying to fake HDR. Even though the monitor technology looks great, it takes a lot of support to implement properly. Firmware, driver, and software issues could all factor in, and they're the likely culprits, since HDR has looked brilliant in the past.
The debate of AMD vs Intel vs Nvidia is an age-old one that will probably never cease. As long as there’s innovation in the PC market, there’s bound to be a new top dog in the fight. Intel still has the IPC lead in processing and Nvidia is killing it in performance and energy efficiency with their GPUs. So why, when given a sizeable budget and asked by my friend to build a PC, would I go all out with an AMD base? Check out the video for the full answer, because the Red side seems to be offering one of the best, non-compromised experiences we’ve ever seen from them. As always, remember that experiences vary, and feel free to add your thoughts to the equation by heading over to the PureOC forums!
I don’t publish on individual GPUs much these days. It’s not because I don’t want to; it’s just that there are a ton of them and my time is pretty limited. That doesn’t mean I won’t speak up about something that blows my mind a bit. Galax just unveiled the Hall of Fame Edition of their GTX 1080 Ti design, and the beast has a 16-phase VRM! That doesn’t include the additional 3 phases for the memory. It takes a lot of power to feed 19 phases, so it should come as no surprise that the card has three 8-pin power connectors. While the bracket looks to take only two slots, the fans stick out far enough to make it a 3-slot card. On the other hand, can you imagine what kind of overclock you could get if you put the GPU under a water block? As long as you don’t hit the silicon ceiling first, I imagine liquid cooling would give you a lot of headroom. Check out the article link for more details, as well as the gallery for a bit of a closer look. I didn’t see a retail price for this beast, but I’m pretty sure it will cost your next unborn child if you want to purchase one. Still, Galax looks like they’re doing a great job here, and it’s nice to cover a name that we don’t see in the headlines all the time! http://wccftech.com/galax-geforce-gtx-1080-ti-hof-unveiled/ (Caution: Galax has creepy clown faces on their site. Wha….Hu….WHY!?!) http://www.galax.com/en/