
Nvidia GeForce GTX 1080 Ti Review – 4K Gaming Powerhouse

The all-new Nvidia GeForce GTX 1080 Ti is based on the Pascal architecture. Armed with 11 GB of GDDR5X graphics memory and the new GP102-350 GPU, we are certainly going to break some records today. It has been eight months since Nvidia released the first GP102-based product, the Titan X: a massively impressive graphics card that the product we review today resembles on a lot of levels. Really, the 1080 Ti is the Titan X with one GB of GDDR5X memory less, and the one ROP partition tied to it.

So let me break it down swift and fast: the new high-end GTX 1080 Ti features 3,584 CUDA cores, 224 texture units, a 352-bit memory controller and 11 GB of faster (11 Gbps) GDDR5X memory. The card has the same GP102 GPU as the Titan X (Pascal), but for the GTX 1080 Ti it was slightly reconfigured. Most interesting is the 352-bit wide GDDR5X memory interface; this was not expected. It translates to 11 memory chips on the card, running at 11 GHz (GDDR5X-effective), for a memory bandwidth of 484 GB/s. That change also brings the ROP count down to 88 (from 96 on the Titan X), with a TMU count of 224. A reference Ti card will boost up to roughly 1,600 MHz, but the overclocking potential (boost frequency) is much like all GeForce Pascal cards: in the 2 GHz range.
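As a quick sanity check on that 484 GB/s figure, here is a minimal back-of-the-envelope sketch (the function name is ours, purely illustrative):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in GT/s
def memory_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

# GTX 1080 Ti: 352-bit bus at 11 Gbps effective
print(memory_bandwidth_gbs(352, 11.0))  # 484.0 GB/s
# Titan X (Pascal): 384-bit bus at 10 Gbps effective
print(memory_bandwidth_gbs(384, 10.0))  # 480.0 GB/s
```

Despite the narrower bus, the faster 11 Gbps chips push the Ti slightly past the Titan X.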

 

                | Nvidia GTX 1080 Ti | Nvidia Titan X | GeForce GTX 1080 | GeForce GTX Titan X
Architecture    | Pascal             | Pascal         | Pascal           | Maxwell
GPU             | GP102-350          | GP102-400      | GP104-400        | GM200
Fab             | 16nm FinFET        | 16nm FinFET    | 16nm FinFET      | 28nm
Shader procs    | 3584               | 3584           | 2560             | 3072
Base            | 1480 MHz           | 1417 MHz       | 1607 MHz         | 1000 MHz
Boost           | 1582 MHz           | 1531 MHz       | 1733 MHz         | 1075 MHz
Perf            | 11.5 TFLOPS        | 11 TFLOPS      | 8.87 TFLOPS      | 6.6 TFLOPS
Mem             | 11 GB GDDR5X       | 12 GB GDDR5X   | 8 GB GDDR5X      | 12 GB GDDR5
Mem freq        | 11000 MHz          | 10000 MHz      | 10000 MHz        | 7000 MHz
Mem bus         | 352-bit            | 384-bit        | 256-bit          | 384-bit
Mem bandw       | 484 GB/s           | 480 GB/s       | 320 GB/s         | 336.5 GB/s
TDP             | 250 W              | 250 W          | 180 W            | 250 W

As such, if you are a 1080p gamer, this card might be a bit out of your comfort zone; it isn't going to make much sense, as at 1920×1080 you'll be limited by your processor. Yes, even the fastest 8- and 10-core processors will not be able to keep up properly. The gamers that can afford this product will need to focus on at least Wide Quad HD (2560×1440) or Ultra HD gaming. Below that resolution, honestly, go look at a GTX 1070 or 1080. The Pascal GP102 is fabbed on the smaller 16nm FinFET node, and that process works out really well for Nvidia: the 1050, 1060, 1070 and 1080 have been a high-clocked success story ever since their launch.

Much like the 1080, you'll again spot high clocks and a very nice memory configuration (11 GHz effective!); this is a product series that will be massively interesting, but surely expensive. The GeForce GTX 1080 Ti receives a similar-looking design to the 1070/1080 Founders Edition coolers, with some aesthetic tweaks of course. The Pascal-based unit is a bit of a beast alright: tied to a 7+2 phase (2x dual-FET) power delivery, the GPU die size is 471 mm². If you look at the wider product stack, the GeForce GTX 1080 has 2,560 shader processors, the GeForce GTX 1070 has 1,920 and the GeForce GTX 1060 has 1,280 of them. The GeForce GTX 1080 Ti has 3,584 shader processors active inside that GP102 GPU; I say active deliberately, as it still isn't a fully enabled GPU. This means it has 28 streaming multiprocessors (SMs) active, at 128 shader cores (2×64) per SM. The card is equipped with fast GDDR5X memory for this 11 GB model, tied to a 352-bit wide bus locked in at 11 GHz (GDDR5X-effective). The combination of that memory type and clock frequency gives the 1080 Ti an effective memory bandwidth of 484 GB/s. But let's compare some numbers a bit in order to realize what the product can do:

  • GeForce GTX 1080 Ti offers 11.5 TFLOP/s Single-precision floating point performance
  • Nvidia Titan X (Pascal GP102) offers just over 11 TFLOP/s Single-precision floating point performance
  • GeForce Titan X (Maxwell GM200) offers just over 7 TFLOP/s Single-precision floating point performance
  • GeForce GTX 1080 offers just over 9 TFLOP/s Single-precision floating point performance
  • GeForce GTX 1070 offers just over 6 TFLOP/s Single-precision floating point performance
  • GeForce GTX 1060 offers just over 4 TFLOP/s Single-precision floating point performance
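Those single-precision figures follow directly from the shader count and clock, assuming two FMA operations per core per cycle; a quick sketch:

```python
# Peak FP32 TFLOPS = CUDA cores * 2 ops/cycle (FMA) * clock in GHz / 1000
def peak_fp32_tflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz / 1000

print(round(peak_fp32_tflops(3584, 1.582), 2))  # 11.34 - 1080 Ti at reference boost
print(round(peak_fp32_tflops(2560, 1.733), 2))  # 8.87  - GTX 1080 at reference boost
```

Note that the quoted 11.5 TFLOPS figure assumes a typical boost clock slightly above the advertised 1,582 MHz, which the card does reach in practice.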
If you combine the specs you will get a bit dizzy, I guess, but considering we'll be looking at the product from a gaming point of view, I can say this card will run awesomely in the Ultra HD domain with titles like GTA V, Resident Evil 7, Battlefield 1, The Division and many other hip 'n trendy game titles. We'll also look at Gears of War 4, Dishonored 2, Doom, Watch Dogs 2 and Sniper Elite 4.
Over the next pages, we'll go a little deeper into the technology and architecture, but not too deep as we have a lot to cover. The graphics card is fitted with a powerful dual-slot, single-fan cooler. The temperature of this card will sit at the 80 degrees C marker under full load, and it is reasonably quiet. The new GTX 1080 Ti runs up to a boost clock of 1,582 MHz (1,480 MHz base clock). The memory is reference-clocked at 11 Gbps. The card is fitted with both a 6- and an 8-pin power connector and yes, we'll overclock today as well. Let's get this review started, but not before you have had a chance to actually look at the card of course.

Product Showcase:

So as you can see, the GeForce GTX 1080 Ti looks quite familiar! It looks much like the previous GeForce GTX 10 series models, as a similarly fashioned cooler is being used. This round, however, there are subtle changes in aesthetics, with the cooler shell showing triangles (the base unit for rendering polygons/objects/scenes). There's an LED in the top side of the fan housing (controlled with GeForce Experience) and the GeForce GTX logo is lit up.


The card once again is your standard dual-slot solution, and its cooling is vapor-chamber based. The cooler's Plexiglas lets you actually look into the heatsink's aluminum fins. The fan has a special design; its airflow is carefully directed to take in air from the PC and exhaust it outside the PC, in order to optimize cooling efficiency while minimizing noise-causing restrictions. The card has a maximum power design of roughly 250 Watts and yes, these are made to overclock as well. As such, Nvidia is using one 8-pin and one 6-pin PEG (PCI Express Graphics) connector (150+75 Watts). Another 75 Watts can be delivered through the PCIe slot and thus the motherboard.
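The power delivery budget from those connectors adds up comfortably above the rated TDP; a trivial sketch of the arithmetic:

```python
# Maximum board power available vs. the rated TDP (all values in Watts)
pcie_slot, six_pin, eight_pin = 75, 75, 150
available = pcie_slot + six_pin + eight_pin
tdp = 250
print(available, available - tdp)  # 300 W available, 50 W of headroom
```

That 50 W of connector headroom is what makes the 120% power target (covered later in this review) possible within spec.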

 

As you can see, there is a two-part back-plate. Opinions on back-plates differ per person. Of course they protect the backside of the PCB and its components, but back-plates can also easily trap heat, and this design seems to do just that; hence I am not a fan of it. I would have liked to have seen some meshes and airflow vents. Then again, back-plates are often added for aesthetic reasons of course.

 

To improve the performance of the GPU cooler, Nvidia engineers have designed a new high-airflow thermal solution that provides 2x the airflow area of the GeForce GTX 1080. To accomplish this, the DVI connector that has traditionally been placed above the DisplayPort and HDMI connections on the card bracket has been removed. Instead, this area is used to provide a larger exhaust for hot air to be expelled from the GPU. The card thus offers four display connectors: three DisplayPort connectors and one full-size HDMI 2.0b connector. Nvidia is including a DP-to-DVI converter, should your monitor need a DVI connection. DisplayPort is 1.2 certified and DP 1.3/1.4 ready, enabling support for 4K displays at 120 Hz, 5K displays at 60 Hz, and 8K displays at 60 Hz (using two cables). Up to four display heads can be driven simultaneously from one card. The GTX 1080 Ti display pipeline supports HDR gaming, as well as video encoding and decoding. New to Pascal is HDR Video (4K@60 10/12b HEVC decode), HDR Record/Stream (4K@60 10b HEVC encode), and HDR Interface Support (DP 1.4).

 

The 1080 Ti takes advantage of the Pascal 16nm FinFET architecture, and with 12 billion transistors, 3,584 shader/stream cores and 11 GB of GDDR5X, it's a rather impressive product. In Ultra HD it can advance 15%, maybe even 30%, in performance over the GeForce GTX 1080, as we learned.

 

 

The GPU empowering the product is called the GP102-350-A1, which is Pascal architecture based. It has a nice 3,584 CUDA cores, while texture filtering is performed by 224 texture units. It has a base clock frequency of 1,480 MHz and performs texture filtering at roughly 332 Gigatexels/sec.
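That 332 Gigatexels/sec figure is simply the texture unit count multiplied by the base clock; a sketch:

```python
# Texture fill-rate = texture units * clock (MHz) / 1000 -> GTexels/s
def texture_fillrate_gtexels(tmus, clock_mhz):
    return tmus * clock_mhz / 1000

print(texture_fillrate_gtexels(224, 1480))  # 331.52, quoted as ~332 GTexels/s
```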

 

 

The card has a 250 Watt TDP; 75 Watts can be delivered through the PCIe slot, and another 150+75 Watts through the 8- and 6-pin PEG (PCI Express Graphics) power connectors. That leaves plenty spare for a nice tweak. The Pascal display engine is capable of supporting the latest high-resolution displays, including 4K and 5K screens. And with HDMI 2.0b support, the Nvidia GeForce GTX 1080 Ti can be used by gamers who want to game on the newest state-of-the-art big-screen TVs.

 

The card is 10.5 inches in length, which is roughly 27 cm, so it should fit comfortably in pretty much any decent chassis. The cooler of the 1080 Ti features the familiar vapor-chamber cooling. On the inside you'll spot a copper vapor chamber, which is able to draw more heat off the GPU and the components on the PCB, including memory and VRM, ultimately allowing the GPU to run cooler and thus boost to higher clock speeds.

 

This vapor chamber is combined with a large, dual-slot aluminum heatsink to dissipate heat off the chip. A blower-style fan then exhausts this hot air through the exit at the back of the graphics card.

 


A 7+2 phase power supply is responsible for feeding the GP102 GPU, actually a 7-phase 2x dual-FET design that's capable of supplying up to 250 amps to the GPU. The two additional power phases are dedicated to the board's GDDR5X memory.

You will have some extra power allowance; the reference/Founders board design supplies the GPU with 250 Watts at the maximum power target setting of 120%. The board uses polarized capacitors (POSCAPs) to minimize unwanted board noise, as well as molded inductors.

The New Pascal Based GPUs

The GeForce GTX 10 series graphics cards are based on the latest iteration of GPU architecture, called Pascal (named after the famous mathematician Blaise Pascal); the GeForce GTX 1080 Ti uses revision A1 of the GP102-350.

  • Pascal Architecture – The Nvidia Pascal architecture is the most powerful GPU design ever built. Comprising 12 billion transistors and 3,584 single-precision CUDA cores, this is the world’s fastest consumer GPU.
  • 16nm FinFET – The GP102 GPU is fabricated using a 16nm FinFET manufacturing process that allows the chip to be built with more transistors, ultimately enabling new GPU features, higher performance, and improved power efficiency.
  • GDDR5X Memory – GDDR5X provides a significant memory bandwidth improvement over the GDDR5 memory used previously in Nvidia’s flagship GeForce GTX GPUs. Running at a data rate of 11 Gbps, the 1080 Ti’s 352-bit memory interface provides far more memory bandwidth than Nvidia’s earlier GeForce GTX 980. Combined with architectural improvements in memory compression, the total effective memory bandwidth increase compared to the GTX 980 is 1.8x.

The rectangular die of the GP102 measures close to 471 mm² in a BGA package and houses a transistor count of roughly 12 billion. Pascal GPUs are fabbed by the Taiwan Semiconductor Manufacturing Company (TSMC) on the 16nm node.

NVIDIA GeForce GTX 1080 Ti

Alright, we are stepping back to reference material for a second here. The GeForce GTX 1080 Ti gets a shader processor count of 3,584. This product is pretty slick, as it can manage really high clock frequencies whilst sticking to a 250 Watt TDP. The GeForce GTX 1080 Ti comes fitted with fast 11 Gbps GDDR5X memory and, sure, a weird 11 GB of it. The reference cards have a base clock of 1.48 GHz with a boost clock of 1.58 GHz.

Reference             | GeForce GTX 1080 Ti | Titan X (2016) | GTX 1080     | GTX 1070     | GTX 1060     | GTX 980 Ti
GPU                   | GP102-350-A1        | GP102-400-A1   | GP104-400-A1 | GP104-200-A1 | GP106-400-A1 | GM200
Architecture          | Pascal              | Pascal         | Pascal       | Pascal       | Pascal       | Maxwell
Transistor count      | 12 Billion          | 12 Billion     | 7.2 Billion  | 7.2 Billion  | 4.4 Billion  | 8 Billion
Fabrication node      | TSMC 16 nm          | TSMC 16 nm     | TSMC 16 nm   | TSMC 16 nm   | TSMC 16 nm   | TSMC 28 nm
CUDA cores            | 3,584               | 3,584          | 2,560        | 1,920        | 1,280        | 2,816
SMs                   | 28                  | 28             | 20           | 15           | 10           | 22
ROPs                  | 88                  | 96             | 64           | 64           | 48           | 96
GPU base clock        | 1,480 MHz           | 1,417 MHz      | 1,607 MHz    | 1,506 MHz    | 1,506 MHz    | 1,002 MHz
GPU boost clock       | 1,582 MHz           | 1,531 MHz      | 1,733 MHz    | 1,683 MHz    | 1,709 MHz    | 1,076 MHz
Memory clock          | 2,752 MHz           | 2,500 MHz      | 2,500 MHz    | 2,000 MHz    | 2,000 MHz    | 1,753 MHz
Memory size           | 11 GB               | 12 GB          | 8 GB         | 8 GB         | 3 GB / 6 GB  | 6 GB
Memory bus            | 352-bit             | 384-bit        | 256-bit      | 256-bit      | 192-bit      | 384-bit
Memory bandwidth      | 484 GB/s            | 480 GB/s       | 320 GB/s     | 256 GB/s     | 192 GB/s     | 337 GB/s
FP32 performance      | 11.5 TFLOPS         | 11.0 TFLOPS    | 9.0 TFLOPS   | 6.45 TFLOPS  | 4.61 TFLOPS  | 6.4 TFLOPS
GPU thermal threshold | 91 °C               | 91 °C          | 94 °C        | 94 °C        | 94 °C        | 91 °C
TDP                   | 250 W               | 250 W          | 180 W        | 150 W        | 120 W        | 250 W
Launch MSRP (ref)     | $699                | $1,200         | $599         | $379/$449    | $249/$299    | $699

The reference 1080 Ti is capable of 11.5 TFLOP/s of single-precision performance. To compare a little: a reference GeForce GTX 980 pushes 4.6 TFLOPS, a 980 Ti pushes 6.4 TFLOPS, and a GTX 1060 does 4.6 TFLOP/s. The shader count is among the biggest differentiators, together with the ROP and TMU counts and the memory tied to them. The product is obviously PCI-Express 3.0 compatible and has a max TDP of around 250 Watts, with a typical idle power draw of 5 to 10 Watts. That TDP is an overall maximum; on average your GPU will not consume that amount of power, so during gaming the average will be lower. The Founders Edition cards run cool and silent enough. Pascal GPUs are fabbed by the Taiwan Semiconductor Manufacturing Company (TSMC) at 16nm FinFET.

You will have noticed the two memory types used across the 1050/1060/1070/1080/1080 Ti and Titan X range already, which can be a bit confusing. Another interesting development: slowly but steadily, graphics card manufacturers want to move to HBM, stacked High Bandwidth Memory that they can place on-package (close to the GPU die). HBM revision 1, however, is limited to four stacks of 1 GB; if used, you'd only see 4 GB graphics cards. HBM2 can go to 8 GB and 16 GB, but that production process is just not yet ready and/or affordable enough for volume production. With HBM2 being expensive and in limited supply, it's simply not the right time to make the move. Big Pascal, whenever it releases to the consumer in, say, some sort of Titan or Ti edition, will get HBM2 memory, 16 GB of it separated over 4 stacks; but we do not see Big Pascal with HBM2 launching any sooner than Q3 of 2017. So with HBM/HBM2 out of the running, basically two solutions are left: go with traditional GDDR5 memory, or make use of GDDR5X, let's call that turbo GDDR5.

  • Nvidia in fact opted for both, the GeForce GTX 1060 and 1070 are to be fitted with your “regular” GDDR5 memory.
  • The GeForce GTX 1080, 1080 Ti and Titan X have a little extra bite in bandwidth as they will be fitted with Micron’s all new GDDR5X memory.

So, the GeForce GTX 1080 Ti is tied to 11 GB of GDDR5X DRAM. You can look at GDDR5X memory chips as your normal GDDR5 memory; however, as opposed to delivering 32 bytes per access to the memory cells, this is doubled up to 64 bytes per access. That, in theory, can double up graphics card memory bandwidth, and Pascal certainly likes large quantities of memory bandwidth to do its thing. Nvidia states it to be 352-bit GDDR5X @ 11 Gbps (an effective data rate).
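Another way to look at that 484 GB/s: each of the 11 memory chips sits on its own 32-bit channel. A sketch, assuming the standard 32-bit-per-chip GDDR5X layout:

```python
# 11 GDDR5X chips, each on a 32-bit channel at 11 GT/s effective
chips, bits_per_chip, data_rate_gtps = 11, 32, 11.0
per_chip_gbs = bits_per_chip / 8 * data_rate_gtps  # 44.0 GB/s per chip
total_gbs = chips * per_chip_gbs                   # 484.0 GB/s over a 352-bit bus
print(per_chip_gbs, total_gbs)
```

Drop one chip from a Titan X-style 12-chip configuration and you land exactly on the 1080 Ti's 352-bit bus and odd 11 GB capacity.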

Display Connectivity

Nvidia’s Pascal generation products received a nice upgrade in terms of monitor connectivity. The cards get three DisplayPort connectors, one HDMI connector and a DVI connector (on the 1080 Ti Founders Edition the DVI port makes way for a larger exhaust, with a DP-to-DVI adapter included instead). The days of ultra-high-resolution displays are here, and Nvidia is adapting to it. The HDMI connector is HDMI revision 2.0b, which enables:

  • Transmission of High Dynamic Range (HDR) video
  • Bandwidth up to 18 Gbps
  • 4K@50/60 (2160p), which is 4 times the clarity of 1080p/60 video resolution
  • Up to 32 audio channels for a multi-dimensional immersive audio experience

DisplayPort wise compatibility has shifted upwards to DP 1.4 which provides 8.1 Gbps of bandwidth per lane and offers better color support using Display Stream Compression (DSC), a “visually lossless” form of compression that VESA says “enables up to 3:1 compression ratio.” DisplayPort 1.4 can drive 60 Hz 8K displays and 120 Hz 4K displays with HDR “deep color.” DP 1.4 also supports:

  • Forward Error Correction: FEC, which overlays the DSC 1.2 transport, addresses the transport error resiliency needed for compressed video transport to external displays.
  • HDR meta transport: HDR meta transport uses the “secondary data packet” transport inherent in the DisplayPort standard to provide support for the current CTA 861.3 standard, which is useful for DP to HDMI 2.0a protocol conversion, among other examples. It also offers a flexible metadata packet transport to support future dynamic HDR standards.
  • Expanded audio transport: This spec extension covers capabilities such as 32 audio channels, 1536kHz sample rate, and inclusion of all known audio formats.
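To see why 8K at 60 Hz needs DSC at all, compare the DP 1.4 link budget with the uncompressed pixel rate. A rough sketch that ignores blanking intervals:

```python
# DP 1.4 (HBR3): 4 lanes * 8.1 Gbps, minus 8b/10b line-code overhead
lanes, per_lane_gbps = 4, 8.1
payload_gbps = lanes * per_lane_gbps * 8 / 10      # ~25.92 Gbps usable

# Uncompressed 8K60 at 8 bits per color (24 bpp), blanking ignored
uncompressed_gbps = 7680 * 4320 * 60 * 24 / 1e9    # ~47.8 Gbps: does not fit
with_dsc_gbps = uncompressed_gbps / 3              # ~15.9 Gbps at 3:1 DSC: fits
print(payload_gbps, uncompressed_gbps, with_dsc_gbps)
```

Roughly 48 Gbps of raw pixels against ~26 Gbps of usable link is why the 3:1 "visually lossless" compression is the enabler here.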

High Dynamic Range (HDR) Display Compatibility

Nvidia can now fully support HDR and deep color all the way. HDR is becoming a big thing, especially for the movie aficionados. Think better pixels, a wider color space, more contrast and more interesting content on that screen of yours. We've seen some demos on HDR screens, and it is pretty darn impressive to be honest. This year you will see the first HDR-compatible Ultra HD TVs, and next year likely monitors and games supporting it properly. HDR is the buzzword of 2016, and with Ultra HD Blu-ray released in Q1 2016 it will be a much welcomed feature.

High-dynamic-range rendering (HDRR or HDR rendering), also known as high-dynamic-range lighting, is the rendering of computer graphics scenes using lighting calculations done in a larger dynamic range. This allows the preservation of details that would otherwise be lost due to limiting contrast ratios. Video games, computer-generated movies and special effects benefit from this, as it creates more realistic scenes than simpler lighting models can. With HDR you should remember three things: bright things can be really bright, dark things can be really dark, and details can be seen in both.

High dynamic range reproduces a greater range of luminosity than is possible with standard digital imaging. We measure this in nits, and the number of nits for UHD screens and monitors is going up. What's a nit? Candle brightness measured over one meter is 1 nit; the sun is about 1.6 billion nits; typical objects and current PC displays sit at 1 to 250 nits; excellent HDTVs reach 350 to 400 nits. An HDR OLED screen is capable of 500 nits, and this is where it gets more important: new screens in 2016 will go to 1,000 nits. HDR allows these high nit values to actually be used. We think HDR will be implemented for PC gaming in 2016; Hollywood already has end-to-end content ready, of course.
As consumers start to demand higher-quality monitors, HDR technology is emerging to set an excitingly high bar for overall display quality. HDR panels are characterized by:

  • Brightness between 600 and 1,200 cd/m² of luminance, with an industry goal of reaching 2,000
  • Contrast ratios that closely mirror human visual sensitivity to contrast (SMPTE 2084)
  • The Rec. 2020 color gamut, which can produce over 1 billion colors at 10 bits per color

HDR can represent a greater range of luminance levels than more "traditional" methods can, covering real-world scenes that contain anything from very bright, direct sunlight to extreme shade or very faint nebulae. HDR displays can be designed with the deep black depth of OLED (black is zero, the pixel is disabled) or the vivid brightness of local-dimming LCD. Meanwhile, if you cannot wait to play games in HDR and did purchase an HDR HDTV this year, you could stream it: an HDR game rendered on your PC with a Pascal GPU can be streamed to your Nvidia Shield Android TV and passed over HDMI to that HDR telly, as Pascal supports 10-bit HEVC HDR encoding and the Shield Android TV can decode it. Hey, just sayin'. A selection of Ultra HDTVs is already available, and consumer monitors are expected to reach the market in late 2016 and 2017. Such displays will offer unrivaled color accuracy, saturation, brightness and black depth; in short, they will come very close to simulating the real world.

Conclusion

Um, yeah... it's fast! It has been an interesting half year alright. The Titan X has been dominating the charts, and everybody expected Vega from team red as well, but that release is still pending. Meanwhile, Nvidia was ready with the 1080 Ti, and I do believe that they did not want to delay the launch any longer (to wait and adapt their performance configuration to Vega). It is now March 2017 and Nvidia figured to just let the beast out, and it is as good as we expected it to be. But yes, that 11 GB is weird. Totally.

It is my belief that Nvidia wanted to keep the price down and traded 1 GB of memory for 10% faster memory. Then they tweaked this SKU a tiny bit on base and boost clock frequencies and there you have it: a product hovering at the Titan X performance level, sometimes a notch faster but mostly really close to it. The irony here is that today we tested the reference model, historically the slowest version. You can rest assured that board partner models are going to be clocked in the 1,600 MHz base clock domain with boost frequencies in the 1,800~1,900 MHz range. So yes, there is another 10, maybe 20% of performance left in the 1080 Ti, which the board partners will completely tailor-fit to your preference. Next to that, the coolers from AIB partners will (we assume) be better as well. The current reference model is throttling; it will quickly reach 84 degrees C, after which the GPU starts to downclock a bit. If the AIB partners can keep the 1080 Ti 10 degrees C lower, then yes, there will be some performance gains to be made there as well. But even with a bit of throttling, the GeForce GTX 1080 Ti FE is a beast. Heck, this card sooo belongs in a gamer's rig; the ones that game at Ultra HD will see tremendous performance and a thrilling experience!

Aesthetics

For the GeForce GTX 1080 Ti, Nvidia tweaked the aesthetics of the cooler to bring it a bit more in line with the other GeForce GTX 10 series cards. So I think everybody will agree with me: it's just a normal-looking Nvidia reference product. Some will dislike the fact that it has a completely closed back-plate though (myself included). I remain skeptical about back-plates; they can trap heat and thus warm up the PCB. The flip side is that they do look better and can protect your PCB and components from damage. The overall looks are fine though.

Cooling & Noise Levels

The cooling itself really is at the same level it was; you can't really complain about it, but it's not hugely impressive either. A bit of a caveat remains the temperature targets that Nvidia is using. The default setting for the 1080 Ti is ~80 degrees C, meaning the card is allowed to run at roughly 80 degrees C before ramping up the fan RPM or clocking down. Nvidia feels this is a nice balance between performance, power consumption and temperatures. The downside: above that 80 degrees C the GPU will adapt and throttle down to meet the temperature target, and this does happen with this card. The boost frequency, however, remains above the advertised value; we've seen it in the 1,700 MHz range while throttling. You can obviously change the temperature target or the fan RPM yourself, but that will ramp up the noise levels badly. At roughly 80 degrees C the noise levels are okay, yet not silent. At idle you can barely hear the cooling solution (the fan does not turn off though) and under stress, well, you can hear airflow. That 80~84 degrees C range across the board remains acceptable. The throttling does not have a huge impact, but with better cooling the card could be a notch faster alright. So the cold hard fact remains that this cooler is holding back the GPU a bit. The board partners will love to sink their teeth into this though.

 

 

Power Consumption

The GP102-350-A1 Pascal GPU is rated at a 250 Watt TDP; our measurements show numbers slightly above that, but keep in mind that this is a peak maximum value under full stress. At this performance level you are looking at 400~500 Watts for the PC in total, which is okay. We think a 600~650 Watt PSU would be sufficient and, if you go with 2-way SLI, an 800~900 Watt power supply is recommended. Remember, when purchasing a PSU, aim to double up in wattage, as your PSU is most efficient when it is under 50% load. Here again, keep in mind that we measure peak power consumption; the average power consumption is a good notch lower, depending on GPU utilization. Also, if you plan to overclock the CPU, memory and/or GPU with added voltage, please do purchase a power supply with enough reserve. People often underestimate it, but if you tweak all three aforementioned variables, you can easily add 200 Watts to your peak power consumption budget.
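That PSU advice can be turned into a simple rule of thumb. The allowances below are our own rough, hypothetical numbers (not measured figures): the card's TDP plus a generous budget for CPU, board, drives and fans, with headroom so the PSU sits well below full load at peak.

```python
# Rough PSU sizing: GPU TDP + an allowance for the rest of the system, then
# headroom so the PSU runs at an assumed ~70% load under peak draw
def psu_recommendation_watts(gpu_peak=250, rest_of_system=200, target_load=0.7):
    peak = gpu_peak + rest_of_system
    return round(peak / target_load / 50) * 50  # rounded to the nearest 50 W

print(psu_recommendation_watts())  # 650 - in line with the 600~650 W advice
```

Add another 200 W to `gpu_peak`/`rest_of_system` if you plan a heavy CPU, memory and GPU overvolt, per the paragraph above.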

Gaming Performance

Now, I did not want to include 1080p results as I figured the CPU limitation would be horrendously bad; well heck, it's not that bad, to be honest. From 1080p to Ultra HD, the GeForce GTX 1080 Ti shows some serious numbers. But here's a paradox: the more difficult things get, the better the product performs. Ultra HD is its true domain. Much like a fine wine that ages well, this GeForce GTX 1080 Ti will last you a long time with future, more GPU-intensive games. This much performance and 11 GB of GDDR5X graphics memory help you out in Ultra HD, DSR, VR and hefty, complex anti-aliasing modes. That, and of course the latest gaming titles. I consider this a very viable single-GPU solution that allows you to game properly in Ultra HD with some very nice eye candy enabled. Driver-wise we can't complain at all; we did not stumble into any issues. Performance-wise, there's really not one game that won't run seriously well at the very best image quality settings. You should of course pair it with a nice Ultra HD monitor, or at least a 2560×1440 screen. Now, we could discuss the advantages of that 11 GB frame buffer, but hey, you can draw your own conclusions there, as performance isn't limited. And with 11 GB of it, you won't run out of graphics memory for years to come, right? So in that respect the card is rather future-proof.

No FCAT frame times this round; the DVI converter has issues with the FCAT setup. We'll post a follow-up on FCAT with an AIB card that comes with a proper DVI connector.

Overclocking

The boost modes can be configured with temperature targets relative to the maximum power draw and your GPU core frequency offsets. Saying that, I realize it sounds complicated, but you'll have things balanced out quite fast, as these products are easy to tweak. This GPU can take a rather hefty boost clock once tweaked (start with +150 MHz on the GPU and a 120% power limiter) and you'll see your games rendering in the 1,900~2,000 MHz domain. The memory you'll be able to get close to roughly 11.5~12.0 GHz effective. For a GPU with 12 billion transistors, you've gotta be at least a tiny bit impressed, right? With our tweak we averaged out at a ~2 GHz boost frequency, but we had to increase the fan RPM to prevent it from throttling down.

Concluding

The GeForce GTX 1080 Ti is the Titan X in disguise. Nvidia had to do something to it and decided to ditch 1 GB of memory, bringing the VRAM to a weird 11 GB; that means slightly fewer ROPs and a 352-bit memory bus as well. But then they use faster GDDR5X memory and slightly higher than Titan X clock frequencies, so the performance drop is immediately annihilated; in fact, the GeForce GTX 1080 Ti is as fast as, or sometimes even faster than, the Titan X. And yeah, it might cost a tremendous 699 USD / 820 Euro, but you surely are receiving an exorbitant amount of performance for that exorbitant amount of money. And that's just the hardcore reality for this initial reference model; gawd only knows what the factory tweaked and cooled AIB cards from MSI, ASUS, Gigabyte, ZOTAC, Palit and all the others will do. It could add another 10 to 20% performance for the higher binned SKUs, and that is bat**** crazy, yo!

Now let me bring you back to reality for a moment: if you are a 1080p or even 1440p gamer, you are probably, and economically, better off with a GeForce GTX 1070 or 1080, really. Those cards just make the most sense. But then again, you've seen the numbers; for Ultra HD gamers and even 2560×1440 gamers this product works out well, really well. Overall we are impressed by the GeForce GTX 1080 Ti, very much so. The cooler is fighting that massive GPU though; it will throttle a bit under massive loads, but that remains acceptable, and throttling stays above the advertised boost ranges. The Nvidia GeForce GTX 1080 Ti is available from Nvidia's online shop, and in two or three weeks we expect AIB partners with their customized models; that's where things will get really serious. Name your game, go WQHD or Ultra HD and play; this graphics card has tremendous performance capabilities. The price remains problematic for many, but this is Nvidia's fastest consumer graphics card to date.
If you can afford it, we can definitely recommend it. But my man, I cannot wait to see what the board partners are going to do with this graphics processor!


Article Source: Nvidia, guru3d
