Page 1 of 6
Results 1 to 10 of 52
  1. #1 - Silent Scone@ASUS (Administrator)

    Strix 1080Ti Overclocking Guide


    ASUS Strix 1080TI Overclocking Overview


    It's no secret that the NVIDIA Pascal architecture is monumentally fast. In the last three years alone, our expectations of real-time rendering have grown exponentially. With faster hardware available to the masses, game developers are free to explore vastly more resource-intensive visual effects. Of course, there are many cogs in the well-oiled machine that is the PC gaming industry, but without the hardware to drive it, none of it would be possible.

    Unlike the GTX 1080, the 1080Ti is based on big Pascal (GP102). Sporting the same 3,584 CUDA cores as the Titan X and an even higher boost clock, it renders that card obsolete, putting the Ti firmly in the middle of the stack, just below the final full-fat iteration, the Titan Xp (3,840 CUDA cores).
    In an attempt not to tread on the Titan Xp's toes, NVIDIA has clipped the Ti's memory bus, pairing it with 11GB of GDDR5X instead of the 12GB found on the latter. In terms of capacity, most of us familiar with modern demands will know both amounts are ample for gaming tasks, omitting one or two eccentric scenarios. The Titan brand is also an NVIDIA exclusive, meaning it does not benefit from the refined cooling and designs the AIB market usually offers.

    Let's up the ante

    As enthusiasts, it's in our nature to want to get the most out of the hardware, and maintaining a constant frame rate is undeniably the key. Advancements in display technology such as UHD and VR put the onus on ultra-low latency. That means pushing a high-end GPU is no longer something just for the diehard; it's become a necessity to wring out those extra frames and hit the target set by high-refresh displays. Paul Engemann's 'Push It to the Limit' should be playing in the background at this point for maximum effect.

    The ASUS 1080Ti ROG Strix comprises a custom PCB with a high-quality VRM (10+2 phases vs 7 on the Founders Edition) for clean, high-current power delivery, as well as two 8-pin power connectors to ensure ample supply. That's topped with a brand new 2.5-slot cooling design, which nets you 1,683MHz out of the box compared to the Founders Edition's 1,582MHz. Moreover, we can raise this further to 1,708MHz once in the card's OC mode. With this in hand, you're already off to a good start if fluidity is the name of the game.
    The enhanced cooler results in a more composed environment for the GPU cores. If you've lived with NVIDIA's vapour chamber cooler for any length of time, you'll understand just how appealing a capable thermal solution is when looking for an air-cooled card. In fact, the Strix's triple wing-blade fan design remains entirely passive until reaching 55°C.

    [Image: VRM.jpg]

    A quick perambulation

    Dubbed MaxContact technology, the all-new 2.5 slot heat spreader design has 40% more surface coverage than the previous DirectCU offering.

    On top sit three Wing-Blade fans with IP5X dust resistance for longer operational life.

    [Image: 885Z6946.JPG]

    ASUS FanConnect II allows us to connect two additional PWM or DC fans. These can either be controlled automatically in sync with the GPU fans, or manually via Fan Xpert. A welcome addition if you want to control chassis intake and exhaust fan speeds based on GPU temperature. The RGB header also makes a return, allowing us to sync lighting effects with other Aura Sync capable hardware.

    Unlike the Founders Edition, the Strix includes a dual-link DVI port for those who were disappointed by its absence from other models. Sitting alongside are two HDMI 2.0b connectors and three DisplayPort 1.2a outputs, giving us the capability to drive up to four displays.

    [Image: 885Z7019.JPG]

    Quiet cooling is one thing, but the concrete-alloy chokes also help to reduce other sources of noise. Speaking personally, I've had the card on an open chassis sitting right next to me, and compared to the Founders Edition the Strix is certainly less audible under load in this regard.

    NVIDIA's Green Light program imposes voltage limits that prevent us from subjecting our cards to voltages beyond what they deem safe for a long lifespan. Ultimately, this means that when you're looking for the best all-round solution, thermals and noise are the most important factors. When we breach the thermal target, our effective boost clock is reduced, and as a direct result so is performance.


    Room for improvement

    With introductions out of the way, the fun can start. After installing GPU Tweak II, we'll set out to see what's to be gained from overclocking.
    The GPU Tweak II UI is intuitive and easy to navigate. By default, the UI starts in Simple Mode. This shows the GPU speed increase over stock, GPU temperature, and video buffer (VRAM) usage. The conventional profiles are laid out here, too.

    [Image: SIMPLE UI screengrab.PNG]

    OC Mode: This mode takes the card to 1708MHz whilst increasing our power target to 110%. If wanting the most aggressive performance whilst jumping straight into games, this is the profile for you.

    Gaming Mode: This is the Strix default pre-set. The GPU clock speed will run at 1683MHz.

    Silent Mode: In this mode the GPU clock speed will run at 1658MHz, with a reduced power target of 90%.

    My Profile:
    A quick select for your own predefined profiles, something we will be exploring shortly.

    Game Booster: When performance is key, having anything unnecessary running in the background is frowned upon. This option allows us to automatically adjust Windows visual appearance for best performance, as well as turn off unnecessary Windows services that may be taking up resources.

    0dB Fan: When paired with the Strix or other compatible cards, enabling this allows the card to remain passive until it reaches 55°C.

    Info: Where you can find information about the 1080Ti as well as GPU Tweak II.

    Tools: From here we can install Aura Graphics, which gives us full control of the RGB lighting on the Strix. You can either run this independently, or install the regular Aura software to sync lighting with compatible hardware.

    Xsplit Gamecaster: Allows both streaming and recording through an intuitive interface which includes an FPS and VRAM overlay - something that GPU Tweak currently lacks (although Rivatuner is arguably still the application of choice for that purpose). In addition, it can change GPU Tweak profiles on-the-fly. Having used Xsplit for a week or so now, I have no complaints. Personally, I think it feels less intrusive than alternatives such as NVIDIA Share. However, the snag is that the free version limits recording to 720p and 30fps.


    Professional Mode: This is where the magic happens. Some of you may be familiar with the interface already and simply want to get down to the number crunching. However, those new to GPU Tweak II should read on.

    [Image: Stable Overclock Settings.PNG]

    Monitor: Here we can keep an eye on the state of things. For the first few hours, you might find you're looking to this window a great deal. By clicking the expand button, we can either discard items from the window we do not wish to use or rearrange them. Logging is also available if you wish to look back over any of these statistics.

    Profiles:
    Here we can select from the predefined profiles, or create and save our own after establishing an overclock.

    GPU Boost Clock: Sets the offset for our GPU boost clock. The maximum sustained clock will depend on our temperature and power settings. This means the frequency shown for our applied offset is almost certainly not what we will see under load. At stock, expect closer to 1950MHz on the Strix, far higher than the touted boost specs.

    GPU Voltage: On Pascal, voltage increase is expressed on a percentage scale across multiple voltage points. By default, the upper voltage points are locked. Once we increase the voltage offset, we unlock these upper points, giving us some additional headroom. Pascal does not respond particularly well to voltage in terms of obtainable clocks; however, once we increase core speeds, certain heavier scenarios may induce a voltage limit, meaning GPU Boost 3.0 has reached the maximum performance for the default or set voltage. For this reason, we are better off leaving this at 100%. The obligatory overclocking disclaimers apply here: although there are throttling and amperage safeguards in place, be aware that overclocking voids your warranty.

    Memory Clock: This controls the offset for the memory clock. Although performance gains are limited, the new GDDR5X IC can stretch its legs. My sample can achieve 6000MHz. Memory bandwidth can have more impact when using heavy anti-aliasing techniques and higher resolutions, however, best not to forget that core frequency is still king.

    Memory related instability will normally manifest in an application or system hang. Like with tuning other aspects of the system, it’s best to overclock one thing at a time to avoid any red herrings and confusion.
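    To keep the numbers straight, the offset entered in GPU Tweak II maps directly onto the effective data rate. A quick sketch of that arithmetic (11,010MHz being the 1080Ti's stock effective memory clock):

```python
# Map a GPU Tweak II memory offset to the resulting effective data rate.
STOCK_EFFECTIVE_MHZ = 11010  # 1080Ti stock effective memory clock

def effective_memory_clock(offset_mhz):
    """Effective GDDR5X data rate after applying an offset."""
    return STOCK_EFFECTIVE_MHZ + offset_mhz

print(effective_memory_clock(200))  # 11210 - the +200 offset suggested later in this guide
```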

    Fan Speed:
    Automatic or manual control of the Wing-Blade fans. The default profile keeps the GPU cool even when overclocking; I've yet to see the card exceed 70°C. That said, if running as cool as possible is the name of the game, we can set a fixed fan speed or adjust the curve. Noise tolerance is entirely subjective, and the Strix is no different: you must see what works best for you.

    Power Target: This allows us to increase the maximum power draw. Even if not looking to overclock the card, simply increasing the power target can net some additional performance. You will hit the power limit on the 1080Ti long before hitting a thermal one thanks to the Strix cooler - even when not overclocked. I'd recommend increasing this to 120% before starting.

    FPS Target: No real introduction needed. With this set, the card will limit itself to the desired frame rate. Doing so can reduce temperature, and as a direct result help GPU Boost 3.0 to maintain a higher boost clock.
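    A frame rate target translates directly into a per-frame time budget, which is also worth keeping in mind when reading the frame time numbers later in this thread:

```python
def frame_budget_ms(fps_target):
    """Time available to render each frame, in milliseconds."""
    return 1000.0 / fps_target

print(frame_budget_ms(100))  # 10.0ms per frame for a 100Hz panel
print(frame_budget_ms(60))   # roughly 16.7ms per frame
```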

    GPU Temp Target: This setting controls what temperature the card will maintain. Because of the Strix's huge cooling capacity, we can safely set the priority to the power target instead. With the Strix cooler, load temperatures generally fall between 50-65°C (with a 21°C ambient), which is well within the safe zone. Pascal also responds well to lower temperatures:



    With a quick adjustment of the fan curve, we can shave 10°C off GPU core temperatures. Better still, with a good water-cooling loop in tandem with an EK water block, we can keep things under 35°C. The benefits of doing so can be seen in the benchmarks above (temperatures taken from The Division benchmark).

    Finding the limits

    It should go without saying, but test the GPU at stock settings before you start overclocking, to make sure everything is in order and to gauge baseline performance. CPU and memory stability should also be verified if they are overclocked. If we're setting out to test the stability of our GPU, having other subsystems unstable can create confusion when diagnosing any instability during the process.

    Power Target: Raise the power target to 120%, so that we have the highest TDP limit available.

    Core +70 Offset (1753MHz): Finding the max core frequency is straightforward, and most samples will land within 100MHz of each other. We will start by applying a +70MHz offset. Once we start testing the card, if any crashing occurs, we can back the offset off by 10MHz each time. No harm will come to the GPU; the driver will simply recover and we can try again. In the event the system doesn't recover, we must reboot. Otherwise, continue to increase the offset by +10MHz until we experience instability.
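    That trial-and-error loop can be sketched in code. `passes_stress_test` here is a stand-in for you manually running a benchmark or game at each offset, not a real API:

```python
def find_max_core_offset(passes_stress_test, start=70, step=10, limit=300):
    """Raise the core offset in 10MHz steps until instability appears,
    returning the last offset that passed. `passes_stress_test(offset)`
    is a placeholder for a manual test run at that offset."""
    last_good = 0
    offset = start
    while offset <= limit:
        if not passes_stress_test(offset):
            break  # crash/hang: the previous offset is our ceiling
        last_good = offset
        offset += step
    return last_good

# Example: pretend this sample holds together up to a +110MHz offset.
print(find_max_core_offset(lambda mhz: mhz <= 110))  # 110
```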

    Memory: +200 Offset (11210MHz): Most 1080Tis should be able to achieve this. This can be kept conservative for the time being; the gains are limited, and we don't want to produce any red herrings with unstable memory.

    GPU Voltage 100%: Moving the voltage slider to 100% avoids hitting voltage limits when finding the max frequency.

    Custom fan curve (room/case ambient 20°C): The screenshot below shows how I have set my fan curve. The Strix runs exceptionally cool even on the default curve; however, by raising things slightly we can keep the card even cooler without much intrusion from the fans. The fans become audible at around 55%, but you will need to see what works best in your environment. If you prefer, you can also set the fans to 100% to try to find the maximum attainable frequency, although this isn't practical if noise is a concern.
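    Under the hood, a fan curve is just linear interpolation between temperature/duty points. A small sketch with illustrative points (not the exact values from my screenshot):

```python
def fan_duty(temp_c, curve):
    """Piecewise-linear fan duty (%) for a temperature, given
    (temp, duty) points sorted by temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Interpolate between the two surrounding points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Hypothetical curve: passive below 55C, ramping to full speed at 80C.
curve = [(55, 0), (65, 55), (80, 100)]
print(fan_duty(50, curve))  # 0 - still in the 0dB fan zone
print(fan_duty(70, curve))  # 70.0
```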

    [Image: Strix OC Guide_html_m34260ae0.png]

    Put it to the test

    3DMark's stress test feature lets us loop the initial graphics test up to 20 times, which takes roughly 10 minutes. What constitutes a pass or fail - assuming the test doesn't crash - will depend on whether the card can maintain its performance throughout, which is no sweat whatsoever for the Strix cooler. I recommend using the Fire Strike Extreme or Ultra settings for this (1440p and 4K respectively). The test resolution is down-sampled, so it can even be run on a 1080p panel.

    From the screenshot below, we can see that the GPU started out at 45°C and finished at 64°C, which is well within the card's stock temperature target. This gives us a consistent average frame rate throughout, with a pass rate of 99% (anything under 97% is automatically a fail) and a GPU core frequency between 2038-2050MHz. You can re-run the test at a higher core offset until you experience instability.
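    The stability score is simply the worst loop's frame rate as a percentage of the best loop's, with 97% as the pass line. A quick sketch:

```python
def stability_percent(loop_fps):
    """3DMark-style frame rate stability: worst loop vs best loop."""
    return 100.0 * min(loop_fps) / max(loop_fps)

# Hypothetical average FPS from each loop of the stress test.
loops = [81.2, 81.0, 80.9, 80.6, 80.4]
print(round(stability_percent(loops), 1))  # 99.0 - a pass
```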

    [Image: 3dmark.png]


    Once we've passed consistently, I recommend moving away from synthetic tests. Now, simply use the card in games. Different titles stress the GPU in different ways, so there's little point running the same benchmarks over and over to test for unconditional stability. If we experience issues, we can relax settings to dial out the instability.

    The extra mile - granular control

    For some time now, we've seen the base clock and boost clock plastered on spec sheets. These don't tell us the whole story, especially with Pascal. GPU Boost uses a set frequency curve based on voltage points. If we set a fixed offset using the default curve positioned by NVIDIA, we're limiting our maximum frequency, because the offset is applied to every point along the curve. Whereas this might be OK for some GPUs, on others the voltage may be insufficient to provide stability at every point. For example, if we're stable at 2100MHz at the midpoint of the curve but only 2050MHz at the very top, then we're limited to 2050MHz.

    Control of this curve was previously gated off from the user; however, with Pascal and GPU Boost 3.0 we can manually set the frequency at these voltage points, opening the path to fine-grained tuning. Through GPU Tweak II and the user-defined boost clock, we can set our core frequency at a given voltage point. For instance, my sample is comfortable with 2100MHz at around 1.035v. The catch, however, is that this method can take time. You're also not guaranteed to find any additional headroom, but if you don't try, you'll never know.
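    The benefit is easy to show numerically. With a flat offset, the point with the least headroom caps the whole curve; with per-point control, each point can sit at its own limit. A toy model (the voltages and frequencies are illustrative, not measured values from my sample):

```python
# Each voltage point has a stock frequency and a maximum stable frequency.
# (voltage, stock_mhz, max_stable_mhz) - illustrative numbers only.
curve = [
    (0.900, 1823, 1961),
    (1.000, 1936, 2100),
    (1.035, 1962, 2100),
    (1.062, 1987, 2050),
]

# Flat offset: limited by the point with the least headroom.
flat_offset = min(max_stable - stock for _, stock, max_stable in curve)
top_with_flat = curve[-1][1] + flat_offset

# Per-point tuning: each point can sit at its own stable maximum.
top_with_curve = max(max_stable for _, _, max_stable in curve)

print(flat_offset)     # 63 - gated by the 1.062v point
print(top_with_flat)   # 2050
print(top_with_curve)  # 2100 - the midpoint limit becomes reachable
```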

    [Image: voltage curve.png]

    What does this mean for our experience?

    It doesn't matter what type of gamer you are; frame rate and latency are important. Whether you're running across the war-torn fields of France wielding an M1892 rifle, or rallying through the forests of Finland in a 4-door saloon, they're the metrics for those of us who either want the best experience the industry has to offer, or a competitive edge. Whichever it is, there's no denying that stepping up the frequency slider for that little bit extra is a beckoning call, especially when it is so easy.

    On this setup - in tandem with G-Sync - extracting performance through overclocking and keeping the GPU cool ultimately results in a smoother experience. For the 3440x1440 target resolution on my personal system, one 1080Ti copes admirably thanks to G-Sync. That said, the benefit of a second card would certainly be welcome in some titles.

    Of course, most of you will be aware by now that FPS is not the only metric, and arguably not the most important. The Alienware slogan was once "FPS is life"; today we know those numbers don't tell the whole story. It's worth pointing out that these benchmarks were recorded with FRAPS, so we're seeing what's captured at the rendering level.
    Last edited by Silent Scone@ASUS; 05-27-2017 at 08:28 AM.

  2. #2 - Silent Scone@ASUS (Administrator)

    Specification
    CPU: 6850K @ 4.4GHz
    Motherboard: ASUS X99 Deluxe II
    PSU: EVGA 1200W
    RAM: GSKILL F4-3200C14Q-32GTZ
    Storage: M.2 Samsung PM961 512GB NVMe
    GPU: ASUS Strix 1080Ti Gaming

    Avg. Time (ms) = average time between frames
    1% Time (ms) = frame time for the worst 1% (best 99%) of frames
    0.1% Time (ms) = frame time for the worst 0.1% (best 99.9%) of frames
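    As a rough idea of how these are derived from a raw FRAPS frame time log, under the common "average of the worst n% of frames" interpretation (synthetic data, not the runs below):

```python
def frame_time_metrics(frame_times_ms):
    """Average, 1% and 0.1% worst frame times from a list of
    per-frame times in milliseconds (as dumped by FRAPS)."""
    ordered = sorted(frame_times_ms)  # best (lowest) first
    avg = sum(ordered) / len(ordered)

    def worst(pct):
        n = max(1, int(len(ordered) * pct / 100))
        tail = ordered[-n:]  # the worst n frames
        return sum(tail) / len(tail)

    return avg, worst(1), worst(0.1)

# Synthetic sample: 1000 frames at 10ms with ten slow 20ms frames.
times = [10.0] * 990 + [20.0] * 10
avg, p1, p01 = frame_time_metrics(times)
print(avg, p1, p01)
```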


    [Image: PreyFrameTimes.png]
    [Image: Prey Frame Times.PNG]
    [Image: SNIPERELITE FRAME TIMES.png]
    [Image: Sniper Elite Frame Times.PNG]
    [Image: tomb raider frametimes.png]
    [Image: Tomb Raider Frame Times.PNG]


    The 1080Ti is a seriously powerful card. From the results above, we can see that we have no difficulty hitting the refresh target of the 3440x1440 100Hz panel, and our highest recorded frame time latencies are far from jolting. Although these results are fairly representative of what one can expect, where the gains are best felt is distinctly individual. Ultimately, it's free performance, and that isn't something one can say about many walks of life.
    Last edited by Silent Scone@ASUS; 05-26-2017 at 09:19 PM.

  3. #3 - SlvrBullet (ROG Member)

    Great tutorial. Obviously a lot of work put in to it. Thank you.

  4. #4 - Silent Scone@ASUS (Administrator)

    Quote Originally Posted by SlvrBullet View Post
    Great tutorial. Obviously a lot of work put in to it. Thank you.
    Thanks, you're most welcome

  5. #5 - Raja@ASUS (Tech Marketing Manager HQ)

    Made sticky. Thanks Michael.

  6. #6 - Catalonia (ROG Guru: White Belt)

    Amazing Job.
    Thank you very much for this tutorial. I've got my Asus Strix 1080 Ti OC Edition today and I'm already playing with all these settings.

    I'm coming from an Aorus 1080 Ti Xtreme Edition, and tweaked with Afterburner I was able to reach 10k in 3DMark at 1440p. Now with my Strix I'm unable to reach more than 9,750 points. I'm trying to understand why I'm having such a performance drop compared to the Aorus, and I currently have a strong OC applied:

    [Image: u9Ao8O7AReGTie5wfHDvfQ.png]

    Will keep trying. Thank you again!


  7. #7 - Silent Scone@ASUS (Administrator)

    Quote Originally Posted by Catalonia View Post
    I'm coming from an Aorus 1080 Ti Xtreme Edition... Now with my Strix I'm unable to reach more than 9,750 points. I'm trying to understand why I'm having such a performance drop compared to the Aorus.
    Thanks,

    Hard to say without knowing the frequencies. You can try adjusting the voltage curve manually if hitting a wall.

  8. #8 - JayvH (New ROGer)

    Hello, I got my Strix 1080 Ti (non-OC) about a month ago, which replaced my old GTX 770, and I tried to overclock my GPU for the first time ever using your guide. First of all, a big thank you for the work you've put in.

    I wonder about two things; maybe you or someone else here could help me out. I'm an absolute rookie at GPU overclocks.

    1. My standard clock differs from yours. You start with 1683MHz and make a +70 offset for your first step. I start with 1607MHz and couldn't take my clock to 1753MHz; 1720MHz seems to be the absolute limit. I used the Time Spy benchmark from 3DMark and it says it reached 1987MHz at the peak.
    2. I upped the GPU voltage to 100% as you said in the tutorial. Should this be taken down to lower values to use less power, like the vcore of a CPU?

  9. #9 - Oxizee (ROG Enthusiast)

    @JayvH: Guess you and I didn't win the silicon race.

    My card is at +52 GPU (1735 / 1974 boosted) with GPU Voltage 0, Memory Clock +62, Power Target +120. Still looking for the limit. When I tried boosting my GPU above +55/60 the driver started to crash. Going up in steps now.
    Last edited by Oxizee; 09-26-2017 at 08:26 PM.

  10. #10 - Silent Scone@ASUS (Administrator)

    Quote Originally Posted by JayvH View Post
    1. My standard clock differs from yours... 1720MHz seems to be the absolute limit.
    2. Should this be taken down to lower values to use less power, like the vcore of a CPU?
    1) 1683MHz is the default boost clock whilst the card is in Gaming Mode. The maximum reported clocks you're seeing in 3DMark are the boost clock peak.

    2) Sometimes, setting a lower maximum voltage value can help to avoid hitting the power limit 'early'. This is why customising the voltage curve can be beneficial on some cards.
    Last edited by Silent Scone@ASUS; 09-26-2017 at 12:51 PM.
