Page 2 of 2 FirstFirst 1 2
Results 11 to 18 of 18
  1. #11
    Duke of Alinor
    ROG Enthusiast
    Join Date
    Nov 2013
    Reputation
    10
    Posts
    67

    G-Sync is just gouging the market until the VESA standard comes out.
    But it will not matter to me, the small improvement in smoothness is not as beneficial as 4K resolution. I have played both and with enough GPU power (two 290x or better) 4K wins.
    Give it a try, when you see your opponent from a farther distance you have more advantage than almost imperceptibly smoother game play.

  2. #12
    ROG Guru: Yellow Belt
    pennyboy PC Specs
    Motherboard: ASUS Maximus XI Hero
    Processor: Intel i9 9900k
    Memory (part number): G.Skill 32gb Trident Z RGB
    Graphics Card #1: EVGA XC Ultra GeForce RTX 2080 Ti
    Sound Card: Asus Supreme FX Onboard
    Monitor: ASUS PG278Q Swift 27" GSYNC
    Storage #1: Samsung 970 Evo 1 tb
    Storage #2: Samsung 970 Evo 500 gb
    CPU Cooler: CoolerMaster 360ML RGB
    Case: Corsair 900D
    Power Supply: Corsair 1200 AXi
    Keyboard: Razer BlackWidow Chroma
    Mouse: Razer Diamondback Chroma
    Headset: Razer Kraken Chroma
    Mouse Pad: Razer Scarab - Battlefield 3 theme
    Headset/Speakers: Logitech Z-680
    OS: Windows 10 x64 1809
    Join Date
    Sep 2013
    Reputation
    10
    Posts
    173

    Quote Originally Posted by Duke of Alinor View Post
    G-Sync is just gouging the market until the VESA standard comes out.
    But it will not matter to me, the small improvement in smoothness is not as beneficial as 4K resolution. I have played both and with enough GPU power (two 290x or better) 4K wins.
    Give it a try, when you see your opponent from a farther distance you have more advantage than almost imperceptibly smoother game play.
    Hmm, interesting that you prefer 4K; for me, I far prefer smoothness over 4K!!

  3. #13
    ROG Guru: Yellow Belt
    Volt PC Specs
    Motherboard: GIGABYTE G1.Sniper Z97 ATX
    Processor: i7-4790K 4.0 GHz
    Memory (part number): ADATA 32GB (8GBx4) DDR3/1600MHz
    Graphics Card #1: nVidia GeForce GTX 1080 w/8 GB
    Sound Card: Motherboard G1 - Creative Sound Core3D
    Monitor: ASUS ROG PG348Q Swift
    Storage #1: 250 GB Samsung 850 Pro
    Storage #2: 2TB Western Digital Caviar Black SATA-III
    CPU Cooler: Asetek 550LC 120mm Liquid Cooling
    Case: Corsair Obsidian 750D
    Keyboard: Corsair K70 STRAFE
    Mouse: Logitech Pro
    OS: Win 10
    Network Router: ASUS RT-AC88U
    Join Date
    Nov 2014
    Reputation
    12
    Posts
    180

    Duke and pennyboy have good points. You really should try it out and see for yourself which you like better. It's a subjective thing.

    Although 4K + G-Sync would be even better!

    ASUS announced at CES last month that it's coming out with a 27" 4K/60Hz (max refresh rate) G-Sync IPS monitor.

    http://rog.asus.com/393642015/gaming...cd-with-gsync/

  4. #14
    ROG Guru: Yellow Belt
    pennyboy
    Join Date
    Sep 2013
    Reputation
    10
    Posts
    173

    Quote Originally Posted by Volt View Post
    Duke and pennyboy have good points. You really should try it out and see for yourself which you like better. It's a subjective thing.

    Although 4K + G-Sync would be even better!

    ASUS announced at CES last month that it's coming out with a 27" 4K/60Hz (max refresh rate) G-Sync IPS monitor.

    http://rog.asus.com/393642015/gaming...cd-with-gsync/
    Cool, I didn't know this was coming out. Not a monitor I'm interested in though; 60Hz is not fast enough for me. Great for people who want an IPS panel though!

  5. #15
    New ROGer
    xenonite PC Specs
    Laptop (Model): Custom Sager NP8258-S; Intel i7-4940MX; NVIDIA GTX 980M 8GB; 32GB DDR3 1866MHz 9-9-9-27; 2x256GB RAI
    Motherboard: ASUS ROG Rampage V Extreme
    Processor: Intel Core i7-5960X @ 4.6GHz, VCORE @ 1.4V
    Memory (part number): 32GB @ 3300MHz 1T 15-15-15-34; 2 x F4-3200C15Q-16GRKD
    Graphics Card #1: GTX 780TI GAMING 3G LE; skyn3t VBIOS @ +250 CORE (1600MHz average), 8GHz effective RAM clock; no-SLI
    Graphics Card #2: GTX 780TI GAMING 3G LE; skyn3t VBIOS @ +250 CORE (1600MHz average), 8GHz effective RAM clock; no-SLI
    Graphics Card #3: SLI almost always disabled
    Sound Card: ASUS ESSENCE ONE with custom TI OPA1612 OpAMPs
    Monitor: ASUS ROG Swift PG278Q
    Storage #1: 1TB; 2x 512GB SSD Samsung 850PRO; RAID 0; OS and main APPS disk.
    Storage #2: 16TB; 4x 4TB WD Black 7200 RPM, 64MB HDD; RAID 0; DATA and secondary APPS disk.
    CPU Cooler: Swiftech MCP35X2; 2x XSPC EX240 Crossflow & 2x EX140; 2x EK-FC780 GTX Ti; EK Supremacy EVO ELITE
    Case: Cooler Master STRYKER
    Power Supply: CORSAIR AX1200i
    Keyboard: Coolermaster Quickfire Rapid RED 10 key less & DAS Keyboard CHERRY MX Brown
    Mouse: LOGITECH G502 PROTEUS CORE
    Headset: SENNHEISER HD800
    OS: Windows 7, 64-BIT
    Network Router: NETGEAR AC3200 Nighthawk X6

    Join Date
    Feb 2015
    Reputation
    10
    Posts
    2

    Answer for OP and pennyboy and Reply to Duke of Alinor

    Quote Originally Posted by Gphish View Post
    Is this technology really only of benefit to those who don't have the GPU power? I never have a problem; I'm running two 690's and have never had any issues except for Battlefield 3, but that was taken care of long ago. Interested in the feedback.
    Well, as I understand it G-Sync was designed for two primary use cases:
    1) Gamers who are bothered by both screen-tearing and V-Sync input lag, whose maximum frame-rate frequently exceeds their monitor's refresh-rate.
    2) Reducing the "judder" associated with a varying, mismatched frame- and refresh-rate.

    Related to no.1:
    Quote Originally Posted by Gphish View Post
    How is it possible to go beyond the limitations of the monitor's refresh rate? And what is your support for this? Where do I find actual proof of this?
    Quote Originally Posted by pennyboy View Post
    Easily, especially if you have high-end video card(s). Just run up FRAPS and you can see: if your monitor is 60Hz then any frame rate exceeding 60fps is exceeding the limitation of your monitor... there is the proof.
    Trust me, you want G-Sync, but if you don't know any better, then you don't know what you're missing out on. Hence you can save $$. But once you have G-Sync you won't go back. Just my 2c

    Cheers
    Without G-Sync a computer monitor usually accepts input frames as fast as possible. This leads to tearing if your frame-rate is higher than the refresh-rate, since the next frame begins to be drawn before the current frame has finished drawing, creating an interface between the two frames. This interface is what is called a 'tear' when the two frames' contents are sufficiently different at the joining position that the final composite image does not come together seamlessly.

    G-Sync prevents this 'tearing' from occurring by continuously polling the monitor and only sending the next frame when the monitor has completely finished drawing the current frame. Consequently, G-Sync does put a certain amount of back-pressure on the rendering pipeline, potentially leading to some increased input latency (lag), but this amount of latency is only a small fraction of the latency added by V-Sync.

    G-Sync, then, is basically a low-latency form of V-Sync in that it also caps your maximum frame-rate to your maximum refresh-rate, without dropping your frame-rate to integer fractions of your refresh-rate as soon as the former drops below the latter. Technically, running without any form of G-Sync or V-Sync does allow your frame-rate to "bypass" the refresh-rate limit of your monitor; however, since the monitor can still only display one complete frame per refresh interval (the reciprocal of the refresh-rate), the output effect will be two or more incomplete frames, from different times in the rendering pipeline, simply chained together to form one frame.
    Personally I always use G-Sync, otherwise, lining up a head-shot when the enemy is moving fairly quickly would involve aiming at a head that is not actually connected to its body, which for some reason or another, I find quite hard to actually pull off.
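
    To put rough numbers on the "integer fractions" point, here is a quick back-of-the-envelope sketch (Python; the render rates and the 144Hz panel are made up for illustration, not measurements) of how double-buffered V-Sync quantizes the displayed rate, while G-Sync simply follows the render rate up to the panel's maximum:

    Code:
    import math

    def vsync_fps(render_fps, refresh_hz=144):
        # Double-buffered V-Sync: a frame that misses a refresh waits for the next
        # one, so each frame is held for a whole number of refresh intervals.
        refreshes_per_frame = math.ceil(refresh_hz / render_fps)
        return refresh_hz / refreshes_per_frame

    def gsync_fps(render_fps, refresh_hz=144):
        # G-Sync: each frame is scanned out as soon as it is ready, capped at the
        # panel's maximum refresh-rate.
        return min(render_fps, refresh_hz)

    for fps in (160, 144, 120, 90, 60, 45):
        print(f"render {fps:>3} fps -> V-Sync {vsync_fps(fps):6.1f} fps, "
              f"G-Sync {gsync_fps(fps):6.1f} fps")

    On a 144Hz panel this drops a 120 fps render rate to 72 fps under V-Sync, while G-Sync keeps it at 120 fps.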

    As for the above second use case:

    I have tested the effect of sub-60Hz G-Sync by limiting my maximum frame-rate to 60 fps and also applying an amount of anti-aliasing sufficient to ensure that the instantaneous frame-rate varies between 40 and 60 fps. After having extensively tested this setup with a variety of games and applications (including Nvidia's "Pendulum" demo) I have come to the unfortunate, but equally unsurprising, conclusion that the jerky motion inherent to sub-60 fps video is enough to almost completely negate the benefit of G-Sync removing rate-mismatch-induced judder. G-Sync did improve the smoothness of the video/games, but it was never able to produce a perfectly smooth (at least to my naked eye) illusion of motion, no matter what frame-rate I tried.
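
    To picture the rate-mismatch judder that G-Sync removes in this scenario, here is a tiny simulation (Python, with randomly generated frame-times purely for illustration) of how long each finished frame has to wait for the next tick of a fixed 60Hz display; a variable-refresh display scans the frame out immediately, so that varying wait, anywhere from 0 to about 16.7ms, is the judder in question:

    Code:
    import random

    random.seed(1)
    REFRESH = 1.0 / 60            # fixed 60Hz refresh interval, in seconds

    t = 0.0
    waits = []                    # how long each finished frame waits for a refresh tick
    for _ in range(1000):
        t += 1.0 / random.uniform(40, 60)         # frame finishes rendering at time t
        next_tick = (int(t / REFRESH) + 1) * REFRESH
        waits.append(next_tick - t)               # held back until the next 60Hz tick

    print(f"fixed 60Hz: average wait {1000 * sum(waits) / len(waits):.2f} ms, "
          f"worst {1000 * max(waits):.2f} ms")
    print("variable refresh: ~0 ms (each frame is scanned out as soon as it completes)")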

    The best description I can provide of the effect of sub-60Hz G-Sync is that it is similar to trying to watch a movie without any frame interpolation to smooth out the motion, which, for me, inevitably leads to a very unpleasant headache together with nausea and confusion. As most people do not experience such motion sickness, even after several hours of continuously viewing a normal 2D video/game at average frame-rates higher than about 15 fps, I do not believe that it will have much of an impact on the total commercial success of the technology; however, since the improvement in apparent smoothness is so limited for sub-60Hz content, I still believe that the largest use case for this technology will be replacing V-Sync.

    As an interesting aside, I noticed during my testing that a G-Sync enabled setup was able to portray 'playable' motion (not perfect at all, but smooth enough for me to reliably detect the correct direction and relative speed with which a target moved, while only minimally increasing the time that I required to position my crosshairs on the target) from frame-rates as low as 110 fps up to the standard 144 fps, without any intermediate frame-rate exhibiting an excessive amount of judder.

    In conclusion, I believe that G-Sync has a great ability to improve the smoothness of high-frame-rate play, but I do not see it significantly improving the smoothness that can be achieved by a 60Hz-limited display.



    Quote Originally Posted by Duke of Alinor View Post
    G-Sync is just gouging the market until the VESA standard comes out.
    But it will not matter to me, the small improvement in smoothness is not as beneficial as 4K resolution. I have played both and with enough GPU power (two 290x or better) 4K wins.
    Give it a try, when you see your opponent from a farther distance you have more advantage than almost imperceptibly smoother game play.
    I think the real problem that you experienced is not one of 144Hz G-Sync being "imperceptibly" smoother than sub-60 fps on a 4K monitor. Since both your eyesight and your perception are clearly good enough to make use of the extra ppi provided by the 4K display over a 1440p display, I find it highly unlikely that you are also biologically unable to notice the improved motion resolution offered by the higher-rate sample-and-hold output of a 1440p, 144Hz G-Sync monitor.

    Now, I am not arguing that you were deliberately trying to make G-Sync look bad to further voice your frustration over the additional cost of acquiring G-Sync support; I actually believe that what you consider to be "enough GPU power" was not up to the task of providing a sufficiently low output frametime variance to demonstrate the improved smoothness of 144Hz video over 60Hz video. Multi-GPU rendering is extremely uneven: even with the improvements made in successive 'frame-pacing' driver updates, its frametime variance can easily reach an average of 3ms (for reference, check out any of the well-known articles on FCAT-based testing of frametime variance by PC hardware review sites).

    Now, a 3ms average frametime variance might not sound like a lot (and most of those same review sites that switched to FCAT-based testing also agree that it is an almost imperceptibly short variance), but that is because almost all comparisons have been focused on standard 60Hz displays. 3ms on the frametime of a 60Hz frame (16.67ms) is only an 18% change; however, when using a 144Hz stream, the frametime is reduced to 6.94ms and the same 3ms variance suddenly becomes a 43% change in frame presentation timing. Such an unstable output frame-rate, even at a low average frametime, will obviously appear quite stuttery, while the frametime variance also directly offsets the increased motion resolution, provided by the increased sample-and-hold frequency, by injecting temporal uncertainty (also known as time-noise or jitter) into the output.
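
    Spelled out, those percentages come straight from the frame-times; a quick sketch (Python) just to show the arithmetic:

    Code:
    # A fixed 3ms frametime variance is a much larger *relative* timing error
    # at 144Hz than at 60Hz.
    VARIANCE_MS = 3.0

    for refresh_hz in (60, 144):
        frametime_ms = 1000.0 / refresh_hz
        print(f"{refresh_hz}Hz: frametime {frametime_ms:.2f} ms, "
              f"3 ms variance = {100 * VARIANCE_MS / frametime_ms:.0f}% of a frame")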

    Basically, what you would need to accurately assess the personal benefit G-Sync offers you (if any) is a very stable high-frame-rate output to compare to an equally stable sub-60 fps output.
    I find it quite unlikely that you will then still fail to see a significant improvement in motion resolution (that is, the direct increase in spatial resolution of a displayed moving object due to an increase in temporal resolution as well as the LCD display's sample-and-hold output sample frequency); however, should that be the case, then (and only then) should you unequivocally state that 144Hz G-Sync holds no perceptible improvement over sub-60Hz video (whether V-Sync'd or not).
    Until then, please refrain from trying to demotivate an already stagnating industry based solely on statements that you have not even thoroughly verified yourself.

  6. #16
    ROG Guru: Yellow Belt
    pennyboy
    Join Date
    Sep 2013
    Reputation
    10
    Posts
    173

    Quote Originally Posted by Duke of Alinor View Post
    G-Sync is just gouging the market until the VESA standard comes out.
    But it will not matter to me, the small improvement in smoothness is not as beneficial as 4K resolution. I have played both and with enough GPU power (two 290x or better) 4K wins.
    Give it a try, when you see your opponent from a farther distance you have more advantage than almost imperceptibly smoother game play.
    The thing is that Nvidia is more innovative than AMD; the whole idea of synchronizing the display's refresh rate to the GPU would probably not exist today if Nvidia hadn't come up with it. AMD seems happy to always play catch-up. Crossfire, 3D and so many other AMD techs came out after Nvidia's. I am not trying to sound like a fanboy or say AMD does not innovate at all, as they do, but in my opinion Nvidia gets the products consumers actually want to market first. SLI, G-Sync, 3D Vision and GeForce Experience all came before AMD had a similar product.
    At the end of the day, if you are a company and come up with something new that costs money to develop, you need to make money on your investment. If consumers want it, then why not sell it at a premium? Nvidia is a business; they are in the market to make money, right? Like I said in my previous post, G-Sync matters way more to me than 4K; in fact I don't really think 4K is that great at all. I have seen the monitors running games and I wasn't as impressed as when I first saw G-Sync running. 1440p is a perfect resolution for me, and I demand 144Hz, as 60Hz is not quick enough for FPS games in my opinion.

  7. #17
    ROG Guru: Yellow Belt
    pennyboy
    Join Date
    Sep 2013
    Reputation
    10
    Posts
    173

    Quote Originally Posted by xenonite View Post
    Well, as I understand it G-Sync was designed for two primary use cases:
    1) Gamers who are bothered by both screen-tearing and V-Sync input lag, whose maximum frame-rate frequently exceeds their monitor's refresh-rate.
    2) Reducing the "judder" associated with a varying, mismatched frame- and refresh-rate.

    Related to no.1:



    Without G-Sync a computer monitor usually accepts input frames as fast as possible. This leads to tearing if your frame-rate is higher than the refresh-rate, since the next frame begins to be drawn before the current frame has finished drawing, creating an interface between the two frames. This interface is what is called a 'tear' when the two frames' contents are sufficiently different at the joining position that the final composite image does not come together seamlessly.

    G-Sync prevents this 'tearing' from occurring by continuously polling the monitor and only sending the next frame when the monitor has completely finished drawing the current frame. Consequently, G-Sync does put a certain amount of back-pressure on the rendering pipeline, potentially leading to some increased input latency (lag), but this amount of latency is only a small fraction of the latency added by V-Sync.

    G-Sync, then, is basically a low-latency form of V-Sync in that it also caps your maximum frame-rate to your maximum refresh-rate, without dropping your frame-rate to integer fractions of your refresh-rate as soon as the former drops below the latter. Technically, running without any form of G-Sync or V-Sync does allow your frame-rate to "bypass" the refresh-rate limit of your monitor; however, since the monitor can still only display one complete frame per refresh interval (the reciprocal of the refresh-rate), the output effect will be two or more incomplete frames, from different times in the rendering pipeline, simply chained together to form one frame.
    Personally I always use G-Sync, otherwise, lining up a head-shot when the enemy is moving fairly quickly would involve aiming at a head that is not actually connected to its body, which for some reason or another, I find quite hard to actually pull off.

    As for the above second use case:

    I have tested the effect of sub-60Hz G-Sync by limiting my maximum frame-rate to 60 fps and also applying an amount of anti-aliasing sufficient to ensure that the instantaneous frame-rate varies between 40 and 60 fps. After having extensively tested this setup with a variety of games and applications (including Nvidia's "Pendulum" demo) I have come to the unfortunate, but equally unsurprising, conclusion that the jerky motion inherent to sub-60 fps video is enough to almost completely negate the benefit of G-Sync removing rate-mismatch-induced judder. G-Sync did improve the smoothness of the video/games, but it was never able to produce a perfectly smooth (at least to my naked eye) illusion of motion, no matter what frame-rate I tried.

    The best description I can provide of the effect of sub-60Hz G-Sync is that it is similar to trying to watch a movie without any frame interpolation to smooth out the motion, which, for me, inevitably leads to a very unpleasant headache together with nausea and confusion. As most people do not experience such motion sickness, even after several hours of continuously viewing a normal 2D video/game at average frame-rates higher than about 15 fps, I do not believe that it will have much of an impact on the total commercial success of the technology; however, since the improvement in apparent smoothness is so limited for sub-60Hz content, I still believe that the largest use case for this technology will be replacing V-Sync.

    As an interesting aside, I noticed during my testing that a G-Sync enabled setup was able to portray 'playable' motion (not perfect at all, but smooth enough for me to reliably detect the correct direction and relative speed with which a target moved, while only minimally increasing the time that I required to position my crosshairs on the target) from frame-rates as low as 110 fps up to the standard 144 fps, without any intermediate frame-rate exhibiting an excessive amount of judder.

    In conclusion, I believe that G-Sync has a great ability to improve the smoothness of high-frame-rate play, but I do not see it significantly improving the smoothness that can be achieved by a 60Hz-limited display.





    I think the real problem that you experienced is not one of 144Hz G-Sync being "imperceptibly" smoother than sub-60 fps on a 4K monitor. Since both your eyesight and your perception are clearly good enough to make use of the extra ppi provided by the 4K display over a 1440p display, I find it highly unlikely that you are also biologically unable to notice the improved motion resolution offered by the higher-rate sample-and-hold output of a 1440p, 144Hz G-Sync monitor.

    Now, I am not arguing that you were deliberately trying to make G-Sync look bad to further voice your frustration over the additional cost of acquiring G-Sync support; I actually believe that what you consider to be "enough GPU power" was not up to the task of providing a sufficiently low output frametime variance to demonstrate the improved smoothness of 144Hz video over 60Hz video. Multi-GPU rendering is extremely uneven: even with the improvements made in successive 'frame-pacing' driver updates, its frametime variance can easily reach an average of 3ms (for reference, check out any of the well-known articles on FCAT-based testing of frametime variance by PC hardware review sites).

    Now, a 3ms average frametime variance might not sound like a lot (and most of those same review sites that switched to FCAT-based testing also agree that it is an almost imperceptibly short variance), but that is because almost all comparisons have been focused on standard 60Hz displays. 3ms on the frametime of a 60Hz frame (16.67ms) is only an 18% change; however, when using a 144Hz stream, the frametime is reduced to 6.94ms and the same 3ms variance suddenly becomes a 43% change in frame presentation timing. Such an unstable output frame-rate, even at a low average frametime, will obviously appear quite stuttery, while the frametime variance also directly offsets the increased motion resolution, provided by the increased sample-and-hold frequency, by injecting temporal uncertainty (also known as time-noise or jitter) into the output.

    Basically, what you would need to accurately assess the personal benefit G-Sync offers you (if any) is a very stable high-frame-rate output to compare to an equally stable sub-60 fps output.
    I find it quite unlikely that you will then still fail to see a significant improvement in motion resolution (that is, the direct increase in spatial resolution of a displayed moving object due to an increase in temporal resolution as well as the LCD display's sample-and-hold output sample frequency); however, should that be the case, then (and only then) should you unequivocally state that 144Hz G-Sync holds no perceptible improvement over sub-60Hz video (whether V-Sync'd or not).
    Until then, please refrain from trying to demotivate an already stagnating industry based solely on statements that you have not even thoroughly verified yourself.
    A well-written post, I enjoyed reading it. I probably should have explained what G-Sync was, as I didn't in my post. My bad there!
    Last edited by pennyboy; 02-17-2015 at 03:06 AM.

  8. #18
    New ROGer
    xenonite

    Join Date
    Feb 2015
    Reputation
    10
    Posts
    2

    Thank you pennyboy, I really appreciate you taking the time to read through all of that.

    Also, I wholeheartedly agree with your assessment regarding G-Sync being a very innovative technology that Nvidia developed.
    I would just like to add an extra point of consideration: Time-to-market.
    Yes, of course Nvidia has to recoup their immense R&D expenses, but another contributing factor to the (admittedly quite hefty) price of G-Sync support is the fact that Nvidia chose to throw a not-insignificant amount of processing power at the problem (i.e. the first G-Sync module housed a custom FPGA). While a lot of people like to point out that a FreeSync implementation has the potential to be much cheaper than a comparable G-Sync one, it will also take a very long time to develop, validate and gather industry-wide support for a brand-new standard.
    Nvidia's solution allowed G-Sync-enabled monitors to get to market years in advance of the first FreeSync display, all while not increasing the prices of their graphics cards (for those who do not want G-Sync) at all, since the custom hardware module resides completely inside the display.

    When FreeSync finally does make its debut, Nvidia will of course also be able to support it on their own products. At that time, if they choose not to implement FreeSync, and if FreeSync turns out to be superior to G-Sync in both cost and quality, then I believe we can all justifiably be quite upset with their choice. Until then, though, G-Sync remains the only option; and I am very glad indeed that we as consumers at least already have it.
