tinnitus wrote:The Monitor would work but I thought that freesync works only with AMD GPU. I'm using an AMD GPU and freesync Monitor it works great.
Don't let marketing define the technology.
It's a common mistake, especially here in the US, where the vast majority of consumers stay loyal to a brand even when another entity, often foreign investors, purchases it (and moves production off-shore). Many times things get a brand name when they are really open standards. In fact, AMD's "FreeSync" branding was, more or less, a reaction to nVidia's G-Sync marketing.
E.g., the classic example I had from the '90s ...
... when I went to Best Buy here in the US and asked for a "Null Modem" cable. It took me 5 minutes to explain what it was; one sales guy said, "Oh, that's a LapLink cable," and another almost got into an argument with him, insisting, "No, that's a Microsoft INTERLNK cable." I already knew a LapLink serial cable (not to be confused with the faster parallel option) was a "Null Modem" cable, but sure enough, Microsoft was trying to re-brand it as an INTERLNK cable.
Microsoft was a huge investor in Best Buy, and controlled their distribution in the mid-to-late '90s -- including removing all Apple products from the shelf, sans MS Office for Mac. It was the iPod that forced Microsoft to allow Apple products back at Best Buy. This was also when Microsoft stepped back from subsidizing retail, as it was getting costly for them, and Linux solutions took over most retail outfits, including Best Buy.
Microsoft understands that, in the US, 90% of consumer knowledge comes from the superstore or, in the Amazon era, the on-line distributor. This goes back to the early '80s when Coke and Pepsi started subsidizing everyone in the US, from grocery stores to restaurants. Huawei is learning this first hand in the US handset market.
Heck, nVidia learned it couldn't break the Apple-Google-Samsung superstore/distribution model when it introduced its first tablet (which was the best tablet on the market, and one of the cheapest -- including LTE support), yet could only sell it through game stores as a "Gaming Tablet" (even though it blew away regular tablets). So the Google Pixel C is basically the nVidia Shield Tablet "2", with a $300 mark-up by Google.
That is the US market, in a nutshell ... and also why you hear "AMD FreeSync," and not "VESA Adaptive-Sync," especially in markets that are heavily influenced by US consumerism.
I'd say more, but I'm under NDA on many of them ... including one case that was costing Microsoft $100M/year, to the point they were willing to subsidize $50M/year of it, just to keep Linux out of their 'last bastion of retail.'
The London Stock Exchange (LSE) from 2001-2008 was another example, where Microsoft subsidized the integrator, Accenture. In the end, Microsoft blamed Accenture for the loss of the LSE, even though Accenture was, essentially, acting as Microsoft. No major exchange has run Windows since, especially with Linux 2.6 being two orders of magnitude (100x) faster at soft real-time, and NT falling way, way behind even the original, late-'90s Linux 2.0/2.2 v. NT comparison. It's why Microsoft's own Azure cloud is heavily Linux at its core software-defined infrastructure too.
Same goes for the majority of mobile and set-top interfaces, with the PC being the last bastion. Most people just assume Linux is not as 'user friendly,' even though they use more Linux interfaces every day than Windows ones. Heck, Windows is way, way behind Linux and Mac when it comes to scaling at 4K+. Windows apps are completely legacy in comparison.
Again ... it's because consumers are so focused on the superstore and, now, the on-line distributor profit model. The software store model is the future, which is why Microsoft is making such a hard push to get everyone onto their cloud and into their store, while Valve saw it coming long ago and came up with its Linux-based SteamOS.
Which explains why everyone knows when they are using a Microsoft system, but rarely notices the others. I work at a smaller, regional bank now (so much better work-life balance), and trust me, we finally dumped Microsoft for several capabilities because they refused to let us re-brand them as our own.
Adaptive-Sync is a VESA standard that is part of DisplayPort 1.2a and later. AMD has long been an early and primary supporter of DisplayPort, among other technologies.** AMD just branded it and enhanced it, including making their HDMI outputs support the Adaptive-Sync protocol. Those 'enhancements' have made it back into the standards because ... tada ... AMD works with VESA, and doesn't claim any Intellectual Property (IP) ownership.
That's how most standards work: vendors agreeing to work with one another, and not holding the consumer hostage.
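You can even see that neutrality on a Linux box: on a reasonably recent kernel, the DRM layer exposes adaptive-refresh capability as a plain connector property called "vrr_capable", no vendor branding attached. Below is a minimal sketch of my own (not from AMD or VESA) that walks the connectors on /dev/dri/card0 and prints that property; the device node and a libdrm development install are assumptions on my part.

[code]
/*
 * Minimal sketch: list DRM connectors and print the "vrr_capable" property
 * the kernel exposes for Adaptive-Sync capable displays.
 * Assumes /dev/dri/card0 exists; build with something like:
 *   gcc vrr_check.c -o vrr_check $(pkg-config --cflags --libs libdrm)
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);   /* assumed device node */
    if (fd < 0) {
        perror("open /dev/dri/card0");
        return 1;
    }

    drmModeResPtr res = drmModeGetResources(fd);
    if (!res) {
        fprintf(stderr, "drmModeGetResources failed (no KMS driver?)\n");
        close(fd);
        return 1;
    }

    for (int i = 0; i < res->count_connectors; i++) {
        drmModeConnectorPtr conn = drmModeGetConnector(fd, res->connectors[i]);
        if (!conn)
            continue;

        if (conn->connection == DRM_MODE_CONNECTED) {
            /* Walk this connector's properties looking for "vrr_capable". */
            for (int j = 0; j < conn->count_props; j++) {
                drmModePropertyPtr prop = drmModeGetProperty(fd, conn->props[j]);
                if (prop && strcmp(prop->name, "vrr_capable") == 0)
                    printf("connector %u: vrr_capable = %llu\n",
                           (unsigned)conn->connector_id,
                           (unsigned long long)conn->prop_values[j]);
                drmModeFreeProperty(prop);
            }
        }
        drmModeFreeConnector(conn);
    }

    drmModeFreeResources(res);
    close(fd);
    return 0;
}
[/code]

If the property shows up and reads 1, the kernel believes the attached display can do adaptive refresh, whatever name is printed on the box.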
It's also very common for VESA protocols from newer connection standards to be supported on older interfaces, and for older protocols to be updated on newer ones. DDC and EDID are great examples: the same display identification and control channels ended up supported on both analog and digital connections. That way the end consumer, or tweaker for that matter, doesn't have to get involved with what port does what, etc...
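EDID is also where a monitor advertises its supported refresh range, which is roughly what an adaptive-refresh driver uses to know how far it can scale. As a rough sketch, you can pull the EDID the kernel already fetched over DDC out of sysfs and decode the Display Range Limits descriptor yourself; the connector path below is an assumption (look in /sys/class/drm/ for yours), and the EDID 1.4 "+255 Hz" offset flags are ignored for brevity.

[code]
/* Read a monitor's EDID from sysfs and print the vertical/horizontal range
 * advertised in the Display Range Limits descriptor (tag 0xFD). */
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *path = "/sys/class/drm/card0-DP-1/edid";   /* assumed connector */
    static const unsigned char magic[8] =
        { 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00 };
    unsigned char edid[256];

    FILE *f = fopen(path, "rb");
    if (!f) {
        perror(path);
        return 1;
    }
    size_t len = fread(edid, 1, sizeof(edid), f);
    fclose(f);

    if (len < 128 || memcmp(edid, magic, sizeof(magic)) != 0) {
        fprintf(stderr, "no valid EDID base block at %s\n", path);
        return 1;
    }

    /* Four 18-byte descriptors live at offsets 54, 72, 90 and 108.
     * A display descriptor starts 00 00 00 <tag>; 0xFD = range limits. */
    for (int off = 54; off <= 108; off += 18) {
        const unsigned char *d = &edid[off];
        if (d[0] == 0 && d[1] == 0 && d[2] == 0 && d[3] == 0xFD) {
            printf("vertical refresh range:  %u-%u Hz\n",
                   (unsigned)d[5], (unsigned)d[6]);
            printf("horizontal scan range:   %u-%u kHz\n",
                   (unsigned)d[7], (unsigned)d[8]);
            return 0;
        }
    }

    fprintf(stderr, "no Display Range Limits descriptor found\n");
    return 1;
}
[/code]

The same blob is readable whether the panel hangs off DisplayPort, HDMI, DVI or even VGA, which is exactly the point about the protocol outliving the connector.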
nVidia just chooses not to support the VESA Adaptive-Sync standard, especially not in DisplayPort, which has had it since 1.2a (which is over 3 years old now ... old enough for the GeForce 900 series, let alone the 10 series). That's why people assume it's an AMD-only thing, even though the latest Intel GPUs are finally supporting the VESA Adaptive-Sync standard. Again, Intel is often very late to the party, purposely. Chipzilla is all about those margins.**
**Intel is notorious for 'holding back' support ...
... in not just their I/O peripheral chipsets, like FireWire, USB 3.0, Thunderbolt, etc..., but also in their GPU support. Intel was last with 4K support; AMD was first, even in its embedded chipsets. AMD was ahead of nVidia in DisplayPort support on add-in cards. AMD was also first with embedded DisplayPort (eDP), which drastically reduced cost in notebooks by removing the need for the legacy LVDS interface. That's why there were a plethora of inexpensive AMD netbooks for years, while Intel took several years to 'catch up,' especially with AMD offering a true System-on-a-Chip (SoC). Once Intel had an Atom offering, the same models that used to ship with AMD started to have Intel inside, although they weren't always faster.

In fact, Intel was also notorious for using old fabs to make chipsets, often at 65-90nm, while the CPUs were down to 32nm and even 22nm, and that was costing them major power (7-15W+) just for the chipset. AMD went fabless in the '00s, so they were using whatever TSMC or other process was latest.
These days Intel has gone more fabless for chipsets too, instead of upgrading old fabs, because they were getting beaten by AMD on chipset power. Intel now also has some Pentium/i3 products that are finally SoCs, not just Atom (which includes the Celeron/Pentium "J" and "N" products, which are Atom-based).
Heck, one might argue Microsoft is the same. They were the last with the office suite, browser, music player, etc... too.
tinnitus wrote:This may be an ASUS thing, if you crank up above 90Hz the monitor disables freesync until you set it back to 90 or lower.
It's up to the individual ODM how they wish to design their controllers.
I'm sure ASUS found it couldn't guarantee its controller would handle adaptive refresh at higher rates (i.e., tighter timings), so it limited adaptive operation to scaling between its minimum and 90Hz. But if you put the monitor in its fixed, maximum refresh mode, then it 'just works' at that rate. Of course, you can get 'tearing.' But here's the thing ...
At 120, 144 and higher Hz, it's extremely difficult for the eye to see 'tearing,' as long as the game is updating 70, 80 or 90+ times per second, let alone 100+. It's more of an issue down around 60Hz, and definitely under. Which is why most people are just buying FreeSync monitors and running a high, fixed refresh rate with nVidia cards.
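The arithmetic behind that is simple: a tear can only survive until the next scan-out, so its worst-case time on screen is one refresh period. A throwaway calculation (mine, just to illustrate) shows the spread:

[code]
/* Worst-case lifetime of a torn frame = one refresh period (1000 / Hz ms). */
#include <stdio.h>

int main(void)
{
    const double rates_hz[] = { 60.0, 90.0, 120.0, 144.0, 165.0 };
    const int n = (int)(sizeof(rates_hz) / sizeof(rates_hz[0]));

    for (int i = 0; i < n; i++)
        printf("%5.0f Hz -> refresh period %5.2f ms\n",
               rates_hz[i], 1000.0 / rates_hz[i]);
    return 0;
}
[/code]

At 60Hz a tear can linger for roughly 16.7ms; at 144Hz it's gone in under 7ms, which is why a high fixed refresh hides it so well.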
I.e., if a consumer is going to spend US$300, 400, even 500 for a monitor, they are often going to spend at least US$200, if not 400, for a video card that can push most titles at those high, fixed refresh rates. The ODMs have responded with nearly an order of magnitude more FreeSync monitors than G-Sync monitors in the past year. G-Sync requires the design to be built around nVidia's chip, which controls everything from color and magnitude to the adaptive refresh. FreeSync is just a protocol, allowing the ODM to design its own controller.
Of course, G-Sync probably looks best in all cases. But is it really worth it? Sooner or later, nVidia is going to be forced into supporting Adaptive-Sync. It's just a matter of when they finally capitulate, and then start marketing G-Sync as "better."