The arrival of the NVIDIA RTX 2080 Ti is delayed again

NVIDIA asks for patience and apologizes for the delays of its RTX 2080 Ti

A few months ago, NVIDIA presented its new graphics cards with the Turing architecture, the RTX 2080 and RTX 2080 Ti. While the former is available for sale, the latter is either scarce or out of stock entirely. In the case of the NVIDIA Founders Edition, availability has been delayed yet again.

Only a few days after the official launch and presentation (September 20), NVIDIA itself stated that the GeForce RTX 2080 Ti Founders Edition would not be available by that date. That angered many critics, because pre-orders were limited to the units on hand until stock ran out, after which the card would no longer be available from NVIDIA's website. In other words, the company knew exactly how much stock it had to offer its customers, yet for some reason every date keeps slipping.

The most charitable explanation is that NVIDIA did not expect such a strong reception and kept pre-orders open for too long; a less charitable one is that it did so deliberately to attract more customers despite the delays. In any case, this is speculation, and we will probably never know. What we do know is that NVIDIA, through an official statement, has postponed delivery of the RTX 2080 Ti to between October 5 and 9.

According to sources, the GeForce RTX 2080 Ti Founders Edition is unlikely to be in stock before mid- to late October at the earliest. The Founders Edition is priced at $1,199 against a standard MSRP of $999, yet roughly $1,200 is the going price for every 2080 Ti currently on the market, and inflated prices will likely remain the norm for weeks or even months, something to be expected with new video card launches.

All these delays point to an unspecified and unclear problem. NVIDIA simply says that it is having supply chain issues and that meeting customers' needs and dates for the RTX 2080 Ti is proving a “challenge”.

We hope Nvidia can solve this problem ASAP.

The new screenshot tool for Windows 10 found in the October 2018 Update

Microsoft released the new version of Windows 10 this week. Among the new features of the October 2018 Update is a new screen capture tool called “Snip & Sketch”. It is a blend of the screen snipping option and the Snipping Tool program that was already preinstalled in Windows 10. The new tool can be invoked either by searching for its name in the Cortana/Start menu or by using the keyboard shortcut Windows + Shift + S.
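
Since the capture lands on the clipboard, it can also be picked up programmatically. As a minimal sketch (assuming Windows and the third-party Pillow library; the file name is our own example), this Python snippet saves whatever Win + Shift + S just captured to a PNG file:

```python
# Minimal sketch: save a Snip & Sketch capture (Win + Shift + S) from the
# clipboard to a PNG. Assumes Windows and Pillow (pip install pillow).
from PIL import Image, ImageGrab

grabbed = ImageGrab.grabclipboard()   # an Image, a list of file names, or None
if isinstance(grabbed, Image.Image):
    grabbed.save("snip.png", "PNG")   # "snip.png" is an arbitrary example name
    print("Saved snip.png")
else:
    print("No bitmap on the clipboard; press Win + Shift + S first.")
```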

For now it coexists with the Snipping Tool, although Microsoft has placed a notice inside that application saying that the Snipping Tool is moving to a new home, while inviting users to try the improved snipping and annotation features of Snip & Sketch. Let's get to know this new tool thoroughly.

A souped-up version of the Snipping Tool for taking screenshots.

When you invoke Snip & Sketch, the first thing that appears is a row of four buttons at the top. The first captures a rectangular area of the screen, the second a free-form shape and the third the entire screen. The one with the “X” closes the overlay (you can also press Esc). To capture, left-click and hold while dragging.

Once you have finished, release the button. The screenshot is saved to the clipboard and a notification appears; clicking it takes you into Snip & Sketch, which offers a series of very simple tools for annotating the capture you have just taken. Starting from the left, the first enables handwritten notes with touch input, in case you use a graphics tablet or a Surface. The second is the classic ballpoint pen, for drawing on the image. Just to its right is the pencil, which does the same but with a different texture. Next comes the highlighter. For these three tools you can change both the color and the stroke thickness: right-click, or double left-click, the icon and select what you want.

The fifth option, of course, is the eraser. You can use it by dragging over the image or clicking on the stroke you want to delete, or you can right-click its icon and choose the option to erase all ink, which removes everything you have drawn. Although the tool is focused on screenshots, you can also open any image from the hard disk for editing by clicking the folder icon. The penultimate button adds a ruler or a protractor, in case you want to make sketches with greater precision or measure some element of the image. With the ruler, scrolling the mouse wheel tilts it to the right or left, depending on the direction you turn it; with the protractor, the same gesture makes it larger or smaller.

The last one, found on the far right, crops the image: simply drag one of the corners until you get the framing you want. Unfortunately, there is no way to crop while maintaining proportions, as Photoshop allows by holding the Shift key.

Once you have finished editing the image, you can do several things. In the top bar you have options to save the image as a PNG, copy it (to paste directly into another program or web page) and share it, which by default includes OneNote and the Mail app, although it is also compatible with some apps from the Microsoft Store.

NVSlimmer: How to remove bloatware from Nvidia drivers

This simple program allows us to modify the installation package.

The latest version of the Nvidia drivers for Windows takes up more than half a gigabyte. It basically behaves like an operating system installed inside another operating system. It would be great if the company offered “lite” editions (for example, without telemetry or GeForce Experience), as well as a way to customize the entire installation process with no hidden modules.

In other words, we would love to know how to remove the bloatware from the Nvidia drivers, but until an official response appears, we can use the new NVSlimmer.

How to remove bloatware from Nvidia drivers with NVSlimmer

520 megabytes. That is what the installer of version 411.70 of the Nvidia driver for Windows 10 64-bit occupies. There are Linux distros much smaller than that, and an unpatched ISO image of the old Windows XP SP3 hovers around 600 megabytes. We are supposed to accept that graphics cards have become very complex and powerful and therefore need giant drivers. Fortunately, video game enthusiasts love to challenge official positions, and over time they have figured out how to remove the bloatware from Nvidia drivers by modifying some .cfg files and deleting folders. The manual process is effective but tedious, so the user uKER of the Guru3D forum decided to automate it a bit. The result is NVSlimmer.
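
As an illustration of what that manual process looks like, here is a rough Python sketch of the steps NVSlimmer automates. The 7-Zip path and the component folder names are assumptions taken from community guides and vary between driver releases; for anything serious, use NVSlimmer itself:

```python
# Rough sketch of the manual slimming process that NVSlimmer automates.
# Assumptions (from community guides, not official docs): 7-Zip lives at the
# path below, and the component folder names are only examples.
import shutil
import subprocess
from pathlib import Path

INSTALLER = Path("411.70-desktop-win10-64bit-international-whql.exe")
WORKDIR = Path("nv_unpacked")

# 1. The NVIDIA installer is a self-extracting 7z archive; unpack it.
subprocess.run(
    [r"C:\Program Files\7-Zip\7z.exe", "x", str(INSTALLER), f"-o{WORKDIR}", "-y"],
    check=True,
)

# 2. Delete the components you do not want (example names only).
for folder in ("GFExperience", "NodeJS", "ShadowPlay"):
    target = WORKDIR / folder
    if target.exists():
        shutil.rmtree(target)
        print(f"Removed {folder}")

# 3. The remaining step, which NVSlimmer also handles, is editing setup.cfg
#    so the installer no longer references the deleted packages.
print("Now remove the matching entries from", WORKDIR / "setup.cfg")
```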

NVSlimmer is at a relatively early stage of development, and users have not hesitated to report problems under more exotic configurations, but its basic functionality is solid. The idea is that everything left unchecked in the NVSlimmer interface is stripped from the driver package, while the modules considered “critical” come pre-selected. There are three in total: Core Display Driver, Install Core, and PhysX. HD Audio is recommended if you use audio over HDMI; the mapping of the remaining dependencies is still a work in progress, but there is already a fairly solid picture of the close relationships between some modules.

Once the purge is finished, we can proceed with a normal installation or repackage the result, which creates a self-extracting .exe. In the specific case of the 411.70 driver, keeping the three essential modules plus HD Audio reduced the installation package by 144 megabytes, but the most important thing is that the bloatware never reaches our system. As always, any operation involving modified drivers should be considered experimental: there is no support beyond asking in the Guru3D forums, and if something goes wrong, you are on your own. Finally, remember that bloatware is not exclusive to drivers; you also have to remove it from new computers and, of course, from Windows 10.


Microsoft presents the Surface Pro 6 featuring the eighth generation of Intel processors

Microsoft has unveiled its new tablet for professionals, the Surface Pro 6.

The tablet integrates eighth-generation Intel processors. Unfortunately, USB-C ports have not been incorporated. Little change was expected compared to the previous generation.

The most important novelty of this device is inside: Microsoft has jumped to eighth-generation Intel processors, which now means a quad-core chip. Microsoft says it is 67% faster than the previous model.

The exterior design remains intact, but a new color has been added: matte black, which looks very elegant on the device. Unfortunately, it has been confirmed that this sixth generation will not have USB-C ports, so that feature will have to wait until next year. Some expected a completely renewed design, but the latest rumors suggest that will arrive in 2019.

The Verge reported from the event in New York: “Microsoft today showed off its new line of Surface products, including updates to the existing laptop line and a new product: the Surface Headphones. The laptops also come in an additional new color that should please fans of all things dark mode.”

There are no changes to the screen: the resolution stays the same and the size is still 12.3 inches. As for the battery, it is rated at 13.5 hours, enough for daily tasks. Although few will notice, the interior has been redesigned to improve component cooling. The hybrid tablet can be configured with up to 16GB of RAM. The entry model with a Core i5 processor will be priced at $899 and will be available from October 16.

Surface Laptop 2

Also announced was the Surface Laptop 2, with new-generation Intel processors and more RAM in the base model. The design remains intact.

Microsoft took advantage of its October event to present the second generation of the Surface Laptop. In line with the other devices presented, the Surface Laptop 2 keeps the design of the previous model but renews its internal components to improve performance. Microsoft claims the Surface Laptop 2 is 85% faster than the previous generation.

Its most significant novelty is the move to eighth-generation Intel processors, which is what enables that 85% speed increase. Battery life is rated at around 14.5 hours of continuous video playback. Microsoft notes that the screen uses the thinnest LCD panel on any touchscreen laptop. As for colors, the existing ones are retained and matte black is added. Here, too, Microsoft has not dared to integrate USB-C ports, which will certainly disappoint many.

The base model comes with 8GB of RAM, an Intel Core i5 processor and 128GB of SSD storage. It will be priced at $899, $100 less than the previous device, and its highest configuration reaches $1,500. It will be available for sale from October 16, but it can be pre-ordered starting today.

Nvidia Quadro RTX 6000 and RTX 5000 now available for pre-order

Nvidia has opened pre-orders for its new Quadro RTX 6000 and RTX 5000 graphics cards, based on its advanced Turing architecture, on its website. Here is a review of the prices of these new cards, as well as their most important characteristics.

Nvidia has already put the new Quadro RTX graphics cards based on the Turing architecture up for pre-order. The new Quadro RTX 6000 is priced at $6,300, with a limit of 5 units per customer. The Quadro RTX 5000, meanwhile, is priced at $2,300 and was already sold out at the time of writing.

The Quadro RTX 6000 maxes out the TU102 silicon with 4,608 CUDA cores, 576 Tensor cores, 72 RT cores and 24 GB of GDDR6 memory on a 384-bit memory bus. This makes it the cheapest graphics card to use the Nvidia TU102 silicon in all its glory. We recommend reading our post on Nvidia's announcement of the Quadro RTX cards, the first capable of running ray tracing. The Quadro RTX 8000, which is priced at $10,000 but not yet available to order, pairs the same TU102 core with 48 GB of memory and higher clocks than the RTX 6000.

As for the Quadro RTX 5000, this unit maxes out the TU104 silicon with 3,072 CUDA cores, 384 Tensor cores, 48 RT cores and 16 GB of GDDR6 memory on the chip's 256-bit interface.

Recall that the Quadro series comes with a set of business-oriented features and certifications for the main content creation applications that are not available in the GeForce series. These cards are therefore better suited to professional use, although GeForce cards can also serve. The Quadro series also uses higher-quality components to ensure greater endurance under 24/7 use.

What TV to buy (2018): top recommended models from $600 to $3,500

The resolution of the present and near future is 4K UHD

If you are going to buy a TV over 40 inches, do not think twice: bet on a 4K UHD panel over any Full HD one. In fact, above 40 inches this is already the predominant resolution, and it is possible to buy 4K televisions from top brands for just over $600. Although we still do not have all the bells and whistles from brands such as LG, which has already presented 8K models, the 4K catalog continues to grow at a good pace: streaming platforms such as Netflix and Amazon Prime Video keep expanding their lineup of 4K series, films and documentaries, and to that we must add 4K Blu-ray movies and the growing number of major releases for PlayStation and Xbox.

The 4K content catalog continues to grow: video streaming services, 4K Blu-ray discs and great video game releases

In addition, thanks to the upscaling in 4K UHD TVs it is possible to enjoy 1080p content at very good quality, so the abundance of Full HD content is no reason to settle for that resolution.

HDR yes, but of what kind?

Hand in hand with 4K UHD comes HDR (High Dynamic Range), a technology we have already covered and which you can read more about on sites such as Engadget. HDR allows, when the content supports it, a wider dynamic range to be reproduced, recovering more information in dark and bright areas. The result? Images that are more natural and closer to what our eyes see in reality.

Several HDR standards are currently competing in the market, but there are a few we should keep especially in mind when buying a TV in 2018:

  • Dolby Vision reproduces color with a depth of up to 12 bits and brightness of up to 10,000 nits, and uses dynamic metadata that adjusts the dynamic range frame by frame in real time. It is the most ambitious standard, and also the most expensive, since it belongs to Dolby.
  • HDR10 is an open standard that any manufacturer can use without paying a license. It reproduces color with 10-bit depth and a maximum brightness of 1,000 nits. Since it lacks dynamic metadata, the dynamic range is constant throughout playback. Some manufacturers use their own names: Samsung, for example, uses HDR 1000 for an equivalent standard and HDR 2000 for a variant with a higher brightness level.
  • HDR10+ addresses HDR10's lack of dynamic metadata: it is an evolution that, while remaining royalty-free, incorporates them. Visually, well-implemented HDR is even more impressive than the jump from Full HD to 4K, which is why all our recommendations include it.

OLED or LED?

Two panel technologies coexist in the market: LCD with LED backlighting (and derivatives such as Samsung's QLED) and OLED. Each is suited to different audiences and uses, so it comes down to your needs. OLED (Organic Light-Emitting Diode) is the newer technology and, unlike LCD, can switch each pixel on and off individually. As a result, blacks are purer and more realistic, contrast is better and colors stand out more. Another advantage is that it delivers a perfect image from any viewing angle. Its biggest handicap is image retention and burn-in, an effect of use and of the deterioration of the organic diodes that make up the panel.

Although OLED sets integrate sophisticated processing to minimize it, they are discouraged for use as a computer monitor or when the TV stays on many hours a day (roughly more than 4-6 hours). OLED reigns on large diagonals (55 inches and up); below those sizes there is no debate. What if you want a big TV that will be on for many hours? Then your choice should be LCD, and the key will be FALD, or Full Array Local Dimming. Choosing between LED and OLED is a question of needs, but if you go LED, FALD backlighting is fundamental.

As we explained above, OLED switches pixels on and off individually. LCD panels instead rely on backlighting systems behind the panel, of which two are currently predominant: edge LED backlighting and FALD. While edge backlighting uses light guides and diffusers along the periphery of the panel to distribute light evenly, FALD places a grid of LEDs directly behind the panel.

This arrangement is more expensive and complex, but it allows more precise control of the lighting, improving contrast and native color. If the LED's narrower viewing angle compared with OLED worries us, Samsung's QLED models are an alternative, although their price is much higher. Beyond panel technology, another element with a big impact on picture quality is the image processor and the algorithms it uses. In this article you can see how processors affect mid-range and high-end TVs from some manufacturers.

Not all HDMI ports are the same

Although we take it for granted that the television we buy will have HDMI connections, not all of them are equal. In fact, even on the most premium models, of all the available jacks often only one is HDMI 2.0 or later, the standard that can carry 2160p signals at 60 Hz. The rest will be HDMI 1.4 sockets, limited to 2160p at 30 Hz or less.

Currently the most modern standard on the market is HDMI 2.0b, which ensures support for the new HDR technologies and 4K content at 60 fps with 18 Gbps of bandwidth, and can handle up to 32 audio channels. However, HDMI 2.1 has already been introduced. If we do not want the TV to become obsolete in this respect, it is advisable to buy a model with as many HDMI 2.x connections as possible.
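
As a back-of-the-envelope check on that 18 Gbps figure, here is a short Python calculation using the standard CTA-861 timing for 2160p60 (a 4400 × 2250 total raster, blanking included); the 10/8 factor is the overhead of HDMI's TMDS (8b/10b) encoding:

```python
# Why 4K @ 60 Hz needs HDMI 2.0's 18 Gbps: standard CTA-861 timing for 2160p60.
h_total, v_total, refresh = 4400, 2250, 60     # raster size including blanking
pixel_clock = h_total * v_total * refresh      # 594 MHz
bits_per_pixel = 24                            # 8-bit RGB / YCbCr 4:4:4
tmds_overhead = 10 / 8                         # HDMI's 8b/10b encoding

required = pixel_clock * bits_per_pixel * tmds_overhead
print(f"Required: {required / 1e9:.2f} Gbps")  # ~17.82 Gbps, just under 18 Gbps
```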

Smart TV

There is much to consider in the Smart TV arena. A current Smart TV can be controlled by voice and lets you play, navigate menus, download apps and streaming services, tweak settings... and the category keeps growing. The market is dominated by Tizen on Samsung TVs, webOS on LG and Android TV on other manufacturers such as Sony. However, given how young the sector is and that the average life of a TV exceeds 5-10 years, these platforms are likely to become obsolete before we buy another TV; that need not be a problem thanks to an external receiver or media player we can buy when the time comes. So the fundamental thing here is to make sure the set has access to the most important streaming services, such as Netflix, Prime Video, HBO and YouTube, and that its menus are fluid. Since mid-range and high-end TVs carry more powerful internal hardware, it is normal for them to perform better.

Less than $600

Both for users who do not want to invest too much in a TV and for those who do not need the best of the best, there are alternatives on the market with great value for money. On this budget we can find very interesting options of around 40 inches with an LED LCD panel and 4K UHD resolution. Here are some models to look for:

  • LG 43UK6470
  • Philips 50PUS6162
  • Toshiba 49U6763DG

Up to $1,200

In this price range we find a great many LED LCD televisions with a wide variety of features. We have therefore picked five differentiated models of different sizes, including a curved screen, an Ambilight system and original designs, without neglecting picture quality.

  • Sony KD-43XE7096
  • Samsung UE49NU7305
  • LG 55UK6470
  • Philips 6700 Series 65PUS6703
  • Panasonic TX-55FX740E

Up to $1,700

From this price upwards we find large panels with very good image quality thanks to picture-enhancement technologies. It is also possible to find QLED TVs, especially in mid-range models and contained sizes.

  • LG 65SK7900PLA
  • Samsung QE55Q6FN
  • Sony KD-55XF9005

Up to $3,500

In this price range we find large screens, and there is also room for the best panel technologies.

  • Sony KD55A1BAEP
  • LG 65SK8500PLA
  • Samsung QE65Q9FN


NVIDIA NVLink vs. SLI: differences and performance test

The appearance of the NVIDIA NVLink connector, replacing the traditional SLI connector on the new NVIDIA GeForce RTX 2000 graphics cards, has made more than one enthusiast wonder why the change was made and whether one technology is better than the other for gaming. In this article we will explain both and see, with data, which of the two is better.

Before getting into the matter, it is worth explaining the differences between NVIDIA NVLink and SLI. SLI (Scalable Link Interface) is a technology NVIDIA brought to market in 2004, based on the Scan-Line Interleave technology developed by 3dfx for its Voodoo 2 graphics cards. Ideally, it distributes the workload across several identical graphics cards, multiplying the system's overall compute capacity.

In an SLI configuration, one graphics card acts as the master and directs the rest, which act as slaves. An important drawback of this technology is the limited bandwidth available for the cards to share information, and, above all, the fact that it is a unidirectional bus. This introduces latencies into the system, so that as the number of cards on the bus increases, the gain in performance scales ever more poorly.

Comparison of performance between NVLink and SLI

NVIDIA GeForce GTX 1080 Ti, Quadro GP100 and Quadro GV100 graphics cards were used to compare the performance of NVLink and SLI. The latter two were tested in both SLI mode and full NVLink mode, since these professional graphics cards support both.

As we can see, between graphics cards of similar architecture, such as the GeForce GTX 1080 Ti and the Quadro GP100, there is hardly any difference in gaming performance between SLI and NVLink, except in Far Cry 5, which shows a quite spectacular performance increase. The general trend, however, is practically no performance difference between the two technologies. This confirms NVIDIA's claim that the gap between SLI and NVLink in multi-GPU gaming configurations is minimal, as the benchmark results demonstrate.

HDR on the RTX 2080 and 2080 Ti: do they lose performance as Pascal does?

A little over a week ago we learned that the Pascal architecture was having performance problems in some games with HDR support. So far we have little data about Turing in this regard, so is it possible that the RTX 2080 and 2080 Ti also lose performance when HDR is activated?

NVIDIA has not yet commented on the cause of Pascal's problems in certain games; in others there is no issue, and SDR and HDR performance are practically the same. Everything points to a driver problem, specifically in the HDR YCbCr 4:2:2 configuration (but not in YCbCr 4:4:4 or RGB modes), where Pascal cards “mysteriously” lose performance when performing the chroma subsampling.

It seems that the tone mapping required for HDR is done in software on the 1000 series, while on Turing that mapping is done in hardware, which is where the driver problem appears to originate. That, at least, is the theory. Recently, Tom's Hardware ran a series of tests to check whether the loss was exclusive to Pascal or whether Turing was also affected. The only caveat is the driver versions used: the RTX cards were tested with 411.51 and the rest of the NVIDIA cards with 398.82. With that noted, let's look at the data:

Forza Motorsport 7 is one of the games where the losses are most obvious; every GPU shows a greater or lesser drop. The GTX 1080 Ti loses 13.7%, the RTX 2080 loses 2.5% and the RTX 2080 Ti loses just 1.2%. The newer and more powerful the card, it seems, the smaller the loss.

Far Cry 5 is the most disconcerting game in the comparison. The GTX 1080 Ti maintains its frame rate and even gains 0.2 FPS, while the Turing cards lose a little performance: -2.7% and -3.7% respectively.
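
For reference, the percentages quoted in these comparisons are simple relative changes between the SDR and HDR frame rates. The sketch below shows the arithmetic with made-up FPS values, not Tom's Hardware's raw data:

```python
# Percentage change from SDR to HDR frame rate; the FPS inputs are placeholders.
def hdr_change(fps_sdr, fps_hdr):
    return (fps_hdr - fps_sdr) / fps_sdr * 100

print(f"{hdr_change(100.0, 86.3):+.1f}%")   # -13.7% (a GTX 1080 Ti-style loss)
print(f"{hdr_change(60.0, 60.2):+.1f}%")    # +0.3% (a marginal gain)
```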

There is no solution in sight

The theories about the problem have already been laid out, and unfortunately they remain just that: theories. We have searched for data on the subject, and the results are as scattered as they are telling: in the same game, one user or site reports a loss while another reports a gain, and vice versa. We cannot pin the problem down; in fact, nobody seems able to, since it has been discussed at length without the source of the trouble being identified. NVIDIA is aware of all this and remains silent; we do not know whether it is working on a fix or simply knows it cannot fix the issue for one reason or another. Meanwhile, it has encouraged owners of Pascal cards to update their GPUs' firmware for HDR + 4K + 144 Hz support, since the DisplayPort 1.3 implementation was causing problems and the update moves it to DisplayPort 1.4. Too many unknowns and too few answers; we cannot say more.

Ray Tracing: everything you need to know about this new revolution in videogames

The latest talk of the town these days seems to be Ray Tracing, no doubt because of the release of NVIDIA's new RTX series video cards. This technology will undoubtedly change the world of video games, but many will be asking what it is and what it is going to mean for the industry.

Ray Tracing (RT from now on) is a technique based on Ray Casting, an algorithm created by Arthur Appel in 1968 to determine which surfaces are visible.

Thanks to RT, 3D graphics can be rendered with complex lighting models that simulate the physical behavior of light. Until now this could not be done in real time, so lighting had to be computed beforehand and baked in before rendering. Conventional 3D rendering has so far used a process called rasterization, which takes objects built from a mesh of triangles or polygons representing a 3D model. The rendering pipeline then converts each triangle of the 3D models into pixels on a 2D screen, which are then processed or “shaded” before the final display.
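
To make the contrast concrete, here is a tiny, self-contained Python sketch of the classic ray-casting idea the article mentions: shoot one ray per pixel, intersect it analytically with a sphere, and shade by the angle to a light. It is only a toy illustration of the principle, far removed from what RTX hardware actually does:

```python
# Toy ray caster: one ray per pixel, one sphere, Lambert shading, ASCII output.
import math

W, H = 40, 20                                  # character "screen" resolution
CENTER, RADIUS = (0.0, 0.0, 3.0), 1.0          # one sphere in front of the camera
LIGHT = (0.577, 0.577, -0.577)                 # unit vector toward the light

def trace(d):
    """Nearest positive hit distance of a ray from the origin along d, or None."""
    oc = tuple(-c for c in CENTER)             # origin minus sphere center
    b = 2 * sum(oc[k] * d[k] for k in range(3))
    c = sum(v * v for v in oc) - RADIUS * RADIUS
    disc = b * b - 4 * c                       # a == 1 since d is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

for j in range(H):
    row = ""
    for i in range(W):
        # Pinhole camera: map the pixel to a normalized ray direction.
        x, y = (i - W / 2) / (H / 2), (H / 2 - j) / (H / 2)
        n = math.sqrt(x * x + y * y + 1)
        d = (x / n, y / n, 1 / n)
        t = trace(d)
        if t is None:
            row += " "                         # ray missed: background
        else:
            p = tuple(d[k] * t for k in range(3))                   # hit point
            nrm = tuple((p[k] - CENTER[k]) / RADIUS for k in range(3))
            lam = max(0.0, sum(nrm[k] * LIGHT[k] for k in range(3)))
            row += " .:-=+*#%@"[min(9, int(lam * 10))]              # Lambert shade
    print(row)
```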

NVIDIA OptiX

Ten years ago, OptiX introduced a programmable shader model for ray tracing (OptiX GPU Ray Tracing). NVIDIA has continued to invest in hardware, software and algorithms to accelerate that programming model on its GPUs, and now it has presented the finished result alongside the RTX series. The OptiX API is an application framework that leverages RTX technology to achieve optimal ray tracing performance on the GPU. It provides a simple, recursive and flexible pipeline for accelerating ray tracing algorithms, and its post-processing API includes an AI-accelerated denoiser that also leverages RTX technology. From movies and games to scientific design and visualization, OptiX has been successfully deployed in a wide range of commercial applications, from visualization software to scientific visualization (including Gordon Bell Prize finalists), defense applications, audio synthesis and baked light maps for games.

Microsoft DirectX Ray Tracing or DXR

Microsoft's DirectX Raytracing (DXR) API extends DirectX 12 to support ray tracing. DXR integrates ray tracing fully into DirectX, allowing developers to combine it with traditional rasterization and compute techniques, by adding four new concepts to the DX12 API: the acceleration structure, a new DispatchRays command, a set of new ray tracing shader types and the raytracing pipeline state object.

Vulkan

Vulkan is a cross-platform API for developing applications with 3D graphics, first announced at GDC 2015 by the Khronos Group. Initially, Khronos presented it as “the next-generation OpenGL initiative”, but that name was later dropped, leaving Vulkan as the definitive one. Vulkan is based on Mantle, an AMD API whose code was donated to Khronos with the intention of creating an open, low-level standard similar to OpenGL. Unlike Microsoft's API, Vulkan works on a wide range of platforms, including Windows 7, Windows 8, Windows 10, Android and Linux. NVIDIA is developing a ray tracing extension for Vulkan's cross-platform compute and graphics API; according to NVIDIA it will be available soon and will give Vulkan developers access to the full power of the RTX graphics cards. NVIDIA is also contributing the design of this extension to the Khronos Group as a proposal to bring a vendor-neutral ray tracing capability to the Vulkan standard.

So what games will support Ray Tracing?

That remains to be seen; at the moment there is not even a benchmark with full ray tracing support, and the closest thing to a complete list of future titles is what NVIDIA has shown:

This does not mean that all 21 of those games support Ray Tracing; some of them instead use NVIDIA's RTX platform for its artificial intelligence features. Of them all, the following have confirmed ray tracing support: Assetto Corsa Competizione, Atomic Heart, Battlefield V, Control, Enlisted, Justice, JX3, MechWarrior 5: Mercenaries, Metro Exodus, Shadow of the Tomb Raider and Project DH. The only problem with this technology is its high resource consumption: it seems that dedicated units are needed to accelerate it, which is precisely how NVIDIA has designed its Turing architecture. Given that, many are skeptical about Vulkan and DXR on other hardware, since NVIDIA has shown empirically that without such units (RT Cores) performance drops far too much. To see the improvements this technology brings, and to round off this article, what better than seeing it in action in three of the main AAA titles that are about to launch or have already reached the market:

Metro Exodus

Shadow of the Tomb Raider

Battlefield V

Get ready to pay more for NVIDIA’s G-Sync HDR module in new 4K 144 Hz monitors

4K gaming is on the rise, but some hardcore gamers are still holding out for a 4K 144 Hz monitor. The new NVIDIA G-Sync HDR monitors have now made this possible: their native refresh rate is 120 Hz, and the full 144 Hz is reached through a manual overclock. Inside these monitors sits the chip that makes NVIDIA's features possible.

Inside the monitor we find an FPGA made by Altera, a company owned by Intel. It is a high-performance chip manufactured on a 20 nm process that provides the bandwidth needed to process the data from the graphics card. This chip is not exactly cheap, and it may be one of the main reasons for the excessive price of these monitors, which cost roughly 2,600 euros, or around $2,000. Even considering that NVIDIA buys them by the thousands, a single chip may cost on the order of $500. The module also carries 3 GB of DDR4 memory to supply the performance the FPGA needs.

It is also reported that NVIDIA is working on 65-inch BFGD (Big Format Gaming Display) screens that will use the same modules to deliver true HDR playback for gaming and movies.