This week, NVIDIA introduced the Turing-based GeForce GTX 1650 graphics card at $149. On launch day I picked up the ASUS GeForce GTX 1650 4GB Dual Fan Edition (Dual-GTX1650-O4G) for Linux testing, and I now have the first Ubuntu Linux GTX 1650 benchmarks compared against an assortment of older AMD Radeon and NVIDIA GeForce graphics cards.
For $149+ USD, the GeForce GTX 1650 features 896 CUDA cores, a 1485 MHz base clock with a 1665 MHz boost clock, 4 GB of GDDR5 video memory, and Volta-based NVENC video encoding (not the newer Turing NVENC, but still quite good compared to previous generations of NVIDIA GPUs). It has just a 75-watt TDP, which means no external PCI Express power connector is required.
In the case of the ASUS Dual-GTX1650-O4G, I was able to acquire it on launch day for $160, although other models did hit the $149 price point. This particular ASUS SKU uses the same 1485 MHz base clock, but its GPU boost clock can reach 1725 MHz compared to the 1665 MHz reference boost clock. ASUS also offers an OC mode on Windows for reaching 1755 MHz. No manual overclocking was attempted with this graphics card: plenty of other websites cover GPU overclocking, while the focus here is on Linux support and performance.
The ASUS GeForce GTX 1650 Dual Fan Edition provides DVI-D, HDMI 2.0b, and DisplayPort 1.4 outputs. The GTX 1650 can drive up to three displays simultaneously. This dual-fan ASUS card is a standard two-slot form factor and measures 20.4 x 11.5 x 3.7 cm.
This GTX 1650 graphics card worked perfectly under Linux with the new NVIDIA 430.09 beta Linux driver. The first round of tests was carried out on Ubuntu 19.04 x86_64 with the Linux 5.0 kernel. So far, no problems have been encountered while benchmarking various Linux OpenGL and Vulkan games (including Steam Play / DXVK titles) and some OpenCL/CUDA compute workloads.
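Before running a benchmark session like this, it is worth confirming that the proprietary driver actually loaded against the running kernel. A minimal sanity-check sketch (the /proc path is the standard location exposed by NVIDIA's proprietary driver; the version shown in the comment is an example, not guaranteed):

```shell
#!/bin/sh
# Report the running kernel, then check whether the NVIDIA driver is loaded.
echo "Kernel: $(uname -r)"
if [ -r /proc/driver/nvidia/version ]; then
    # First line reports the loaded driver build, e.g. "... Kernel Module  430.09 ..."
    head -n 1 /proc/driver/nvidia/version
else
    echo "NVIDIA driver not loaded"
fi
```

On a system with the driver installed, `nvidia-smi` can likewise confirm the driver version and detected GPU before starting the test runs.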