Budget Turing for $149, Available Today




A little more than eight months after it began, the rollout of NVIDIA's GeForce Turing product stack is finally nearing completion. This morning the company is launching its cheapest GeForce Turing video card yet, the GeForce GTX 1650. At $149, the newest member of the GeForce family is set to fill out the back end of the GeForce product stack, offering NVIDIA's latest architecture in a low-power, 1080p-with-compromises gaming card at an affordable price.

In very traditional NVIDIA fashion, the Turing launch has been a top-down affair. After launching the four RTX 20 series cards early in the cycle, NVIDIA's efforts over the last couple of months have focused on filling out the rest of the product stack. A variant of NVIDIA's Turing GPU design, the TU11x series (what I've dubbed Turing Minor) is intended to be smaller and cheaper to produce, yielding chips that retain the fundamental Turing architecture while dropping the ray tracing (RT) cores and the AI-focused tensor cores. The end result of this bifurcation is the GeForce GTX 16 series, which is designed to be a leaner, meaner set of Turing GeForce cards.

To date, the GTX 16 series has consisted solely of the GTX 1660 family of cards: the GTX 1660 (vanilla) and the GTX 1660 Ti, both based on the TU116 GPU. Today, however, the GTX 16 family is expanding with the introduction of the GTX 1650 and the new Turing GPU powering NVIDIA's smallest card yet: the TU117.


Unofficial block diagram of TU117

While the GeForce GTX 1660 Ti and its underlying TU116 GPU gave us our first look at NVIDIA's mainstream Turing parts, the GeForce GTX 1650 is a much more pedestrian affair. The underlying TU117 GPU is, for all intents and purposes, a smaller version of the TU116, retaining the same basic Turing feature set but with fewer resources. In total, coming down from the TU116, NVIDIA has shaved off a third of the CUDA cores, a third of the memory channels, and a third of the ROPs, leaving a GPU that is smaller and easier to manufacture for this low-margin market. Still, at 200 mm² and 4.7 billion transistors, the TU117 is by no means a simple chip. In fact, it is exactly the same die size as the GP106, the GPU at the heart of the GeForce GTX 1060 series, which should give you an idea of how performance and transistor counts have (slowly) cascaded down to cheaper products over the years.

In any case, the TU117 will find its way into many NVIDIA products over time. But for now, it all starts with the GeForce GTX 1650.

NVIDIA GeForce Specification Comparison

                         GTX 1650          GTX 1660          GTX 1050 Ti       GTX 1050
CUDA Cores               896               1408              768               640
ROPs                     32                48                32                32
Base Clock               1485 MHz          1530 MHz          1290 MHz          1354 MHz
Boost Clock              1665 MHz          1785 MHz          1392 MHz          1455 MHz
Memory Clock             8 Gbps GDDR5      8 Gbps GDDR5      7 Gbps GDDR5      7 Gbps GDDR5
Memory Bus Width         128-bit           192-bit           128-bit           128-bit
VRAM                     4 GB              6 GB              4 GB              2 GB
Single Precision Perf.   3 TFLOPS          5 TFLOPS          2.1 TFLOPS        1.9 TFLOPS
TDP                      75W               120W              75W               75W
GPU                      TU117 (200 mm²)   TU116 (284 mm²)   GP107 (132 mm²)   GP107 (132 mm²)
Transistor Count         4.7B              6.6B              3.3B              3.3B
Architecture             Turing            Turing            Pascal            Pascal
Manufacturing Process    TSMC 12nm "FFN"   TSMC 12nm "FFN"   Samsung 14nm      Samsung 14nm
Release Date             04/23/2019        03/14/2019        10/25/2016        10/25/2016
Launch Price             $149              $219              $139              $109

Right off the bat, it's interesting to note that the GTX 1650 is not using a fully enabled TU117 GPU. Compared to the full chip, the version going into the GTX 1650 has one TPC fused off, which means the chip loses 2 SMs / 64 CUDA cores. The end result is that the GTX 1650 is a very rare case where NVIDIA doesn't lead with a fully enabled chip; the company is essentially sandbagging, a point I'll come back to in a moment.
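The TPC arithmetic above is easy to check. A minimal sketch (the 64-cores-per-SM and 2-SMs-per-TPC figures are Turing's standard organization, not stated in this article) derives the fully enabled TU117 core count from the GTX 1650's shipping configuration:

```python
# Turing organizes CUDA cores as 64 per SM, with 2 SMs per TPC.
CORES_PER_SM = 64
SMS_PER_TPC = 2

gtx_1650_cores = 896   # shipping configuration
disabled_tpcs = 1      # one TPC fused off per the text

# Adding the fused-off TPC back gives the full chip's core count.
full_tu117_cores = gtx_1650_cores + disabled_tpcs * SMS_PER_TPC * CORES_PER_SM
print(full_tu117_cores)  # 1024
```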

Within NVIDIA's historical product stack, the GTX 1650 is a bit hard to place. Officially, it's the successor to the GTX 1050, which was itself a similar card. However, the GTX 1050 launched at $109, while the GTX 1650 launches at $149, a significant 37% generation-over-generation price increase. So you could be excused for thinking the GTX 1650 looks much more like the successor to the GTX 1050 Ti, since its $149 price tag is closely comparable to the GTX 1050 Ti's $139 introductory price. Whatever the case, Turing cards have cost more generation-over-generation than the Pascal cards they replaced, and the low prices of these budget cards really amplify that difference.

Diving into the numbers, the GTX 1650 ships with 896 CUDA cores enabled, spread over 2 GPCs. On paper that isn't a huge step up from the GeForce GTX 1050 series, but Turing's architectural changes and the resulting increase in graphics efficiency mean the little card should pack a bit more punch than the raw figures suggest. The CUDA cores themselves are clocked a bit lower than usual for a Turing card, however, with the reference-clocked GTX 1650 boosting to just 1665 MHz.

Rounding out the package are 32 ROPs, which are part of the card's 4 ROP/L2/memory clusters. This means the card is fed by a 128-bit memory bus, which NVIDIA has paired with GDDR5 memory clocked at 8 Gbps. Conveniently, this gives the card 128 GB/s of memory bandwidth, about 14% more than the last-generation GTX 1050 series cards. Thankfully, while NVIDIA hasn't done much to boost memory capacity on its other Turing cards, the same isn't true of the GTX 1650: it starts at 4 GB, instead of the very constraining 2 GB of the GTX 1050. 4 GB isn't particularly spacious in 2019, but the card shouldn't be nearly as memory-starved as its predecessor.
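The bandwidth figure follows directly from the bus width and data rate. A quick sketch of the arithmetic, using only the numbers from the spec table:

```python
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

gtx_1650 = mem_bandwidth_gbps(128, 8.0)  # 128.0 GB/s
gtx_1050 = mem_bandwidth_gbps(128, 7.0)  # 112.0 GB/s
uplift = gtx_1650 / gtx_1050 - 1         # ~0.14, i.e. the ~14% cited above
print(gtx_1650, gtx_1050, f"{uplift:.0%}")
```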

Overall then, on paper the GTX 1650 should deliver around 60% of the performance of the next card up in NVIDIA's product stack, the GTX 1660. In practice I expect the two cards to land a bit closer than that, since GPU performance rarely scales 1-to-1 with the specifications, but that's the rough ballpark we're looking at until we can actually test the card.
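The ~60% paper figure comes straight out of the FP32 throughput numbers in the table. A minimal sketch, assuming the standard 2 FLOPs per core per clock (one fused multiply-add):

```python
def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: cores x 2 FLOPs (FMA) per clock x boost clock."""
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

gtx_1650 = fp32_tflops(896, 1665)    # ~2.98 TFLOPS
gtx_1660 = fp32_tflops(1408, 1785)   # ~5.03 TFLOPS
print(f"{gtx_1650:.2f} vs {gtx_1660:.2f} TFLOPS, ratio {gtx_1650 / gtx_1660:.0%}")
```

Actual gaming performance rarely tracks peak TFLOPS exactly, which is why the article expects the real-world gap to be somewhat narrower.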

Meanwhile, on the power consumption front, the smallest member of the GeForce Turing stack is also the lowest-power. NVIDIA has kept its GTX xx50 cards at 75W (or less) for a few generations now, and the GTX 1650 continues that trend. This means that, at least for cards running at NVIDIA's reference clocks, an external PCIe power connector isn't needed and the card can be powered entirely by the PCIe bus. This fills the need for a card that can be dropped into basic systems where a PCIe power cable isn't available, or into low-power systems where a more power-hungry card isn't appropriate. It also means that, while discrete video cards aren't as popular for HTPCs as they once were, for HTPC builders who want to go that route, the GTX 1650 will be replacing the GTX 1050 series in that market as well.

Availability, Product Positioning & the Competition

Moving on to matters of business, let's talk about product positioning and hardware availability.

The GeForce GTX 1650 is a hard launch for NVIDIA, meaning cards are shipping from retailers and in OEM systems starting today. As is typical for NVIDIA's low-end cards, there are no reference cards or reference designs, so NVIDIA's board partners will be doing their own thing with their respective product lines. This will include factory-overclocked cards that offer more performance, but also require an external PCIe power connector to meet the cards' greater power needs.

Although it is a hard launch, in a very unorthodox (if not downright devious) move, NVIDIA has opted not to allow the press to test GTX 1650 cards ahead of the launch. Specifically, NVIDIA withheld the driver needed to test the card, meaning that even if we could have secured a card in advance, we would not have been able to run it. We have cards on the way and will be conducting a review in due time, but for the moment we have no more hands-on experience with GTX 1650 cards than you, our readers.

NVIDIA has always treated low-end card launches as less of an event than its high-end products, and the GTX 1650 is no different. In fact, this generation's launch is particularly low-key: there is no press briefing or even a press deck, with NVIDIA instead opting to inform us of the card by email. And while grand fanfare isn't necessary at this point (it's a Turing card, and the Turing architecture and feature set have been amply covered by now), it's rare for a card based on a new GPU to launch without reviewers getting a head start on it. And for good reason: reviewers provide neutral, independent analysis of a card and its performance. So cards dodging their reviews is not usually in buyers' best interest, and when it happens it can raise red flags, but it happens nonetheless.

At any rate, while I'd suggest buyers hold off for about a week until reviews are ready, it's true that Turing is a known quantity at this point. As mentioned previously, the paper specifications put the GTX 1650 at around 60% of the GTX 1660's performance, and real-world performance will likely be somewhat higher. NVIDIA is primarily pitching the card as an upgrade for the GeForce GTX 950 and its same-generation AMD counterparts, mirroring the upgrade cadence for the rest of the GeForce Turing family. NVIDIA claims performance should be 2x (or more) that of the GTX 950, which should be easily achieved.

While we wait to get our hands on a card for performance testing, the GTX xx50 series has traditionally been a line of 1080p-with-compromises cards, and based on what we've seen with the GTX 1660, I expect the same to hold for the GTX 1650. The GTX 1650 should be able to run some games at 1080p with maximum image quality (think DOTA 2 and the like), but in more demanding games I expect it will need some settings dialed back to stay at 1080p with playable framerates. One advantage it does have is that with its 4 GB of VRAM, it shouldn't struggle on newer games the way the 2 GB GTX 950 and GTX 1050 do.

Finally, on the competition front, AMD is of course fielding its Polaris-based Radeon RX 500 series, so that is what the GTX 1650 will be up against. AMD is pushing hard to position the Radeon RX 570 8GB against the GTX 1650, which makes for a very interesting battle. Based on what we saw with the GTX 1660, the RX 570 should fare quite well against the GTX 1650, with its 8 GB of VRAM as the icing on the cake. However, I'm not convinced that AMD and its partners can necessarily hold the price of an 8 GB card at $149 or less, in which case the competition may well end up being the 4 GB RX 570 instead.

In the end, AMD's position will be this: while they can't compete with the GTX 1650 on features or energy efficiency (keep in mind the RX 570 is rated to draw nearly twice the power), they can compete on performance. As long as AMD is willing to stay the course on pricing, this should make the RX 570 a good alternative on a pure price/performance basis for current-generation games. Though to see just how well that holds up, we'll of course have to review the GTX 1650, so stay tuned for that.
