Anyone who has built a desktop or shopped for a graphics card recently has faced two major choices: Nvidia and AMD.

(Photo: Reuters)

Nvidia recently announced that its newest and arguably most powerful graphics card, the GTX Titan X, will hit stores on Mar. 18. The new card will cost $999 in the United States and £879 in the United Kingdom.

The Titan X will be Nvidia's first graphics card to carry 12GB of GDDR5 video RAM. With that much memory and raw horsepower, the Titan X has been hailed by many industry analysts as the current king of gaming GPUs. It also packs more shaders than the GeForce GTX 980, which is built on a smaller Maxwell chip.


According to Nvidia, the Titan X has full DirectX 12 support and is built to handle virtual reality. In fact, the Titan X powered most of the virtual-reality demonstrations at the Game Developers Conference.

According to PC World, the Titan X uses an "uncut" version of Nvidia's GM200 chip. This gives the Titan X 3,072 CUDA cores, significantly more than the 2,048 on the GTX 980. The report added that cutting or disabling portions of a chip is a common strategy manufacturers use to increase yields and to hit different price points.

Alongside the 3,072 CUDA cores are 96 render output units, 24 streaming multiprocessor (SMM) units and 192 texture mapping units. These specifications come from the GM200's six graphics processing clusters, with each cluster containing four streaming multiprocessors and each SMM block holding 128 CUDA cores.
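The arithmetic behind those figures is straightforward; a quick sketch using only the numbers quoted above:

```python
# How the Titan X's core count follows from the GM200 layout
# described in the article (6 clusters x 4 SMMs x 128 cores).
GPCS = 6             # graphics processing clusters on the full GM200
SMMS_PER_GPC = 4     # streaming multiprocessors (SMMs) per cluster
CORES_PER_SMM = 128  # CUDA cores in each SMM block

total_smms = GPCS * SMMS_PER_GPC
total_cores = total_smms * CORES_PER_SMM

print(total_smms)    # 24
print(total_cores)   # 3072
```

Multiplying the six clusters by four SMMs each gives the 24 SMM units listed above, and those 24 blocks of 128 cores yield exactly 3,072 CUDA cores.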

To feed this power-hungry hardware, the Titan X has a 250-watt thermal design power and requires both a 6-pin and an 8-pin connector from the power supply. To avoid wattage problems, Nvidia recommends a 600-watt power supply, according to GameSpot.
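To see why 600 watts is a comfortable recommendation, here is an illustrative power-budget check. Only the 250 W TDP and the 600 W supply come from the article; the CPU and other-component figures are hypothetical placeholders for a typical build:

```python
# Rough system power budget (illustrative; only the GPU TDP and
# PSU rating come from the article, the rest are assumed values).
GPU_TDP = 250     # Titan X thermal design power, watts
CPU_TDP = 90      # hypothetical mid-range CPU, watts
OTHER = 60        # hypothetical drives, fans, motherboard, watts
PSU_RATING = 600  # Nvidia's recommended power supply, watts

total_draw = GPU_TDP + CPU_TDP + OTHER
headroom = PSU_RATING - total_draw

print(total_draw)  # 400
print(headroom)    # 200
```

Under these assumptions the system draws around 400 W at full load, leaving the recommended 600 W supply roughly 200 W of headroom.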