Go Back   TechArena Community > Hardware > Monitor & Video Cards
Details about Nvidia GeForce GTX 680

Monitor & Video Cards


  #1  
Old 27-04-2012
Member
 
Join Date: Feb 2012
Posts: 68
Details about Nvidia GeForce GTX 680
  

This card has been out for a while now, and Nvidia's own article "Introducing the GeForce GTX 680" claims it is the world's fastest and most power-efficient GPU. I am planning to build my own system and I am thinking of using this card, but I don't want to base my decision only on what the manufacturer says on its own site. Does anyone here have more information about this card, so I can decide whether or not to use it in the system I am going to build? Some benchmarks or a feature overview would be nice. I'd really appreciate it. Thanks in advance.

  #2  
Old 27-04-2012
Member
 
Join Date: May 2011
Posts: 222
Re: Details about Nvidia GeForce GTX 680

The Nvidia GeForce GTX 680 is based on the GK104 GPU from the Kepler generation. Like AMD's latest-generation GPUs, this chip is produced on a 28 nm process. If you have closely followed the rumors about Nvidia's upcoming graphics cards over recent months, you will know that GK104 was actually intended for the new mid-range video cards. A chip codenamed GK110 was on the schedule as the new high-end chip and the direct successor to the GeForce GTX 580's GF110.


What exactly is going on with GK110 is unclear, but one thing is certain: Nvidia could not guarantee getting working chips produced, so GK110 has been postponed until further notice. Perhaps we will eventually encounter a variant of it as the GeForce GTX 780. The fact is that the mid-range GK104, originally intended for something like a GeForce GTX 660 Ti, turned out to perform much better than expected. So well, in fact, that even this less complex Nvidia GPU can match the performance of AMD's Radeon HD 7970. The GK104 is a 28 nm chip with 3.54 billion transistors. By comparison, the 40 nm GF110 GPU of the GeForce GTX 580 had around 3 billion, while Tahiti, AMD's Radeon HD 7970 GPU, has more: 4.3 billion.


Although Nvidia has not officially disclosed the die size of the GK104, we can conclude from the transistor count that the chip is smaller, and therefore cheaper to produce, than AMD's. The GK104 has a 256-bit memory bus: a step back compared to the GF110, but understandable when you consider that the chip was developed as mid-range. The chip's interface is PCI Express 3.0.
  #3  
Old 27-04-2012
ddj
Member
 
Join Date: Jul 2011
Posts: 246
Re: Details about Nvidia GeForce GTX 680

On the GeForce GTX 680 the GPU runs at a standard 1006 MHz, but the story doesn't end there: thanks to the GPU Boost feature (more on that later), the GPU runs faster in almost all cases. The GeForce GTX 680 is equipped with 2 GB of GDDR5 memory running at 1502 MHz, which really is extremely fast. When you consider that the memory on the GTX 580 ran at 1002 MHz, we can conclude that the step back from a 384-bit to a 256-bit memory bus is almost completely offset by the increased clock frequency.
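You can check that claim with a quick back-of-the-envelope calculation. This is just a sketch of the standard peak-bandwidth formula, assuming GDDR5's quad data rate (4 transfers per pin per command-clock cycle):

```python
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s for GDDR5.
    GDDR5 moves 4 bits per pin per command-clock cycle (quad data rate)."""
    effective_mts = mem_clock_mhz * 4            # mega-transfers per second
    return bus_bits / 8 * effective_mts / 1000   # bytes per transfer * GT/s

gtx680 = bandwidth_gbs(256, 1502)  # ~192.3 GB/s
gtx580 = bandwidth_gbs(384, 1002)  # ~192.4 GB/s
```

The two cards land within about 0.1 GB/s of each other, which is why the narrower bus costs the GTX 680 essentially nothing on paper.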

The card has a maximum power consumption (TDP) of 195 watts, which really sets a new trend: the TDP of the GTX 580 stood at 244 watts. As a result, two 6-pin PEG connectors are enough for the GTX 680. Moreover, the connectors sit above one another rather than next to each other on the card, and because one connector is recessed deeper than the other, you can connect both cables very easily. The card provides two DVI outputs, one HDMI and one DisplayPort 1.2. Important to mention right away: it is finally possible to actually use all the outputs simultaneously, about which more later. The HDMI output is suitable for what AMD calls Fast HDMI, so 4K resolutions can be sent over it, although Nvidia refuses to adopt the term Fast HDMI. Naturally, the GTX 680 has two SLI connectors, so SLI, Triple SLI and Quad SLI are all possible. The cooler has been redesigned by Nvidia and is very quiet.
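The power-connector arithmetic behind "two 6-pin connectors are enough" is worth spelling out. Per the PCI Express specs, the x16 slot and each 6-pin PEG connector may each deliver up to 75 W:

```python
# Power-budget check using spec limits (not measured draw):
PCIE_SLOT_W = 75   # maximum from the PCI Express x16 slot
PEG_6PIN_W  = 75   # maximum per 6-pin PEG connector

available  = PCIE_SLOT_W + 2 * PEG_6PIN_W  # 225 W total
tdp_gtx680 = 195
headroom   = available - tdp_gtx680        # 30 W of margin
```

A 244 W card like the GTX 580 exceeds that 225 W budget, which is why it needed a 6-pin plus an 8-pin (150 W) connector instead.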
  #4  
Old 27-04-2012
Member
 
Join Date: Jul 2011
Posts: 314
Re: Details about Nvidia GeForce GTX 680

The default clock frequency of the GeForce GTX 680 is 1006 MHz, but a new technology called GPU Boost ensures that the GPU works faster when needed and more slowly when possible. In essence, GPU Boost is a feature similar to, for example, Intel's Turbo Boost. The card constantly monitors the power consumption and temperature of the GPU, and if both values allow it, the clock frequency is increased in small steps. I don't know the maximum clock frequency that GPU Boost can reach, but the advertised Boost Clock of the GeForce GTX 680 is 1058 MHz.
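The "small steps gated by power and temperature" behavior can be sketched as a toy governor. To be clear, the step size and limits below are made-up illustration values, not Nvidia's actual algorithm:

```python
# Toy sketch of a GPU Boost-style clock governor (illustrative only;
# the 13 MHz step and the limits are assumptions, not Nvidia's numbers).
def boost_step(clock_mhz, power_w, temp_c,
               power_limit_w=195, temp_limit_c=98,
               base_mhz=1006, step_mhz=13):
    """Raise the clock one step if power and temperature both allow it,
    otherwise step back down toward the base clock."""
    if power_w < power_limit_w and temp_c < temp_limit_c:
        return clock_mhz + step_mhz
    return max(base_mhz, clock_mhz - step_mhz)

clock = 1006
clock = boost_step(clock, power_w=160, temp_c=70)  # headroom -> steps up
clock = boost_step(clock, power_w=200, temp_c=70)  # over budget -> steps down
```

The point is that the clock is a feedback loop around a power budget, not a fixed frequency.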

GPU Boost also works the other way around. If only a limited amount of GPU power is needed and the GPU can achieve the required performance at a lower clock frequency, it clocks itself down to save energy. You can see this happen particularly if you use the new Frame Rate Target option, which is included with the EVGA Precision overclocking software. You can then specify that the GPU should calculate 60 frames per second and no more (because your monitor cannot show more than 60 fps). No energy is then wasted on frames that you will never see.
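The idea behind a frame-rate target is just a frame limiter: if a frame finishes early, the hardware idles (and downclocks) instead of immediately rendering another frame nobody sees. A minimal CPU-side sketch of the same principle:

```python
import time

# Minimal frame-rate cap, the same idea as a 60 fps Frame Rate Target:
# if a frame finishes early, sleep out the rest of its time slot rather
# than rendering extra, invisible frames. On a GPU, that idle time is
# what lets GPU Boost drop the clock and save energy.
def run_capped(render_frame, fps_cap=60, frames=3):
    slot = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < slot:
            time.sleep(slot - elapsed)

run_capped(lambda: None)  # takes ~3/60 s even though rendering is instant
```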

GPU Boost cannot be disabled. If you overclock a GeForce GTX 680, all the Boost values are in effect raised along with it. When you add 100 MHz to the clock frequency, for example, the default frequency becomes 1106 MHz, the Boost Clock becomes 1158 MHz, and the clock frequency to be expected in games is a little over 1200 MHz. This is really a nice feature that I liked a lot.
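The offset arithmetic above is simple enough to write down: one offset shifts every entry in the boost table.

```python
# Offset overclocking as described: a single offset shifts the whole
# clock table, base and Boost Clock alike.
BASE_MHZ  = 1006
BOOST_MHZ = 1058

def apply_offset(offset_mhz):
    return BASE_MHZ + offset_mhz, BOOST_MHZ + offset_mhz

base, boost = apply_offset(100)  # -> (1106, 1158)
# In games, GPU Boost typically steps above the Boost Clock, which is
# why clocks a little over 1200 MHz are seen with this offset.
```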
  #5  
Old 27-04-2012
Member
 
Join Date: Jul 2011
Posts: 287
Re: Details about Nvidia GeForce GTX 680

I have the GeForce GTX 680 in my new test system, which consists of:
  1. Intel Core i7-3960X
  2. ASUS motherboard
  3. 16 GB Vengeance DDR3 memory
  4. SATA hard drive
  5. Cooler Master
  6. Windows 7 x64

I played games at four settings, 1680x1050, 1680x1050 with 4x AA, 1920x1080 and 1920x1080 with 4x AA, always using the highest quality settings, and the games ran smoothly. I have a roughly $100 Full HD monitor, so all the testing I did was at the higher resolutions. Today I was also testing even higher resolutions, and I can say that you can indeed buy a good setup of three thin-bezel Full HD screens. Hence 5760x1080 (three HD screens next to each other) is now also a standard part of my tests, and the good news is that a single Nvidia video card now supports this. I ran the games both with 'normal' settings and with the best settings (highest/extreme/ultra), in the latter case with 4x AA enabled where possible.
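To put those resolutions in perspective, a quick pixel-count comparison shows why triple-screen gaming is so much more demanding than a single monitor:

```python
# Pixel counts for the tested resolutions: triple Full HD pushes exactly
# 3x the pixels of one 1080p screen, hence the much heavier GPU load.
def megapixels(width, height):
    return width * height / 1e6

single   = megapixels(1920, 1080)  # ~2.07 MP
surround = megapixels(5760, 1080)  # ~6.22 MP
ratio    = surround / single       # 3.0
```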
  #6  
Old 27-04-2012
Member
 
Join Date: Aug 2011
Posts: 302
Re: Details about Nvidia GeForce GTX 680

Well, I tried to overclock this card. During my overclocking attempts, I raised the GPU voltage to 1.075 volts, using a new version of EVGA Precision. Overclocking the GeForce GTX 680 works differently than on conventional cards. First you have to allow the video card to consume more power than the TDP, otherwise the card will try to clock itself back down while you overclock; in Precision this is done with the Power slider. Then you overclock with the GPU Clock Offset, which raises all the GPU clock frequencies: the base clock, the Boost Clock and all the boost values. If you set the GPU Clock Offset to 100 MHz, for example, the default clock frequency becomes 1106 MHz, the Boost Clock 1158 MHz, and in practice the card will mostly run around 1200 MHz. Memory overclocking is done with the Mem Clock Offset. Just keep in mind that the Precision software works in DDR values: if you set the offset to 100 MHz, the real memory clock rises by only 50 MHz.
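That DDR-versus-real-clock detail trips people up, so here is the arithmetic spelled out (using the GTX 680's 1502 MHz memory clock mentioned earlier in the thread):

```python
# Mem Clock Offset arithmetic as described: Precision shows DDR
# (double data rate) values, so a 100 MHz offset moves the real
# command clock by only 50 MHz.
MEM_CLOCK_MHZ = 1502  # stock GDDR5 command clock on the GTX 680

def apply_mem_offset(ddr_offset_mhz):
    real_offset = ddr_offset_mhz / 2
    return MEM_CLOCK_MHZ + real_offset

apply_mem_offset(100)  # -> 1552.0 MHz real clock
```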
  #7  
Old 27-04-2012
Member
 
Join Date: Aug 2011
Posts: 251
Re: Details about Nvidia GeForce GTX 680

This is a GPU that was once designed for the mid-range but shows it can shine even in the high-end segment. I feel the NVIDIA GeForce GTX 680 is a good card, and it scales very well in SLI! On average perhaps slightly worse than AMD Radeon HD 7970 CrossFire, but at 5760x1080 with the highest settings, where SLI really comes in handy, it is just a little better. For those with only a single monitor, two cards are actually overkill. When you game on three screens, though, a second GTX 680 definitely becomes important if you almost always want to stay above 60 fps without sacrificing (too much) image quality. The memory interface of the GTX 680 is, as usual for a performance chip, only 256 bits wide, but Nvidia combines this with faster GDDR5 memory than even the GeForce GTX 580 used. My reference model uses Hynix R0C RAM, which according to the specifications runs at 3.0 GHz (6 Gbps). As a result, the GTX 680 offers about the same transfer rate as the GTX 580.
  #8  
Old 27-04-2012
Member
 
Join Date: May 2011
Posts: 395
Re: Details about Nvidia GeForce GTX 680

In the GTX 680, the Kepler chip has a hardware encoder for low-power video transcoding; by Nvidia's own words it wants to offer an alternative to the Quick Sync feature integrated in Intel CPUs. Furthermore, Nvidia announced an anti-aliasing technique called TXAA, but it requires support from the application, so I have not been able to test it myself. Apart from Epic Games (with Unreal Engine 4), Nvidia has TXAA support in prospect for Borderlands 2, MechWarrior Online, EVE Online, and from developers Crytek, Slant Six Games, Bitsquid and the MMO The Secret World. TXAA is based on hardware multi-sampling, also allows proper downsampling for HDR rendering, and can optionally use sample jittering for a kind of super-sampling. Nvidia currently advertises two TXAA modes: the first is expected to cost about as much performance as 2x MSAA but already smooth edges better than 8x MSAA, while TXAA2 will cost as much as 4x MSAA but offer quality beyond 8x MSAA.
  #9  
Old 27-04-2012
Member
 
Join Date: Dec 2011
Posts: 74
Re: Details about Nvidia GeForce GTX 680

Compared to the previous "Fermi" architecture, the GTX 680 was totally rebuilt, although the basic functional units remained the same. The smallest unit is a single shader ALU, which Nvidia likes to call a CUDA core. These are grouped together with caches, register files, texture units and other auxiliary units for load/store operations and special functions into groups Nvidia calls Shader/Streaming Multiprocessors, SM for short, promoted on Kepler to the resonant name "SMX". The next larger organizational unit is called a GPC, short for Graphics Processing Cluster, and combines the SMXs with their own setup and raster units. The interface to the memory is formed in the first stage by a read/write Level 2 cache partitioned into 128 KiB blocks, each of which is attached to a 64-bit memory controller and a group of raster operators. Basically, with the Kepler architecture as known so far (GK10x), Nvidia has put the focus on maximum energy efficiency: power-hungry units were dropped in favor of more functional units. The theoretical performance has increased a lot, but how does the efficiency hold up?
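The L2 cache layout described above can be sized from the bus width alone: one 128 KiB L2 slice per 64-bit memory controller, and the GTX 680's 256-bit bus implies four controllers.

```python
# L2 cache sizing implied by the description: one 128 KiB L2 slice per
# 64-bit memory controller, so a 256-bit bus means 4 controllers.
BUS_WIDTH_BITS  = 256
CONTROLLER_BITS = 64
L2_SLICE_KIB    = 128

controllers  = BUS_WIDTH_BITS // CONTROLLER_BITS  # 4
total_l2_kib = controllers * L2_SLICE_KIB         # 512 KiB of L2 in total
```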
