First off, why the HELL did you not include BF3 in the benchmarks?!? This card should be out March 22nd and it will be a hard launch, not a paper one.

http://wccftech.com/nvidia-kepler-geforce-gtx-680-benchmarked-blows-hd-7970/

Here are some random benches found in the link. Good to finally see what the first model will be like; can't wait for March 22nd to see the reviews.
3/16/2012 3:55:15 PM
when will they be available and what will the price be?
3/16/2012 4:15:32 PM
from the OP
3/16/2012 4:43:27 PM
no 670 launch for us poor folk?
3/16/2012 5:28:49 PM
not at launch
3/16/2012 5:34:11 PM
You could have just fucking linked the article... Also, those are piss-poor numbers right there. Disappointing, to say the least. I thought nvidia was saying 2x the performance over the current Fermis and how it would just smoke the 7970s.
3/17/2012 1:29:24 AM
Also, 680s are supposed to be $649, and the GTX 670 and 660 should be released in the same time frame at $499 and $349 price points, but all of it is just speculation; I haven't seen anything from nvidia about this.
3/17/2012 1:47:57 AM
Jbaz, you are behind the news; I didn't think I had to cite the price. Ref:

http://videocardz.com/31020/zotac-geforce-gtx-680-2gb-available-for-pre-order-for-e507

Why just post a link when you can see all the info that matters in here? Also, I'd say for beta test drivers (remember the 460 went up almost 40% with one driver update), those numbers are favorable, and that's from a random website showing benchmarks with no indication of test method. [Edited on March 17, 2012 at 12:47 PM. Reason : _]
3/17/2012 12:45:30 PM
http://videocardz.com/31046/geforce-gtx-680-sli-performance-results
http://vr-zone.com/articles/nvidia-geforce-gtx-680-sli-performance-preview/15273.html
3/19/2012 3:56:12 AM
Serious (not a troll) question. What about this video card was thread-worthy? I mean, if we made threads for every new vid card, that's all this section would be. I really am interested to know why this one stands apart from the rest.
3/19/2012 10:06:37 AM
Because it's the first major Kepler card that's been benchmarked. Big releases like this only come out every couple of years, so it's not just another card.
3/19/2012 10:36:30 AM
Yeah, it's a whole new chip design.
3/19/2012 12:30:13 PM
http://videocardz.com/31052/gigabyte-geforce-gtx-680-pictured-and-tested

Hmmm, why is this thread worthy? Maybe because new GPU architectures (not die shrinks, brand new designs) only come around every 2-3 years.
3/19/2012 1:37:00 PM
Does this architecture allow for more than 2 displays without a second card?
3/19/2012 3:04:04 PM
Yes, it should. They noted a while back that they were trying to do three-monitor gaming on a single card to go up against AMD's cards, but I haven't heard too much about it in the specs or listings lately.
3/19/2012 3:55:31 PM
I got an iPad so no more need for a desktop computer.
3/19/2012 4:11:03 PM
I got ~~an iPad~~ a pen so no more need for a desktop ~~computer~~ gun.
3/19/2012 4:31:30 PM
Come on, pick up on my sarcasm. Truthfully, I may pick up a second 580 down the road, but I'm happy with my single 580 at the moment. I loaded Skyrim up with a shit-ton of HD texture mods and I'm hitting 2GB+ of VRAM constantly. I doubt the first few 680s will have 3GB of VRAM; like that Gigabyte, which only has 2GB. [Edited on March 19, 2012 at 4:40 PM. Reason : s]
3/19/2012 4:36:26 PM
I'm still trying to figure out why it's 256-bit instead of 384-bit?!? Or at least 320-bit!?
3/19/2012 4:52:50 PM
When your VRAM runs at 6GHz effective, you can save a lot of money by going with a 256-bit bus; the question is how much this hurts performance. We can approximate it by overclocking the memory further and seeing how much gain we get, or wait for the GTX 685 with its rumored 384-bit bus and 6GHz+ memory.
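For rough numbers, peak memory bandwidth is just bus width times effective data rate. A minimal back-of-the-envelope sketch (the 256-bit/6Gbps figures match the 680's reported spec; the 384-bit configuration is hypothetical, for comparison only):

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8 bits per byte) * data rate.
# The 384-bit configuration below is hypothetical, for comparison only.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 6.0))  # GTX 680 as specced: 192.0 GB/s
print(bandwidth_gb_s(384, 6.0))  # hypothetical wider bus: 288.0 GB/s
```

So on paper a 384-bit bus at the same clock would buy 50% more bandwidth; whether games actually see that depends on how bandwidth-bound they are, which is exactly what the memory-OC test would show.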
3/19/2012 5:56:46 PM
I understand they increased the memory clock; what I'm asking is, why not also widen the bus? It would seem that could make this architecture really scream and show those 200% increases NVIDIA suggested it would have (which it does not). If anything, it seems like NVIDIA is just sitting on the bandwidth so they can squeeze even more money out of people, since they know people would buy it even if it was only 20% faster... so they'll slowly increase memory bandwidth over the next 2 years until they release Maxwell.

[Edited on March 20, 2012 at 3:16 PM. Reason : .]
3/20/2012 3:15:16 PM
The 685 is rumored to have a 512-bit bus; we know for sure that the 680 has 256-bit, without a doubt. The GK110 is what you are talking about, and it's detailed here:

http://videocardz.com/31126/geforce-kepler-gk110-specification
3/22/2012 12:19:49 AM
Interesting, but I'm passing on this. Maybe in a year or two.
3/22/2012 9:28:02 AM
Well yeah, you have two 3GB 580s. You'll still have the advantage over a single 680 or 685. Which reminds me, I need to pick up a second one before they're all gone.
3/22/2012 9:36:30 AM
I'm just hoping that once the 600s come out, the water blocks for the 500s go uber cheap like the 400s' did; I'll grab three at MC while I'm in DC or something next month. Damn, three cards stacked on top of each other run too hot, especially when my case doesn't have a side vent fan.
3/22/2012 4:17:07 PM
Official reviews are out! It's confirmed: it only sports a 256-bit memory bus. It performs a little better on average than the 7970, but not by the huge margin most people were expecting; the good news is that it's going to hit the $499 price point! The GTX 685 is what the 680 was supposed to be, but since nvidia wanted to counter the 7970 ASAP, they just pulled this out of their ass; not too shabby, though. And as everyone has mentioned from the leaked previews, it does in fact sip much less power than the 580 and even the 7970; 195W TDP!?

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review
3/22/2012 5:49:59 PM
http://www.xbitlabs.com/articles/graphics/display/nvidia-geforce-gtx-680_2.html
http://www.guru3d.com/article/geforce-gtx-680-review/1

[Edited on March 22, 2012 at 8:01 PM. Reason : _]
3/22/2012 7:58:40 PM
BIG PICKTUR: http://images.anandtech.com/doci/5699/GeForce_GTX_680_F_No_Thermal.jpg

[Edited on March 22, 2012 at 8:36 PM. Reason : _]
3/22/2012 8:34:22 PM
GPGPU computations are said to be double the performance of the 580's, so this got everyone in the folding community going apeshit. Should provide a good home for the Jaguar supercomputer in TN.
3/22/2012 8:41:57 PM
I thought compute was less of a concern on Kepler and that AMD is holding the lead right now in GPGPU performance benchmarks.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17
3/24/2012 6:14:37 PM
First of all, GeForce cards ship with a limited number of double-precision units, so their GPGPU performance won't be as high for double-precision FP operations. This is why you pay the $$$ for Quadro and Tesla cards; even then, the older Fermi Quadros are much faster than the 7970, but their price/performance comes at a steep cost, although you can string multiple Quadros/Teslas together for better performance per box than AMD's cards when speed is absolute.

Second, GPGPU benchmarks can be misleading since they depend on what you are rendering and how you are rendering it. Some scenes could be limiting GPGPU performance because of a slow CPU, and vice versa. It depends on what software you are running, what you are rendering, and how efficient the code is.

Third, AMD has been known to have very poor performance with certain OpenCL applications, even though their theoretical throughput is technically higher than nvidia's. They had these issues with the 5k series because of how the FPUs were designed; it would starve the stream processors, and they were terribly complicated to code for since the micromanagement is software-based, not hardware-based like nvidia's CUDA or their later 6k/7k series.

Lastly, CUDA is a much more widely used platform for GPGPU apps right now because the standard has been around a bit longer and a lot of pro-level software took advantage of CUDA right off the bat. There's still some hesitation to switch over to OpenCL even though both nvidia and AMD support it; it's just a relatively new standard that finally got a full release in 2009, so it took a while to see apps that use it. Still being adopted at a slow pace.

But all of this doesn't mean that the GeForce line, like the 680, is terrible for GPGPU applications; just for double-precision floating-point calculations that require a higher level of math, taking several times as many cycles per operation, like what the SmallLuxGPU plugin for 3ds max requires for ray tracing. In some applications like physics and modeling (like folding), CUDA is just much more advanced and faster than OpenCL right now, but that could change once OpenCL gets better, more advanced, and streamlined, all while the hardware is built specifically for its code.

tl;dr
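To put that double-precision gap in rough numbers, here's a minimal sketch using the 680's published shader count and base clock, with the commonly cited 1/24 FP64 rate for GK104 assumed for illustration:

```python
# Back-of-the-envelope peak throughput. Assumes 2 FLOPs per core per clock
# (fused multiply-add) and the commonly cited 1/24 FP64 rate for GK104.
def peak_gflops(cores: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS: cores * clock * 2 (FMA counts as 2 FLOPs)."""
    return cores * clock_ghz * 2

fp32 = peak_gflops(1536, 1.006)   # GTX 680: ~3090 GFLOPS single precision
fp64 = fp32 / 24                  # assumed 1/24 DP rate: ~129 GFLOPS double
print(f"FP32: ~{fp32:.0f} GFLOPS, FP64: ~{fp64:.0f} GFLOPS")
```

That's why a consumer card that wins gaming benchmarks can still lose badly in double-precision compute, and why the Quadro/Tesla parts exist.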
3/24/2012 7:20:28 PM
you aren't responding to any of the data from the review or their opinions:
3/25/2012 9:50:14 PM
get a room already, you two
3/26/2012 9:29:39 AM
It looks pretty baller thus far, according to guru3d and tom's hardware. I may wait until closer to the launch of Diablo 3 before doing this next build; that way they can shake out all of the bugs and, more importantly, get some in stock.
3/29/2012 8:32:57 PM
Diablo 3 will not stress the 680. Shit, a cheapo 550 Ti will work.
4/12/2012 6:47:34 AM
I wonder if the original StarCraft can run on high with these?
4/12/2012 7:30:46 AM
I wonder if the original StarCraft can run on high with these? // found an egg.. double post.. [Edited on April 12, 2012 at 7:31 AM. Reason : egg]