Sunday, November 27, 2005

Overclocked Radeon Card Breaks 1 GHz

dacaldar writes "According to Yahoo Finance, noted Finnish over-clockers Sampsa Kurri and Ville Suvanto have made world history by over-clocking a graphics processor to engine clock levels above 1 GHz. The record was set on the recently-announced Radeon® X1800 XT graphics processor from ATI Technologies Inc."

Huzzah! (Score:5, Funny) by Anonymous Coward on Wednesday October 26, @04:45PM (#13883998)
What a day for world history! It will be remembered forever!

Awesome! (Score:1, Funny) by Anonymous Coward on Wednesday October 26, @04:46PM (#13883999)
First Duke Nukem Forever post
First Imagine a Beowulf Cluster of those post
First Can Netcraft confirm that? post
One wonders... (Score:5, Interesting) by kko (472548) on Wednesday October 26, @04:46PM (#13884000)
...why this announcement would come out on Yahoo! Finance.

Re:One wonders... (Score:5, Insightful) by Jarnis (266190) on Wednesday October 26, @04:51PM (#13884056)
...because ATI made a big press release about it. Since their product is still mostly vapor (you can't buy it yet), and nVidia currently owns them in the high-end market because ATI's product is so late, one has to grasp at straws in order to try to look l33t in the eyes of potential purchasers. Wish they'd spend less time yapping and more time actually putting product on the shelves. Nice overclock in any case, but ATI putting out a press release about it is kind of silly.

No big surprise (Score:5, Funny) by Z0mb1eman (629653) on Wednesday October 26, @04:46PM (#13884001)
I didn't have Slashdot in a full-screen window, so the headline read:
Overclocked Radeon Card Breaks
1 GHz
Was wondering why an overclocked card breaking is such a big deal :p

Benchmarks? (Score:5, Funny) by fishybell (516991) on Wednesday October 26, @04:47PM (#13884009)
Without the pretty graphs how will I know what's going on?!

Re:Benchmarks? (Score:5, Informative) by Mr. Sketch (111112) on Wednesday October 26, @04:58PM (#13884102)
Graphic showing 3DMark score of 12419: http://www.muropaketti.com/3dmark/r520/12419.png
Pictures of their setup/methods: http://www.muropaketti.com/3dmark/r520/ghz/

GPU to excel CPU (Score:1) by hadj (926126) on Wednesday October 26, @04:47PM (#13884013)
The performance of GPUs seems to grow faster than that of CPUs. I remember someone proposed using GPUs to process generic data. It would be 12 times faster than a CPU.

Re:GPU to excel CPU (Score:5, Informative) by Mostly a lurker (634878) on Wednesday October 26, @05:51PM (#13884522)
A good question! This excerpt from a recent article in Extreme Tech seems relevant:
The third future project at ATI is dramatically improved support for the GPGPU scene. These are researchers, mostly academic, who are tapping into the massive parallel computing power of graphics processors for general computing tasks, like fluid dynamics calculations, protein folding, or audio and signal processing.
ATI's new GPU architecture should be better at GPGPU tasks than any that has come before, as it provides more registers per pipeline than either ATI's old architecture or Nvidia's new one. This is a sore spot for GPGPU developers but not really a limitation for game makers. The improved performance of dynamic branching in the new architecture should be a huge win for GPGPU applications as well. Developers working to enable general-purpose non-graphics applications on GPUs have lamented the lack of more direct access to the hardware, but ATI plans to remedy that by publishing a detailed spec and even a thin "close to the metal" abstraction layer for these coders.

Speed play offs. (Score:4, Funny) by neologee (532218) on Wednesday October 26, @04:48PM (#13884019)
I always knew ati would finnish first.

A bit presumptious? (Score:5, Insightful) by syphax (189065) on Wednesday October 26, @04:49PM (#13884029)
"have made world history"
I think that's going a bit far. Good for them and everything, but world history?
V-E Day, Einstein's 1905, Rosa Parks refusing to give up her seat on the bus: these events impact world history (sorry for the all-Western examples); making a chip oscillate faster than an arbitrary threshold does not.

Re:A bit presumptious? (Score:5, Funny) by bcattwoo (737354) on Wednesday October 26, @04:57PM (#13884097)
What are you talking about? I'm sure I will have next October 26 off to celebrate Overclocked Radeon Broke 1GHz Barrier Day. Heck, this may even become Overclocked GPU Awareness Week.

3D Mark? (Score:1) by SirDrinksAlot (226001) on Wednesday October 26, @04:49PM (#13884032)
I'm not going to take that seriously until I see some actual 3DMark results. I can overclock my 9800 Pro to some insane speeds, but once I start to push it I get all kinds of corruption.

Global Warming (Score:1, Redundant) by koick (770435) on Wednesday October 26, @04:49PM (#13884037)
AHA! There's the proverbial smoking gun for global warming!

The culprit (Score:5, Funny) by ChrisF79 (829953) on Wednesday October 26, @04:50PM (#13884043)
I think we've found the source of global warming.
World history? (Score:3, Insightful) by Seanasy (21730) on Wednesday October 26, @04:51PM (#13884053)
"... have made world history..." Uh, it's cool and all, but not likely to be in the history books. (Easy on that hyperbole, will ya?)

Someday people will ask... (Score:5, Funny) by Keith Mickunas (460655) on Wednesday October 26, @04:51PM (#13884054)
...where were you when the first video card was overclocked to 1 GHz. And most people will respond "huh?". Seriously, "world history"? There's no historical significance here. It was inevitable, and no big deal.

Historical? (Score:1, Insightful) by Anonymous Coward on Wednesday October 26, @04:51PM (#13884060)
How is this any more historical than overclocking it to 993 MHz? It's not! 1 GHz is just a nice round number. If I overclock one to 1.82 GHz tomorrow, no one will care!

We'll just see (Score:5, Funny) by crottsma (859162) on Wednesday October 26, @04:52PM (#13884064)
NVidia will make a competitive model, with blackjack, and hookers.
What's the point of these tests? (Score:3, Interesting) by pclminion (145572) on Wednesday October 26, @04:52PM (#13884067)
If you cool a chip, you can make it run faster. This is a matter of physics that doesn't need to be tested any more than it already has been. In some small way I appreciate the geek factor, but I'm far more interested in geek projects that have some practical use. And as for being the first people in the world to do this... the chances of that are small. I'm sure there are people at ATI (and other companies) who have done things far more bizarre, but didn't announce it to the world.

Not for the weak (Score:5, Insightful) by Mr. Sketch (111112) on Wednesday October 26, @04:52PM (#13884071)
"The team, optimistic that higher speeds could ultimately be achieved with the Radeon X1800 XT, attained the record speeds using a custom-built liquid nitrogen cooling system that cooled the graphics processor to minus-80 degrees Celsius."
It seems we may have a ways to go before it can be done with standard air cooling. I actually didn't think that operating temperatures for these processors went down to -80°C.

comon now (Score:5, Funny) by Silicon Mike (611992) on Wednesday October 26, @04:55PM (#13884084)
If I could only go back in time and add liquid nitrogen to my 8088 processor. I know I could have gotten it up to 5.33 MHz, no problem. NetHack benchmarks would have been off the chart.

GPU vs. CPU Speed (Score:2, Interesting) by Anonymous Coward on Wednesday October 26, @04:56PM (#13884087)
I've always wondered: why have GPU speeds always been so much slower than CPU speeds? Are they made on a different process? Are they made with different materials? Are there significantly more transistors on a GPU? Why don't we have a 3 GHz GPU?

Re:GPU vs. CPU Speed (Score:4, Informative) by freidog (706941) on Wednesday October 26, @05:16PM (#13884241)
Since DirectX 8 (I think), color values have been floating-point numbers. This avoids losing a lot of possible values through all the blending with multi-texturing and effects (fog, lighting, etc.), but floating-point operations are of course much slower than very simple integer calculations. Even on the Athlon 64, FP adds and muls are 4 cycles; you'd have to clock the top-end A64 at about 700 MHz to make them single-cycle. (Multi-cycle instructions aren't as bad a thing on a CPU, where there are plenty of other things to do while you wait; not so in GPUs.) GPUs have also tended to focus on parallel execution, at least over the last few years, increasing the number of pixels done at the same time to compensate for not being able to hit multi-GHz speeds. So yes, they have many more transistors than typical CPUs (the 7800 GTX might break 300 million, well over 250 million), and of course heat is an issue if you push the voltage and/or clock speed too far. The last few generations of GPUs have been up around 65-80 W real-world draw, more than most CPUs out there. And of course GPUs have very little room for cooling in those expansion slots.

Re:GPU vs. CPU Speed (Score:5, Insightful) by xouumalperxe (815707) on Wednesday October 26, @05:40PM (#13884426)
Well, while the CPU people are finally doing dual-core processors (essentially, two instruction pipelines in one die, plus cache et al.), the GPU people have something like 24 pipelines in a single graphics chip. Why is it that the CPU people have such lame parallelism? To answer both questions: graphics are trivial to parallelize.
You know to start with that you'll be running essentially the same code for every pixel, and each pixel is essentially independent of its neighbours. So doing one or twenty at the same time is mostly the same, and since all you need is for the whole screen to get rendered, each pipeline just grabs the next unhandled pixel. No synchronization difficulties, no nothing. Since pixel pipelines don't stop each other to sync, you effectively have a 24 GHz processor in this beast. On the other hand, you have an Athlon 64 X2 4800+ (damn, that's a needlessly big, numbery name). It has two cores, each running at 2.4 GHz (2.4 * 2 = 4.8, hence the name, I believe). However, for safe use of two processors for general computing purposes, lots of timing trouble has to be handled. Even if you do have those two processors, a lot of time has to be spent making sure they're coherent, and the effective performance is well below twice that of a single processor at twice the clock speed. So if raising the speed is easier than adding another core, and gives enough performance benefit to justify it without the added programming complexity and errors (there was at least one privilege-escalation exploit in Linux that involved race conditions in kernel calls, IIRC), why go multi-processor earlier than needed? Of course, for some easily parallelized problems people have been using multiprocessing for quite a while, and actually doing two things at the same time is also a possibility, but not quite as directly useful as in the graphics card scenario.

And you thought slashdot was bad (Score:1) by Edunikki (677354) on Wednesday October 26, @04:56PM (#13884090)
After the LCD screen "news" earlier, I am glad to see that the unashamed marketing is submitted to Yahoo on this one ;).
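[Editor's note] The "trivially parallel" argument in the comment above can be sketched in a few lines of Python. This is a toy illustration only: the `shade` function and the thread pool are invented for the example, and a real GPU schedules pixels in fixed-function hardware pipelines, not OS threads. The point it demonstrates is the one the commenter makes: because each pixel depends only on its own coordinates, workers need no locks, no ordering, and no coordination beyond "grab the next unhandled pixel".

```python
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    """Toy 'pixel shader': the output depends only on this pixel's own
    coordinates -- no shared state, so no synchronization is ever needed."""
    x, y = pixel
    return (x * 13 + y * 7) % 256

def render(width, height, workers=4):
    # Every pixel is an independent task. Each worker just takes the next
    # unhandled pixel; results can be computed in any order and written
    # straight into the framebuffer. Contrast this with general-purpose
    # multicore code, where shared data forces locking and cache coherency.
    pixels = [(x, y) for y in range(height) for x in range(width)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade, pixels))
```

Running `render(640, 480)` with 1 worker or 24 workers produces the identical frame, which is exactly why adding pixel pipelines scales so much more easily than adding CPU cores.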
It was 2D mode only (Score:5, Interesting) by anttik (689060) on Wednesday October 26, @04:57PM (#13884098)
Sampsa Kurri said on a Finnish forum that it was over 1 GHz only in 2D mode. They are going to try to run it at the same clocks in 3D later. ATI left some tiny details out of their press release... ;P

Re:It was 2D mode only (Score:5, Funny) by jandrese (485) on Wednesday October 26, @05:05PM (#13884161)
It also apparently crashed a lot. This is kind of like saying "I got a Volkswagen Beetle up to 200 kph[1]!!!" with a whole lot of modifications.
[1] Going downhill

Cheating (Score:2) by AC-x (735297) on Wednesday October 26, @04:58PM (#13884107)
