Tech News and Gossip thread...


Negative Sun


Hmm... looks like the PCIe 3.0 spec has been delayed until 2011. This effectively means that the true next gen of GPU performance is over a year away... article at Reg Hardware

 

This means that AMD's and Nvidia's next line of high-end GPUs, such as the "GeForce 300", will have at least one year to gloat about their specs before PCIe 3.0 kit comes out and potentially tears them to shreds, as PCIe 3.0 essentially has double the effective bandwidth (7.99 Gb/s) compared to the 4.0 Gb/s of spec version 2.
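For the curious, that near-doubling comes mostly from the encoding change, not just the raw clock bump. A quick back-of-the-envelope in Python, assuming the commonly quoted line rates and encodings (8b/10b for 2.x, 128b/130b for 3.0 - my figures, not the article's, which is why I land on ~7.88 Gb/s rather than 7.99):

```python
# Per-lane effective bandwidth: raw transfer rate x encoding efficiency.
# Assumed figures: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding (20%
# overhead); PCIe 3.0 runs at 8 GT/s with 128b/130b (~1.5% overhead).
pcie2_effective = 5.0 * (8 / 10)      # Gb/s per lane
pcie3_effective = 8.0 * (128 / 130)   # Gb/s per lane

print(f"PCIe 2.0: {pcie2_effective:.2f} Gb/s per lane")
print(f"PCIe 3.0: {pcie3_effective:.2f} Gb/s per lane")
print(f"speed-up: {pcie3_effective / pcie2_effective:.2f}x")
```

So roughly a 1.97x jump per lane; multiply by 16 lanes for a full-width graphics slot.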

 

In CPU-related news, some scores for Lynnfield-based Core i5/i7 performance have started trickling out - looks like Intel is onto another winner. Has AMD become content to always be second in CPU performance? The promise K10 had compared to what was delivered is saddening to an AMD fanboy :p:violin: Check out the article at Bright Side of News

 

mtfbwya

This means that AMD's and Nvidia's next line of high-end GPUs, such as the "GeForce 300", will have at least one year to gloat about their specs before PCIe 3.0 kit comes out and potentially tears them to shreds, as PCIe 3.0 essentially has double the effective bandwidth (7.99 Gb/s) compared to the 4.0 Gb/s of spec version 2.
Meh, I don't think bandwidth is the issue when it comes to 3D performance on most things. Games tend to be GPU-bound in the first place, and giving the CPU and GPU more bandwidth between them probably won't do much (although it certainly doesn't hurt anything). About the only area where it will help is if you have a slower processor, but who in their right mind is going to pair an old CPU with a new mobo??

Meh, I don't think bandwidth is the issue when it comes to 3D performance on most things. Games tend to be GPU-bound in the first place...

 

This is true for the way games are currently designed. Most games still only utilise a single core and work within an x86 environment. When devs pull their finger out and code for multicore/x64, the PCIe 3.0 spec will make more sense.

 

mtfbwya


Yeah, I think that all of that bandwidth is intended to take full advantage of multi-core CPUs while playing multi-threaded games with multiple GPUs. I think. :p

 

IIRC, Core i7 really starts to pull ahead of Core 2 Quad performance-wise while using multiple GPUs.


IIRC, Core i7 really starts to pull ahead of Core 2 Quad performance-wise while using multiple GPUs.

 

Yes, but only realised in niche graphical apps atm. Most gaming devs on the planet don't seem fussed with coding beyond 2006 yet :D

 

Personally, I am a fan of the MSI hot anime GPU girls, although this company that may or may not exist is good too...

 

I'd like to see that card in Tri-SLI :thmbup1:

 

mtfbwya


Some more news on the Zune HD for those interested, this time from gadgetry site CNET. Long-story-short version: the guy was impressed :p

"Mind you, it's not going to crush the iPod Touch--a product that for all intents and purposes is more mobile computer than media player." - CNET guy

 

Hey, he took the words right out of my mouth :thmbup1: Add the fact that the iPhone adds phoning and texting to mobile computing and iPod, and we have a winner!!! :xp:

 

 

i dont care as long as the boxes still have hot anime girls on them

Can't argue with that really...


"Mind you, it's not going to crush the iPod Touch--a product that for all intents and purposes is more mobile computer than media player." - CNET guy

 

Hey, he took the words right out of my mouth :thmbup1: Add the fact that the iPhone adds phoning and texting to mobile computing and iPod, and we have a winner!!! :xp:

 

From a specs perspective, the Zune + Tegra GPU has more than enough clout to compete with the Touch's or the iPhone's multimedia capabilities. Dismissing it against the iPhone just because it can't text and dial is simply not valid. The 'CNET guy' has also done his best at comparing a pre-release device with no apps or games on it to all the hijinks he must have on his Touch/iPhone. Let's hope he revisits his review in 12 months.

 

It's no huge surprise that MS has been working on attracting coders to work on Zune apps as well.

 

I wouldn't be surprised if the next-gen iPhones also sported Tegra, as did the Zune phones... or whatever they will be called ;)

 

mtfbwya


Here's one for the speculators:

 

The oft-outspoken Nvidia CEO Jen-Hsun Huang has revealed in a recent presentation that he believes GPU tech will be 570 times faster within the next 3 years. He compares this to a mere 3x performance gain to be achieved by CPUs. This latter figure is a bit more realistic and fits in with good ole 'Moore's Law'.

 

Still, even a 57% increase in GPU performance over the next 3 years would sound more plausible than the figure JHH has quoted. Either he is sitting on some amazing tech we don't know about, or has been smokin' reefers (or has been misquoted!)
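To put some numbers on why that 570x sounds off, here's a quick compound-growth sanity check, assuming "Moore's Law" means a doubling roughly every 2 years (my reading of it):

```python
# Moore's Law style growth vs Huang's claim, over the same 3-year window.
years = 3
moore_factor = 2 ** (years / 2)       # doubling every 2 years -> ~2.8x
claimed_factor = 570
implied_annual = claimed_factor ** (1 / years)  # year-on-year multiplier

print(f"Moore's Law over {years} years: ~{moore_factor:.1f}x")
print(f"570x over {years} years implies ~{implied_annual:.1f}x per year")
```

A ~3x CPU gain over 3 years sits right on the Moore's Law curve; 570x would need GPUs to get more than 8x faster every single year.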

 

Original report at TG Daily

 

mtfbwya


What?!?! Do you mean to say that NVIDIA may not be able to make good on that 570x GPU performance-increase promise?!?! Preposterous! Huang is on record now. That in essence entitles graphics kiddies like me. He'd better make good or I'm filing a class-action lawsuit on his butt!

 

;P


What?!?! Do you mean to say that NVIDIA may not be able to make good on that 570x GPU performance-increase promise?!?! Preposterous! Huang is on record now. That in essence entitles graphics kiddies like me. He'd better make good or I'm filing a class-action lawsuit on his butt!

 

;P

 

lolz, we had a prime minister here in Oz who made the now-infamous statement (in about '86) that there would be no children living in poverty by 1990.

 

Maybe JHH has been taking some inspiration from him... :p

 

I personally feel there is some context missing in that quote. I'm not entirely convinced it means the GeForce 6xx-7xx series are going to crunch >teraflops out of the box and destroy Crysis at 1600p without even getting hot :D

 

mtfbwya


  • 2 weeks later...

So I understand AMD's Evergreen launch event was yesterday. Starting to see some more info on the Radeon HD 5xxx cards. ALIENBABELtech has apparently posted specs ahead of the NDA expiration. Sounds like September 23 is the launch date.

The HD 5870 offers 1600 stream processors. Amazingly, AMD doubled the number of SIMD units from 10 to 20. Since each SIMD unit contains 16 5-D units and a quad-TMU, that means we count 1600 stream processors and 80 TMUs overall. We are talking about a new video card whose core runs at 850 MHz and whose 256-bit GDDR5 runs at 1200 MHz - all for a suggested retail price of $399! AMD is expecting the HD 5870 to come close to the performance of an HD 4870 X2 or the GTX 295.
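The shader arithmetic in that quote checks out, by the way:

```python
# Quoted HD 5870 (Cypress) layout: 20 SIMD units, each containing
# sixteen 5-wide ("5-D") units and one quad-TMU.
simd_units = 20        # doubled from the previous generation's 10
units_per_simd = 16
alus_per_unit = 5
tmus_per_simd = 4      # one quad-TMU per SIMD

stream_processors = simd_units * units_per_simd * alus_per_unit
tmus = simd_units * tmus_per_simd

print(stream_processors, tmus)  # 1600 80
```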

The new HD 5850 will also launch on September 23. We hear it is priced below $299. The HD 5850 will sport 1440 stream processors and lower clock speeds than its big brother - we are hearing somewhere around 700/1000 MHz.

So much for even coming close to matching the launch prices of the 4850 and 4870, eh? :¬:


Yeah, I figured that this next generation would start out pricey. They are roughly doubling the performance over the previous generation, after all. Prices will definitely come down after both companies have launched their products and the price war begins.


Anandtech - Beginnings of the Holodeck: AMD's DX11 GPU, Eyefinity and 6 Display Outputs

 

So what are everybody's thoughts on AMD's Eyefinity technology, where one GPU renders a display across 3-6 monitors? From what I've read, it sounds like this tech works pretty well on AMD's new GPUs.

 

Simply due to price factors... 6 display outputs are surely most useful for corporate, high-end graphical industry and technical setups. I had to save for a damn year to get my 30" monitor... I definitely don't want to buy 6 of them, or a new Eyefinity display for that matter... :p
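It's also worth remembering how many pixels you'd be asking one GPU to push. A rough count, assuming 30" 2560x1600 panels like mine (smaller screens would obviously be kinder):

```python
# Pixels per frame for 1, 3 and 6 monitors at an assumed 2560x1600 each.
w, h = 2560, 1600
for monitors in (1, 3, 6):
    megapixels = monitors * w * h / 1e6
    print(f"{monitors} x {w}x{h}: {megapixels:.1f} MP per frame")
```

Six of those is ~24.6 MP per frame - six times the load of a single 1600p screen, before you even think about AA.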

 

Unless you are one of those weird dudes who accumulates many CRT monitors for Flight Simulator, or maybe just to simulate being in the Space Shuttle :D

(nb. I didn't make this poster so don't blame the typo on me!)

 

spaceshuttle.jpg

 

The Holodeck comparison is a bit ridiculous. There is no 3D. The closest you can get to that are systems such as Heliodisplay, and this isn't really ready to enter the mainstream market/gaming world yet...

 


 

AMD really needs to focus on two things:

 

*Making performance-competitive CPUs at the best competitive price

*Making performance-competitive GPUs at the best competitive price

 

I know they have a nice little slice of the server and mobile market, and forever rabbit on about their Duke Nukem-esque Fusion CPU/GPU... but until they take care of the top two things, they will never get that 2004 mojo back :D

 

Heck, maybe they don't want it back???

 

mtfbwya


AMD have started to let some details trickle out about the 5870. Definitely some impressive specs.

 

source

Radeon_HD5870_03.jpg

 

Interesting to note that the GeForce 300 spec speculation, which was reported a few pages back, also clocked it in at 3 TFLOPS and a 2 GHz core clock ;) Time will tell if team green can get their hands on some decent silicon to actually make it happen :D

 

Here's another article looking at the 5870 from a pricing perspective

 

mtfbwya


2 GHz?! That ain't gonna happen. ;)

 

lolz... indeed, it does seem a tad fantastical by today's standards. Unless they've taped out some sublime silicon somewhere... who knows :p

 

Benchmarks have come out for the 5870... a step up from the 285 to be sure, and not dissimilar in numbers to the dual-GPU 295.

 

...Still can't crack Crysis at 1600p maxed :p The X2 might?

 

I might have been too optimistic, but I thought this next gen of GPUs' performance jump would have been a bit more dramatic than 2x.

 

Radeon_HD_5870_benchmark_02_large.jpg

 

mtfbwya


Ah, the Zune HD has landed - for US people only atm... It looks like a great PMP, and with Tegra under the hood it has a lot of potential yet to be unlocked as far as being a games machine like the Touch.

 

Personally, I'm still not sure I want one... if it was also a phone... yes.

 

We'll see what apps/games it gets before making a call on how it will fare on that front.

 

Here is Engadget's review, plenty of pics and vids

 

zune_r_main.jpg

 

edit: It looks like it hasn't been added to the updates bit yet (cheeky negsun!) but the latest DirectX compile was released a few weeks ago in early August. It will take Vista users to DX11, on par with Windows 7 RTM systems. Of course, atm the Direct3D 11 libraries will be doing jack all, but at least the DX10 updates may actually do some good with certain apps and games ;)

 

More info / download the standalone DirectX August 2009 redist here [103 MB]

 

mtfbwya


The Radeon 5870 launches!!

 

And, yes, the performance on this thing is downright excellent. This card handily trounces the GTX 285 and will even hang with the 4870 X2 and the GTX 295, which isn't bad considering that it's one GPU vs two GPUs.

 

The thing that really catches your eye, though, is how much tech AMD crammed into this thing. From what I can see, this card should be ample future-proofing if you're planning on going all out for DX11. Downsides: this card is hot, literally. Tom's got that card all the way up to 100°C during testing, and that's with the noisy fan cranking at full tilt. What doesn't really help matters is that half of the exhaust is vented right back into the computer via the top duct (since half of the back slot is now taken up by a DVI port).

 

The other bit of good news: lower prices on previous-generation hardware are sure to come. :D


As predicted, it's twice as powerful as the 4870, but uses only 28 more watts at load and 62 fewer at idle.
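Translating those deltas into perf-per-watt, using ballpark HD 4870 baselines of ~160 W load / ~90 W idle (my assumed figures, not from the review):

```python
# Perf-per-watt estimate from the quoted deltas: ~2x the performance
# for +28 W at load. Baseline 4870 wattages are assumptions.
hd4870_load, hd4870_idle = 160.0, 90.0
hd5870_load = hd4870_load + 28   # "28 more watts at load"
hd5870_idle = hd4870_idle - 62   # "62 fewer at idle"

perf_ratio = 2.0                 # roughly twice the 4870
perf_per_watt_gain = perf_ratio / (hd5870_load / hd4870_load)

print(f"HD 5870 est.: {hd5870_load:.0f} W load, {hd5870_idle:.0f} W idle")
print(f"perf/watt gain: ~{perf_per_watt_gain:.2f}x")
```

On those assumptions that's roughly 1.7x more performance per watt at load, and the idle savings are a flat-out win.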

 

Impressive; most impressive. :vadar:

 

Nvidia's got their work cut out for them; that's for sure. This is going to be an interesting generation of GPUs.


I don't know... I'm not as impressed by the 5870, I'm afraid (not that I could construct a better one, of course :p). But 100°C is downright ridiculous!

 

Lucky for AMD users they can't fold, because that app would make these cards catch fire :p Even my multi-GPU 295 barely goes into the 60s (°C) after prolonged gaming.

 

Indications are that the GT300 will perform similarly or perhaps slightly better - and reports that it will be out by December are making the rounds. Hopefully it will keep its cool better than the AMD Heat Brick.

 

The fact that this upcoming generation of GPUs still won't have conquered the CryEngine 2 is a tad disappointing, considering that CryEngine 3 is just around the corner :p

 

I still can't get over that 100°C... I think I'd officially like to announce my retirement from the AMD fanboys club :p The Athlon/Cool'n'Quiet green computing days are well OVER.

 

 

mtfbwya


Archived

This topic is now archived and is closed to further replies.

