Tech News and Gossip thread...


Negative Sun

Can't AMD and NVIDIA still implement support for their hybrid technologies in their drivers? I like the hybrid concept but I don't want to have to reboot my PC to switch between graphics cards. Unless the switch between the IGP and the discrete card can happen dynamically in Windows I don't think I'll have much interest in the tech.

An extensive review of the Core i7 plus OC performance etc...

 

It pwns all right. AMD better have a couple of aces up its sleeve, because Intel is wiping the floor with every other CPU out there.

 

I still think Apple should sue Intel, and don't get me started on M$ !!!

I'm sure if they could have got away with calling it Windows i7 they would have...twats!

Everything I'm reading about Core i7 (Nehalem) tells me that it's a rather meh improvement over Yorkfield in everything but highly multithreaded apps. This is due to the small L2 cache. In multithreaded stuff it screams, however. Right now it just doesn't seem to be worth the cost of a very expensive motherboard and DDR3 unless you do a lot of video encoding.

 

What I'm really interested in is AMD's Deneb and how it stacks up to Yorkfield. As long as it can compete with or even beat Yorkfield at a good price, AMD will have a winner on their hands. I think it can do it.

But Q! I'm hardly teh one to sing Intel's praises at any time, but those 3DMark CPU numbers don't lie... look where the top-of-the-line Phenom quaddie sits!! Back when the X2 4800+ was king of the heap and cost $500 (when I bought it!), others were also saying "meh, but ya don't really need it unless...".

 

As GPUs, SDKs and devs mature, the ability to pulverize numbers like that will be coming in handy, I can safely guarantee. Not that I'm going anywhere near one, but anyone who does get an i7 is CPU future-proofing themselves for a couple of years.

 

As I'm stuck with AM2, I'm waiting to see what the heck Shanghai will do for me :D Still, my first priority is a new GPU. After much to and fro, I'm sticking it out and waiting to catch the 55nm tidal wave ;) The power efficiency phreak in me demands it :p

 

mtfbwya

3DMARK scores are not indicative of real-world performance. If the most demanding thing you do with your PC is game, then it would not be worth the cost to upgrade from Wolfdale/Yorkfield to Core i7, IMO.

 

And as far as the AMD crowd is concerned: it would be best to wait until we get some solid numbers concerning Deneb before you do anything. A platform migration to the hated evil empire of Intel may not be necessary, especially if you already own an AM2 board that can support Deneb. Hurry up and wait. ;)

... A platform migration ...

 

Ha! That'd be the day! The linchpin for gaming performance was, and still is, the GPU. I'd rather upgrade that, of course!

 

Getting an i7 and keeping <insert any currently available GPU> will not give you that magical 60fps average on any high-end game at WQXGA (1600p) ;)

 

mtfbwya

  • 3 weeks later...

So I saw this article on DailyTech, "WARP Runs Direct3D 10/10.1 on CPU with Windows 7" and wondered what WARP was. Turns out it is Microsoft giving the CPU the ability to render graphics. I'm not sure what to make of this as of yet but in the MSDN article DailyTech linked to I found these statements of interest.

Enabling Rendering When Direct3D 10 Hardware is Not Available

 

WARP allows fast rendering in a variety of situations where hardware implementations are unavailable, including:

 

* When the user does not have any Direct3D capable hardware

* When running as a service or in a server environment

* When no video card is installed

* When a video driver is not available, or is not working correctly

* When a video card is out of memory, hangs or would take too many system resources to initialize.

I like the idea of my system not being totally useless if my graphics card goes out, if that is what this means.

We don’t see WARP10 as a replacement for graphics hardware, particularly as reasonably performing low end Direct3D 10 discrete hardware is now available for under $25. The goal of WARP10 was to allow applications to target Direct3D 10 level hardware without having significantly different code paths or testing requirements when running on hardware or when running in software.
I think I require more knowledge of programming than I possess to understand the full implications of this statement. But yeah, Microsoft apparently isn't positioning this as a potential replacement for discrete graphics solutions. Or are they? :D
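
If I'm reading the MSDN piece right, the "no different code paths" bit just means an app picks WARP as the driver type when it creates its Direct3D 10.1 device, and everything after that stays the same as on a real GPU. Here's a rough, untested C++ sketch of what I mean, using the public D3D10CreateDevice1 call from d3d10_1.h (the helper function name is just something I made up, and corrections from anyone who actually writes D3D code are welcome):

// Rough sketch (untested): create a Direct3D 10.1 device, preferring real
// hardware and falling back to WARP's CPU rasterizer if no usable GPU or
// driver is present.
#include <d3d10_1.h>

ID3D10Device1* CreateDeviceHardwareOrWarp()
{
    // Driver types to try, in order of preference.
    const D3D10_DRIVER_TYPE driverTypes[] = {
        D3D10_DRIVER_TYPE_HARDWARE,  // a real Direct3D 10.1 GPU
        D3D10_DRIVER_TYPE_WARP       // software rasterizer on the CPU
    };

    for (int i = 0; i < 2; ++i)
    {
        ID3D10Device1* device = NULL;
        HRESULT hr = D3D10CreateDevice1(
            NULL,                       // default adapter
            driverTypes[i],
            NULL,                       // no software rasterizer DLL
            0,                          // no creation flags
            D3D10_FEATURE_LEVEL_10_1,
            D3D10_1_SDK_VERSION,
            &device);

        // From here on, the app's rendering code is identical regardless of
        // which driver type actually succeeded.
        if (SUCCEEDED(hr))
            return device;
    }
    return NULL;  // neither hardware nor WARP could be created
}

At least that's how I read it. How well it actually runs on the CPU is another story, of course.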

Char, MS are implementing WARP to avoid the whole "capable" issue they had with the "Windows Vista Capable" certification. A huge complaint from many with lower-end systems was how Vista Aero bumps up the sysreqs. To make W7 more friendly to the early adopter, CPU rasterisation is touted as a solution to this.

 

As an HTPC user/builder, I wonder how far WARP will go. The idea of a rig that only needs a CPU is tantalising indeed for those of us on a quest to build ultra-small-form-factor HTPCs.

 

What was cool was the benchie they ran: WARP on a Core i7 with no GPU performed better running Crysis (lolz) than any of the current Intel integrated chipsets :p (A blistering 5-7 FPS, but still better than its integrated cousins.)

 

Maybe this explains why W7 is ditching hybrid support for ATI/NVIDIA; they are essentially introducing a CPU-GPU variant of this type of tech. I imagine mixing the two together would lead to compatibility nightmares :p

 

* * *

 

In other news:

 

For those of you intent on whining about how Blu-ray suxz, check out Pioneer's 400GB 16-layer BR disc, compatible with existing players no less. Due by 2010. If you're willing to wait 'til 2013, 1TB discs are on their way. Fancy that ;) Now gaming devs have no excuse not to go all out when putting together high-end games and chucking them on one disc.

 

mtfbwya

Yup, it'd be great if WARP could take advantage of multicore platforms to boost graphics performance. ;)

 

In other news, looks like AMD is FINALLY, ALMOST ready to roll with the Phenom IIs - the one I've got my eye on is the AM2 "940", which is an X4 at 3.0GHz. Price is slated at $275.

 

Now, gimme some benchies! I know that it won't demolish Core i7, but as long as it makes a good showing, I'm up for it > more power efficient and four cores to crunch all that folding and encoding I do... c'mon team AMD! [/somewhat fanboi]

 

mtfbwya

It's because the 3.0GHz AM3 version, the 945, will be a model of a later stepping. It will be the C3 stepping instead of C2 like the 940 AM2+ version, which is why it's coming out later (in Q2 2009).

@Q > what do you predict Phenom II's performance will be like?

 

How it stands up to any Intel product is irrelevant to me > as long as it's *at least* like a doubled-up version of my X2 6000+ and power/heat efficient, then I'll be more than chuffed to be throwing some $$ at another AMD CPU (finally), and saving the real dough for the 55nm nvGPUs...

 

mtfbwya

I think that it should at least equal Kentsfield, and if it does, it would provide about a 25% IPC advantage over your 6000+ in single-threaded apps. It may even equal Yorkfield, and if that is the case then you'd be looking at 30%+. With all of the CPU-intensive stuff that you do, it would be a worthwhile upgrade. The 3GHz model should sell for less than $300.

Maybe this one :D

 

GTX 260 GX2 aka GTX 295

 

Check out the numbers (quoted from linkie):

 

GeForce GTX295 VGA card will feature two GPUs of 55nm GT200. And the number of stream processors is 480 (240×2), not 216×2 as we reported. Besides, its memory bus width is 896 bit (448bitx2), and memory size is 1792MB DDR3 (896MBx2). It won’t adopt GDDR5, and its total power is 289W. GTX 295 will be launched on Jan 8th...

 

Still, I've always been wary of GX2s... let's see what the 1600p benchies say before I get any further excited. But I have to admit, GX2 has begun to sound more funtastic when you think of it from a folder's perspective ;)

 

Looks like the 4870X2 has a ninja fight on its hands

 

mtfbwya

So that picture Expreview is showing in that article isn't of the GTX 295, right? I mean, it can't be because I only see one GPU package on it.

 

No doubt such a card would be able to crunch through WU's like crazy. I always wondered how multiple GPU clients are configured to run on multi-GPU systems.

No doubt such a card would be able to crunch through WU's like crazy. I always wondered how multiple GPU clients are configured to run on multi-GPU systems.

 

There's plenty on it at the folding forums... can you imagine if chainz had a GX2 of his 280... almost 100,000 points per week > DAMN!

 

mtfbwya

I should have also mentioned that I think that it's really cool that AMD is finally coming through on their promise to deliver a truly worthwhile upgrade for all of those people who purchased Socket AM2 motherboards.

 

If all of the buzz that I am hearing is true, then Phenom II should be a good overclocker as well, which is good news for those who want to get the best bang for their buck. I'm hearing 4GHz+ on air, and I hope that it's true.
