Tech News and Gossip thread...


Negative Sun


Instead of creating a new thread every time AMD/ATi, nVidia or Intel try to blow us away with the latest technology, I thought we might as well keep it all in one thread to keep things nice and tidy, since most of these topics are related in some way or another...

My main sources are The Inquirer and CustomPC, though I don't check them all the time or always feel like sharing what I find (because it's not always worthy of its own thread lolz). Still, I believe this thread might be a good way to keep each other up to date.

 

I'll go first ;)

 

AMD loses $396 mill

^ Funniest part is this:

"Rival Intel made $1.86 billion in a similar period, but still decided to cut costs by axing 2,000 staffers."

I am never buying another Intel product, no matter how much better their benchmarks might be...

 

But AMD isn't quite bankrupt yet...

^ A longer read, but interesting nonetheless, with a bit more info on the 45nm cores coming up...here's hoping they pull it off.

 

What's in a name for a GPU?

^ ATi's RV670 name-game speculation (HD3700/3800???)


Lolz, if you've spent some time around this forum lately you'll know that won't be a problem for me :)

 

Just don't force me to make double or triple posts, and at least nod in approval at my posts (or shake your head if you must, but since I'm always right I fail to see the point in that ;) )


I hope Intel buys AMD.

That would only harm one group of people in the end: the consumer.

 

The only reason Intel has been so competitively priced and has brought out good-value chips is that AMD is their main competitor; without that, they'll become complacent and probably scr*w the customer over even more.

*cough*look at M$*cough*


Intel buying AMD... that's crazy talk!!

 

It'd more likely be a mega-conglomerate interested in entering the CPU market, like Sony for example. They've always said they're going to do it; I'm not sure what they're currently up to...

 

I can't see it happening anytime soon though. Even if Phenom doesn't take off with the desktop/gaming crowd, the server market, ATI products and Project Fusion will keep them around for a while yet... For the same reasons as negsun, I'd never purchase an Intel product again... pull them out of PCs in the dumpster, yes :) (like the dual-core Pentium D powering my media center PC)

 

mtfbwya


Samsung has been whispered about as well, which would please me more than Sony, 'cause Samsung have been putting out high-quality, good-value products for quite a while now...

Indeed :D Anyone catch this nice little tidbit?

 

Samsung makes mega memory microchip

 

Samsung, the world's largest maker of computer memory chips, unveiled a 64-gigabit NAND flash memory chip based on finer process technology using circuit elements that are 30 nanometers wide.

High-density flash storage is gaining ground and getting ever closer :emodanc:


Aye, SSDs are supposed to be pwnage as they load games and apps in a flash (forgive the pun, I just had to ;) )

 

I read that same article here, Chainz, it's very interesting :)

 

So co-operation with, or even a purchase of, AMD can only be a good thing in my book; give the whole Intel crowd a spank on the buttocks :)


w00t, a whitepaper PDF has leaked with ATi's upcoming HD3800 series and its DX10.1 features detailed in it... grab it while it's here

 

The original article I found it in is here on the Inq site...

 

Tis exciting yesyes!

 

Edit:

To keep things balanced and unbiased, here's the (alleged) first review of the 8800GT!

Very good results I must say, and very consistent too; shame for all the GTS owners out there, 'cause this puppy owns it lolz


I wish there were a bit more to that review... :p

 

Still, this is a perfect opportunity to start a rant about hardware manufacturers taking the enthusiast crowd for a ride.

 

I like tech as much as the next guy, but putting everything into context, and considering the prices of the higher-end items, there are some very important provisos here...

 

This applies to CPU and GPU hardware in particular. People should think about...

 

1. What do you actually do with your computer?

 

2. What monitor do you have, and what are its max resolution and refresh rate?

 

I'm lucky enough to have a monitor that runs at 1600p, but it only does so at 60Hz. This means that anything I can EVER hope to see is capped at 60fps. At lower resolutions it allows a higher refresh rate and hence higher fps... So I can play Bioshock, Oblivion etc. at 1600p with maxed-out settings and keep a steady and pretty 60fps with no problems on my 8800GTS. That's why I never forked out the extra $200+ for the GTX. Until a monitor that does 1600p at 120Hz comes out, I wouldn't bother with a higher-end card.

 

Most monitors people have lurk around the 70-75Hz mark, so when considering a purchase they need to keep this in mind. Do they need the GTX, which gets 103fps in <insert game here>, or could they get by with the GTS or the ATI equivalent, which does 82fps with all effects on? That extra 21fps reported by Fraps etc. doesn't make it to your eyes, so why bother? I've always asked this of gaming tech-heads and have never gotten a response that makes any sense. It's also difficult to talk about 'future-proofing' for GPUs: with DX10 revisions on the way, not many of the 'enthusiast' crowd are investing in a graphics card now that they'll still be relying on in three years' time.
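
To put some rough numbers on that point, here's a minimal back-of-the-envelope sketch; the fps figures and the displayed_fps helper are made-up illustrations, not benchmark data:

```python
# Sketch: with vsync on, the frames you actually see are capped at the
# monitor's refresh rate, so anything the card renders above that never
# reaches your eyes. All numbers below are invented examples.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second that actually make it onto the screen."""
    return min(rendered_fps, refresh_hz)

refresh_hz = 60.0  # e.g. a 1600p panel running at 60Hz

for card, fps in [("pricier card", 103.0), ("cheaper card", 82.0)]:
    shown = displayed_fps(fps, refresh_hz)
    wasted = fps - shown
    print(f"{card}: renders {fps:.0f}fps, displays {shown:.0f}fps, "
          f"{wasted:.0f}fps are never seen")
```

Either way you end up looking at the same 60 frames per second, which is the whole point of the rant.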

 

There are a few 1080p screens out there which operate at 120Hz, but they are very expensive and actually oriented towards the home theater crowd atm. I haven't seen a 1600p monitor that does 120Hz yet, though I'm sure they aren't too far away ;)

 

Quite simply, the GPU manufacturers do a really great job of duping higher-end purchasers out of their cash - which is quite annoying :blast5:

 

This can be argued for CPUs as well nowadays. With the constant revisions Intel and AMD are making, a three-year-old CPU will never perform as well as its current-generation counterpart.

 

For those working with applications that place heavy demands on the CPU and GPU, e.g. graphics, HD video editing/compositing and animation apps, the argument for going high end is stronger - though those people will more likely have custom-built (often Mac-based) machines running a Quadro card, simple as that.

 

So, at the end of it all, buying the highest-end tech is a purchase you should think about carefully. It's a high price to pay for bragging rights!

 

As for myself, the 8800GTS is handling itself OK at 1600p considering the 60fps cap... until a bunch of games come out that I really want to play and that actually challenge this, I have the opportunity to squander cash on something other than my PC!! (actually saving for a holiday atm) :p

 

[/end rant]

 

mtfbwya


That's totally true. I mean, here I am playing Jade Empire and NWN2 on my OC'ed Barton 2600+ at 2.1GHz and a GeForce 6800... All this stuff is years old technology-wise, but does it still do the trick? Hell yeah.

 

That's why, when I do get around to upgrading, I'll probably be going for the X2 5000+ Black Edition and a cheapo X1950Pro. Why? 'Cause that's all I'll ever need: my monitor atm is 16", and if I ever upgrade to a new one, the max I'd go for is probably a 19" or 22" widescreen... which caps out at about 1680x1050.

 

Considering you'll never see me playing Oblivion, FEAR, STALKER or even Crysis (not a huge FPS fan, though I might give Crysis a go at the lowest settings in the future), I don't really need much more than that.

 

It's the same with overclockers these days: they buy a ridiculously expensive chip to OC it even more, whereas the whole point (IMO) is to buy a cheap chip and pwn people who paid lots more for the same performance. [/rant #2] :p


That and our critical flicker frequency (where we see something as a continuous light instead of as a series of flashes) maxes out for most people below 50 Hz, unless you have an extremely bright, large spot of light. Some people can go up to 65 Hz (and a few with super-eyes may perceive higher frequencies), but most of us looking at a monitor won't perceive anything different past 60 Hz anyway. Now, if you have a game that won't keep the fps up around 60 unless you have a humongous graphics card, then it might be worth it.

 

All you never wanted to know about critical flicker frequency.



 

Thanks Jae. I was going to go digging for a similar article. 60 is indeed the magic number, it seems. Some say that aiming for around 80fps at the top end is good, as it means your lower-end or mean fps can sit around 60... but this varies a lot depending on the game, the settings and the hardware.
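
Roughly what that rule of thumb looks like in practice (a toy sketch with an invented frame-time trace, not a real measurement):

```python
# Toy illustration of the "aim for ~80fps average so the dips stay near 60"
# rule of thumb. The per-frame render times below are made up.

frame_times_ms = [12, 11, 12, 13, 12, 17, 12, 11, 13, 12, 14, 11]

mean_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000.0 / max(frame_times_ms)

print(f"mean fps:  {mean_fps:.0f}")   # 80fps average for this invented trace
print(f"worst dip: {worst_fps:.0f}")  # ~59fps, still roughly keeping a 60Hz panel fed
```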

 

I wonder about those 120Hz home theater TVs. Not even the highest-definition video formats clock in at 120fps! Heck, Blu-ray doesn't go anywhere near it...

 

I smell a sales gimmick aimed at the cashed up home theater crowd :(

 

mtfbwya



 

Well, the general perception is that bigger/faster/stronger/whatever is better (I smell a Ray comment coming... :D ), but I bet most people have never heard of critical flicker frequency, much less know that there's actually a cap to what the eye can perceive. So I can't _entirely_ blame the sales force. My guess is that most companies don't have vision specialists on staff to point these things out, and with the drive for ever-higher fps, even if some specialist did point it out, Joe Public wants the higher fps even though it doesn't make a difference. The companies risk being left behind if they don't keep up with what the market wants, or perceives it wants.


When I was working in optics I found it funny that people spent loads on a fancy HDTV but didn't care to buy glasses when their distance VA (visual acuity) was below par lolz... You just spent thousands on a TV, but your eyesight isn't even sharp enough to make out the difference from your old one!

 

*sigh* people, whatcha gonna do with em?


Well, here's a decent review of the 8800GT.

 

Once all of the price-gouging dust settles after the initial launch, it should be a good deal. Hell, it outperforms the GTS (640MB) for less money (even with the blatant price-gouging!) and gets quite close to GTX levels in some games. It also uses less power than either one, and it has all of the video encode/decode features of the 8600-series cards, which both the GTS and GTX lack.

 

I can't wait to see how the HD3800 stacks up to it.


I'm happy to see that the 8800 GT made progress in reducing power consumption, probably due almost entirely to the shrink to a 65nm process. I'm also pleased that NVIDIA is supporting more video decoding formats in hardware, like the lower-end 8xxx cards do. After buying my 7800 GTX back in 2005, I learned that it's often better to wait for the new mid-range card to show up (the 7950 GT), as that often has a better performance/price ratio. The 8800 GT confirms my theory: Anandtech says it should be priced in the USD $200-250 range, and that's my sweet spot for a graphics card.
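
That performance-per-dollar argument is easy to sanity-check; here's a quick sketch with purely hypothetical prices and fps figures (none of these numbers are quoted from Anandtech or anywhere else):

```python
# Hypothetical fps-per-dollar comparison; every number here is a placeholder.
cards = {
    "high-end card": {"price_usd": 550.0, "avg_fps": 95.0},
    "mid-range card": {"price_usd": 230.0, "avg_fps": 75.0},
}

for name, specs in cards.items():
    ratio = specs["avg_fps"] / specs["price_usd"]
    print(f"{name}: {ratio:.2f} fps per dollar")

# The mid-range part tends to win this ratio, which is the "sweet spot" logic.
```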

 

Is AMD's new RV670 chip going to be produced as the HD3800?


Archived

This topic is now archived and is closed to further replies.

