Are you ready for...*drumroll* GeForce4


darthfergie


Just when you thought you had the best piece of hardware on the block thanks to your new Ti 500, you find out you're being downgraded. You are obsolete once again.

 

From Gamespy.com

The leading graphics chip manufacturer has once again turned up the heat on the competition. NVIDIA has just announced its latest next-generation graphics chipset, the GeForce 4. The new chipset will come in two forms. For the high-end power user (basically, you reading this) there is the GeForce 4 Ti 4600. And for the general consumer out there (everyone else) there will be the GeForce 4 MX 460.

 

Successor to the GeForce 3, the GeForce 4 family of GPUs (Graphics Processing Units) contains a dedicated high-speed memory pipeline capable of performing billions of operations per second. In addition, where current video cards rely on the CPU to handle several calculations, the GeForce 4 will take over these functions, allowing the processor to concentrate more on game speed. The end result is incredibly realistic graphics - more so than what we've seen with the GeForce 3.

 

Under the hood of the GeForce 4 is the nfiniteFX II engine. With built-in support for dual vertex shaders, advanced pixel shader pipelines, 3D textures, shadow buffers and z-correct bump mapping, it is now possible to render incredibly complex objects at very fast frame rates. Whereas before you had to cut back on your detail levels to maintain some level of performance, with the GeForce 4 gamers will be encouraged to crank up the detail level.

 

When you put it all on paper, it's very astounding. The nfiniteFX II engine's dual vertex shaders are able to drive more than 100 million vertices per second, and the advanced pixel shaders give the GeForce 4 GPUs 50% more performance than the GeForce 3. Our benchmarks will show this later on this week.

 

The new GeForce 4 is also going to take on the visual quality issue of on-screen aliasing. Commonly referred to as "jaggies," these artifacts are not only distracting but downright ugly. NVIDIA realizes that there are many anti-aliasing techniques available, but it is confident none is as sophisticated as what it has in the GeForce 4.

 

Introducing the nVidia Accuview Anti-aliasing subsystem. "The introduction of the GeForce4 is the first time you can turn on full scene anti-aliasing and leave it on in high resolutions," explains Brian Burke, Senior PR Manager at nVidia, "if you can get 118 frames per second in Quake3 running at 1280x1024 with Quincunx AA on, you are going to leave it on." Think about it. Gamers can now choose high-resolution anti-aliasing as their default display mode, without suffering any performance hit. Totally rockin!
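For anyone wondering what "Quincunx" actually means: it refers to a five-point sample pattern (a center point plus four diagonal neighbours, like the five on a die). The little Python sketch below blends each pixel with its diagonal neighbours using the commonly cited quincunx weights (1/2 for the center, 1/8 per corner). It is only a toy illustration of the filtering idea, not NVIDIA's actual Accuview hardware implementation.

# Toy illustration of a quincunx-pattern filter: each output pixel blends
# its own sample (weight 1/2) with its four diagonal neighbours (1/8 each).
# This is only a sketch of the filtering idea, not NVIDIA's hardware.
def quincunx_filter(img):
    """img: 2D list of grayscale values; returns a filtered copy."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.5 * img[y][x]
            for dy, dx in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
                ny = min(max(y + dy, 0), h - 1)  # clamp at the image edges
                nx = min(max(x + dx, 0), w - 1)
                acc += 0.125 * img[ny][nx]
            out[y][x] = acc
    return out

# A hard black/white edge gets softened, which is exactly what hides jaggies.
edge = [[0, 0, 255, 255]] * 4
print(quincunx_filter(edge)[1])  # -> [0.0, 63.75, 191.25, 255.0]

The point of the quote above is that, with the GeForce4, this kind of smoothing is cheap enough to simply leave on all the time.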

 

However, as with everything in this industry, there is always a flip side. Remember when the GeForce 3 came out? How many games then (and now, for that matter) really fully utilize its power? Of course it's nice to have all of that graphical horsepower in your PC, but when do you really think game developers will be able to churn out games that are fully optimized for the GeForce 4 technology? Not for a little while yet. But don't get us wrong: if you're a hardcore gamer who needs the latest and greatest thing, be sure to pick up one of these bad boys. There are a few titles in development that will be optimized for it, so you won't be left out in the cold.

 

So how much can you expect to pay for one of these boards? NVIDIA doesn't set the prices for the graphics boards, but it's safe to assume that the GeForce 4 Ti 4600 will fall in the enthusiast price range, as the Ti 500 did when it debuted (around $399). The GeForce 4 MX 460 will be priced for the mainstream segment (around $179). For exact board prices, check with the specific vendor.


For the high-end power user (basically, you reading this) there is the GeForce 4 Ti 4600.

They should have said "for the high end power user with a lot of money to spend" :mad:

 

It's good anyway, because the 8500 and Ti 500 prices will drop, and I should get one of those.


I just bought a Radeon 8500 and I would be crazy to go buy a GF4 now.

 

There comes a point where the price/performance ratio becomes less desirable for me.

 

The GF4 is a good card but I am waiting for the GF5 before I even think about upgrading again.

 

And what the crap is up with the GF4 MX? All the GF3 cards are DirectX 8.0 and the GF4 MX is DirectX 7?! It makes no sense to release a next-generation card with two-generation-old tech!


there aren't really any new features on the gf4... it is just a highly optimised gf3... so if you have a gf3 at least you don't have to worry about not having any cool features...

 

i think i'll go for a gf3 once the price drops...


I thought about buying a Radeon 8500 with 128MB. There aren't many games yet that need so much memory, but buying this card is wiser because its drivers are better than the ones that shipped with the original. Of course, you could still get the original and download the newest drivers, but I have a feeling those extra 64MB will come in handy quite soon. ;)


Here's a better preview from GameSpot UK...

 

 

 

Benchmarks show that some components of the new GeForce4 graphics chips run up to 115 percent faster than their predecessors, but not all systems will reap the benefits.

 

Graphics acceleration enthusiasts can expect the new GeForce4 chip to deliver a significant performance boost over the fastest previous core from Nvidia, the GeForce3 Ti 500, with some components showing up to 115 percent improvement according to ZDNet tests.

 

Generally, the increased performance will mean users can turn on more features, such as anti-aliasing and filtering, for any given frame rate.

 

The fastest GeForce4, the Ti 4600, has six million more transistors than the Ti 500, bringing the total to 63 million -- more than Intel's flagship Pentium 4 processor. It includes a number of new features, including better anti-aliasing and the ability to drive up to 16 monitors, as well as generally better 3D performance.

 

Rendering performance increases are largely due to doubling the amount of frame-buffer memory to 128MB, although the midrange GeForce cards will still use 64MB. Nvidia has also added a second vertex shader, a component for rendering light and shadow graduations.

 

Under test conditions, the vertex shader component of the Ti 4600 showed a 115 percent improvement over the Ti 500. The 3DMark 2001 tests, carried out by ZDNet Germany, used an Athlon XP/2000+ with 256MB of DDR memory, running at a screen resolution of 1024 x 768 pixels in 32-bit colour.

 

In tests, the Ti 4600 runs faster with Nvidia's basic level of anti-aliasing, called Quincunx, than the Ti 500 with no anti-aliasing. The Ti 4600 hit 8603 on 3DMark 2001 with Quincunx anti-aliasing, while the Ti 500 reached 8363 without anti-aliasing. Anti-aliasing is a technique for smoothing the jagged lines of computer graphics so that they appear more realistic.

 

With increased image quality features, the Ti 4600 maintains its significant lead over the Ti 500. For example, with 4x anti-aliasing, the Ti 4600 reached 5678, topping the Ti 500 with Quincunx anti-aliasing, which hit 5312.

 

The performance gap between the two chips particularly shows up with anti-aliasing turned on, according to benchmarks. With both chips running Quincunx anti-aliasing, the Ti 4600 hit 8603 in tests, while the Ti 500 reached 5312, a 62 percent difference.
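As a quick arithmetic sanity check (purely illustrative, using only the 3DMark 2001 scores quoted above), the 62 percent figure follows directly from the two Quincunx results:

# Check of the quoted gap, using the 3DMark 2001 scores reported above
# (both chips running Quincunx anti-aliasing).
ti4600_score = 8603  # GeForce4 Ti 4600
ti500_score = 5312   # GeForce3 Ti 500
speedup = (ti4600_score - ti500_score) / ti500_score
print(f"Ti 4600 lead with Quincunx AA: {speedup:.0%}")  # -> 62%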

 

With 4x anti-aliasing the Ti 4600 is 58 percent faster, while with 4x anti-aliasing and 64-tap anisotropic filtering turned on, the Ti 4600 is 39 percent faster. Anisotropic filtering is a technique for rendering images more clearly.

 

In standard mode, without anti-aliasing, the Ti 4600 is 23 percent faster.

 

The new cards won't come cheap, with a recommended price for the Ti 4600 of about £279, and some doubt whether there are any games at the moment that require so much power. Some analysts have said that Return to Castle Wolfenstein, from Id Software, could tax the top-of-the-line GeForce4 cards, and upcoming games like id's next version of Doom will be designed to take full advantage of the new technology.

 

Nvidia says it ultimately wants to achieve the super-realistic effects of computer-animated films like Shrek on PC games.


Originally posted by Reddog

GeForce 4 Ti4200 is only 200 bucks and will be available in April - best bang for the buck!

 

It is already out... I went to Best Buy yesterday and saw it on the shelves for $179. Needless to say, I was shocked. (It was an Xtasy version, for anyone interested.)


Originally posted by digl

Hey Lando, how is that 8500 working for you? Are you happy with it?

I should decide whether to get an 8500 or a Ti 500.

 

I am loving my R8500, especially with the new drivers ATI released the other week (I haven't messed with any of the "leaked" ATI drivers).

 

Unfortunately, all website benchmarks of the R8500 use either the original CD drivers or the November release, so it is next to impossible to see the speed the card runs at now.

 

Obviously I prefer an R8500 over a GF3 Ti 500 for several reasons:

* it is a DirectX 8.1 card
* pixel shader 1.4 support
* price/performance ratio (more bang for your buck; in some cases it surpasses the GF3 Ti 500 altogether)
* Carmack also stated in his Doom commentary:

The fragment level processing is clearly way better on the 8500 than on the Nvidia products, including the latest GF4. You have six individual textures, but you can access the textures twice, giving up to eleven possible texture accesses in a single pass, and the dependent texture operation is much more sensible. This wound up being a perfect fit for Doom, because the standard path could be implemented with six unique textures, but required one texture (a normalization cube map) to be accessed twice. The vast majority of Doom light / surface interaction rendering will be a single pass on the 8500, in contrast to two or three passes, depending on the number of color components in a light, for GF3/GF4.

 

Now, the GF4 spanks the R8500, no doubt, but between the R8500 and the GF3 family I prefer the R8500.

 

But what it all really boils down to is preference.

 

Both cards have such superb strong points and only minor weaknesses that it really wouldn't matter which one you pick.


GeForce4 Launched

 

nVidia have decided to show the suitors to the graphics throne exactly who the boss is.

For the first time since 3DFX was around, nVidia seems worried. That much can be assumed from the extensive attempt to reach all market layers. Gone are the days when a major chip manufacturer could hold off the release of the cheaper models in order to target the hardcore gamers and their big bucks.

 

So the three GeForce4 families will be: the GeForce4 Ti 4600 and 4400, the GeForce4 MX 460, 440 and 420, and the mobile GeForce4 440 Go and 420 Go.

The MXs are already shipping in OEM systems, the mobile chips will be available within February, and the Tis will hit stores in about 60 days.

Now, if you were an ATI exec and saw that coming at you, wouldn't you be running for cover?

 

nVidia are certainly pulling out all the stops and have managed to secure support from all sectors of the industry.

"Video games used to be a kind of step-child to film and live TV," said David DeMartini, Executive Producer for Golfing Simulations at Electronic Arts. "Today, by delivering high-resolution, high frame-rate, full scene antialiasing in a complete family of GPUs, nVidia has provided us with the tools to deliver breakthrough performance and image quality for the gaming industry. This helps Tiger Woods PGA Tour 2002 achieve a level of realism that rivals film and video and gives the player the sense they’re in the game."

 

More support is evident in the overwhelming number of PC makers that have added the GeForce4 to their line-ups, including Apple, Compaq, Gateway, HP, MicronPC and Toshiba. Board manufacturers include ASUSTeK, eVGA.com, Gainward, Leadtek, MSI, PNY and Visiontek.

 

 

 

:D


Archived

This topic is now archived and is closed to further replies.
