darthfergie Posted February 7, 2002 Just when you thought you had the best piece of hardware on the block thanks to your new Ti500, you find out you're obsolete once again. From Gamespy.com: The leading graphics chip manufacturer has once again turned up the heat on the competition. NVIDIA has just announced its latest next-generation graphics chipset, the GeForce 4. The new chipset is going to come in two forms: for the high-end power user (basically you, reading this) there is the GeForce 4 Ti 4600, and for the general consumer out there (everyone else) there is the GeForce 4 MX 460. Successor to the GeForce 3, the GeForce 4 family of GPUs (Graphics Processing Units) contains a dedicated high-speed memory pipeline capable of billions of operations per second. In addition, where current video cards rely on the CPU to handle several calculations, the GeForce 4 will take over these functions, allowing the processor to concentrate more on game speed. The end result is incredibly realistic graphics, more so than what we've seen with the GeForce 3. Under the hood of the GeForce 4 is the nfiniteFX II engine. With built-in support for dual vertex shaders, advanced pixel shader pipelines, 3D textures, shadow buffers and z-correct bump mapping, it is now possible to render incredibly complex objects at very fast frame rates. Whereas before you had to cut back on your detail levels to maintain some level of performance, with the GeForce 4 gamers will be encouraged to crank up the detail level. On paper, it's all very astounding: the nfiniteFX II engine's dual vertex shaders can drive more than 100 million vertices per second, and the advanced pixel shaders give the GeForce 4 GPUs 50% more performance than the GeForce 3. Our benchmarks will show this later on this week.
The new GeForce 4 is also going to take on the visual quality issue of on-screen aliasing. Commonly referred to as "jaggies," these artifacts are not only distracting but downright ugly as well. nVidia acknowledges that there are many anti-aliasing techniques available, but is confident none is as sophisticated as what the GeForce 4 offers. Introducing the nVidia Accuview anti-aliasing subsystem. "The introduction of the GeForce4 is the first time you can turn on full scene anti-aliasing and leave it on in high resolutions," explains Brian Burke, Senior PR Manager at nVidia. "If you can get 118 frames per second in Quake3 running at 1280x1024 with Quincunx AA on, you are going to leave it on." Think about it: gamers can now choose high-resolution anti-aliasing as their default display mode without suffering any performance hit. Totally rockin'! However, as with everything in this industry, there is always a flip side. Remember when the GeForce 3 came out? How many games then (and now, for that matter) really fully utilized its power? Of course it's nice to have all of that graphical horsepower in your PC, but when do you really think game developers will be able to churn out games fully optimized for GeForce 4 technology? Not for a little while yet. But don't get us wrong: if you're a hardcore gamer who needs the latest and greatest thing, be sure to pick up one of these bad boys. There are a few titles in development that are going to be optimized for it, so you won't be left out in the cold. So how much can you expect to pay for one of these boards? NVIDIA doesn't set the prices for the graphics boards, but it's safe to assume that the GeForce 4 Ti 4600 will fall in the enthusiast price range like the Ti500 did when it debuted (around $399). The GeForce 4 MX 460 will be priced for the mainstream segment (around $179).
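The hardware details of schemes like Quincunx are nVidia's own, but the basic idea behind full-scene anti-aliasing (take several samples inside each pixel and blend them, so edge pixels get intermediate shades instead of a hard stair-step) can be sketched in a few lines of Python. Everything here is an illustrative toy (the function name and the example edge y = x are mine), not how the GeForce 4 actually implements it:

```python
def pixel_coverage(px: int, py: int, samples_per_axis: int = 1) -> float:
    """Estimate how much of pixel (px, py) lies under the edge y = x
    by point-sampling a grid of subpixel positions and averaging."""
    n = samples_per_axis
    hits = 0
    for i in range(n):
        for j in range(n):
            # Sample at the centre of each subpixel cell.
            x = px + (i + 0.5) / n
            y = py + (j + 0.5) / n
            if y < x:
                hits += 1
    return hits / (n * n)

# One sample per pixel: the edge pixel is all-or-nothing -- a "jaggy".
print(pixel_coverage(0, 0, samples_per_axis=1))  # 0.0
# Sixteen samples (4 per axis): a partial-coverage grey that smooths the edge.
print(pixel_coverage(0, 0, samples_per_axis=4))  # 0.375
```

The fractional coverage value is what gets turned into a blended colour on screen; more samples per pixel means smoother edges at a higher rendering cost, which is exactly the trade-off the article says the GeForce 4 makes cheap enough to leave on.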
But to get exact board prices, you should check with the specific vendor.
digl Posted February 7, 2002 For the high end power user (basically you who is reading this) is the GeForce 4 Ti 4600. They should have said "for the high end power user with a lot of money to spend". It's good anyway, because the 8500 and the Ti500 prices will drop, and I should get one of those
GUNNER Posted February 7, 2002 Originally posted by digl Its good anyway, because the 8500 and the Ti500 prices will drop, and I should get one of those Exactly, time to get a Ti500. Honey, where is the Visa?
OnlyOneCanoli Posted February 8, 2002 I doubt I'll be getting a GF4. My GF3 will cut it for a while to come, most likely, and developers are only just starting to utilize the GF3's features. Or so I hear. It's also great for people who don't have a lot of money: they can get a GF3 or the new Radeon when those go down in price.
Darth Simpson Posted February 8, 2002 Initial benchmarks have been great; the GF4 is a real powerhouse. I will wait for an eventual GeForce 5 though (name change time...). The GeForce 3 currently in my PC will still do for a while...
Darth_Lando Posted February 8, 2002 I just bought a Radeon 8500 and I would be crazy to go buy a GF4 now. There comes a point where the price/performance ratio becomes less desirable for me. The GF4 is a good card, but I am waiting for the GF5 before I even think about upgrading again. And what the crap is up with the GF4 MX? All the GF3 cards are DirectX 8, and the GF4 MX is DirectX 7? Makes no sense to release a next-generation card with two-generation-old tech!
Guest toms Posted February 8, 2002 there aren't really any new features on the gf4... it is just a highly optimised gf3... so if you have a gf3 at least you don't have to worry about not having any cool features... i think i'll go for a gf3 once the price drops...
Millions o' Monkeys Posted February 8, 2002 just 6 months extra work chucked on top
digl Posted February 8, 2002 Hey Lando, how is that 8500 working for you? Are you happy with it? I still have to decide whether to get an 8500 or a Ti500
Reddog Posted February 8, 2002 GeForce 4 Ti4200 is only 200 bucks and will be available in April - best bang for the buck!
Lord_FinnSon Posted February 8, 2002 I thought about buying the Radeon 8500 with 128MB. There aren't many games yet that need that much memory, but buying this card is the wiser choice because its drivers are better than the ones that shipped with the original. Of course, you could still get the original and download the newest drivers, but I have a feeling those extra 64MB will come in handy quite soon.
CaptainRAVE Posted February 9, 2002 Here's a better preview, from Gamespot UK... Benchmarks show that some components of the new GeForce4 graphics chips run up to 115 percent faster than their predecessors, but not all systems will reap the benefits. Graphics acceleration enthusiasts can expect the new GeForce4 chip to deliver a significant performance boost over the fastest previous core from Nvidia, the GeForce3 Ti 500, with some components showing up to 115 percent improvement according to ZDNet tests. Generally, the increased performance will mean users can turn on more features, such as anti-aliasing and filtering, for any given frame rate. The fastest GeForce4, the Ti 4600, has six million more transistors than the Ti 500, bringing the total to 63 million -- more than Intel's flagship Pentium 4 processor. It includes a number of new features, including better anti-aliasing and the ability to drive up to 16 monitors, as well as generally better 3D performance. Rendering performance increases are largely due to doubling the amount of frame-buffer memory to 128MB, although the midrange GeForce cards will still use 64MB. Nvidia has also added a second vertex shader, a component for rendering light and shadow gradations. Under test conditions, the vertex shader component of the Ti 4600 showed a 115 percent improvement over the Ti 500. The 3DMark 2001 tests, carried out by ZDNet Germany, used an Athlon XP/2000+ with 256MB of DDR memory, running at a screen resolution of 1024 x 768 pixels in 32-bit colour. In tests, the Ti 4600 runs faster with Nvidia's basic level of anti-aliasing, called Quincunx, than the Ti 500 with no anti-aliasing. The Ti 4600 hit 8603 on 3DMark 2001 with Quincunx anti-aliasing, while the Ti 500 reached 8363 without anti-aliasing. Anti-aliasing is a technique for smoothing the jagged lines of computer graphics so that they appear more realistic.
With increased image quality features, the Ti 4600 maintains its significant lead over the Ti 500. For example, with 4x anti-aliasing, the Ti 4600 reached 5678, topping the Ti 500 with Quincunx anti-aliasing, which hit 5312. The performance gap between the two chips particularly shows up with anti-aliasing turned on, according to benchmarks. With both chips running Quincunx anti-aliasing, the Ti 4600 hit 8603 in tests, while the Ti 500 reached 5312, a 62 percent difference. With 4x anti-aliasing the Ti 4600 is 58 percent faster, while with 4x anti-aliasing and 64-tap anisotropic filtering turned on, the Ti 4600 is 39 percent faster. Anisotropic filtering is a technique for rendering images more clearly. In Standard Mode, without anti-aliasing, the Ti 4600 is 23 percent faster. The new cards won't come cheap, with a recommended price for the Ti 4600 of about £279, and some doubt whether there are any games at the moment that require so much power. Some analysts have said that Return to Castle Wolfenstein, from Id Software, could tax the top-of-the-line GeForce4 cards, and upcoming games like id's next version of Doom will be designed to take full advantage of the new technology. Nvidia says it ultimately wants to achieve the super-realistic effects of computer-animated films like Shrek on PC games.
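As a sanity check on the percentages quoted in the article, the relative speedups fall straight out of the raw 3DMark 2001 scores. A quick sketch (the helper name is mine, scores are the ones reported above):

```python
def speedup_percent(new_score: float, old_score: float) -> float:
    """Relative improvement of new_score over old_score, in percent."""
    return (new_score / old_score - 1) * 100

# 3DMark 2001 scores from the ZDNet tests (Athlon XP 2000+, 1024x768, 32-bit):
ti4600_quincunx = 8603
ti500_quincunx = 5312
ti500_no_aa = 8363

# Both chips running Quincunx AA: ~62 percent, matching the article's figure.
print(round(speedup_percent(ti4600_quincunx, ti500_quincunx)))  # 62
# The Ti 4600 with Quincunx on still edges out the Ti 500 with AA off entirely.
print(round(speedup_percent(ti4600_quincunx, ti500_no_aa)))     # 3
```

The 58 and 23 percent figures can't be reproduced from the numbers printed here, since the article doesn't quote the Ti 500's 4x-AA or the Ti 4600's standard-mode score.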
darthfergie Posted February 10, 2002 Originally posted by Reddog GeForce 4 Ti4200 is only 200 bucks and will be available in April - best bang for the buck! It is already out... I went to Best Buy yesterday and saw it on the shelves for $179... needless to say, I was shocked. (It was an Xtasy version, for anyone interested.)
CaptainRAVE Posted February 10, 2002 I feel they're rushing things though. A new graphics card every year...
darthfergie Posted February 10, 2002 If it works well, then why not? They are rushing RAM, processors, speeds, games, software, and everything else dealing with computers, so why not rush these too? As long as they are a good step up, I am fine with it.
CaptainRAVE Posted February 10, 2002 As long as they don't do what 3Dfx did
Darth Evad Posted February 10, 2002 They should just make an AGP card that is upgradeable. A card where you could swap processors and swap RAM chips.
Darth_Lando Posted February 12, 2002 Originally posted by digl Hey Lando and how is that 8500 working for you? are you happy with it? I should decide if I get an 8500 or a Ti500 I am loving my R8500, especially with the new drivers ATI released the other week (I haven't messed with any of the "leaked" ATI drivers). Unfortunately, all web site benchmarks of the R8500 use either the original CD drivers or the November release, so it is next to impossible to see the speed the card is running at now. Obviously I prefer the R8500 over a GF3 Ti500 for several reasons: *it is a DirectX 8.1 card *1.4 pixel shaders *price/performance ratio (more bang for your buck; in some cases it surpasses the GF3 Ti500 altogether) *Carmack also stated in his Doom commentary: "The fragment level processing is clearly way better on the 8500 than on the Nvidia products, including the latest GF4. You have six individual textures, but you can access the textures twice, giving up to eleven possible texture accesses in a single pass, and the dependent texture operation is much more sensible. This wound up being a perfect fit for Doom, because the standard path could be implemented with six unique textures, but required one texture (a normalization cube map) to be accessed twice. The vast majority of Doom light / surface interaction rendering will be a single pass on the 8500, in contrast to two or three passes, depending on the number of color components in a light, for GF3/GF4." Now the GF4 spanks the R8500, no doubt, but between the R8500 and the GF3 family I prefer the R8500. It all really boils down to preference: both cards have such superb strong points and such minor weaknesses that it really wouldn't matter which one you pick.
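Carmack's point about pass counts is really just a ceiling division: if a lighting path needs more texture accesses than the hardware allows in one pass, the renderer has to draw the geometry again. A back-of-the-envelope sketch (the function name is mine, and the per-pass limits are the ones from his commentary; real pass counts also depend on the light's color components, as he notes):

```python
import math

def passes_needed(texture_accesses: int, accesses_per_pass: int) -> int:
    """Minimum number of rendering passes to perform the given texture accesses."""
    return math.ceil(texture_accesses / accesses_per_pass)

# Doom's standard path: six unique textures, one of them sampled twice.
doom_accesses = 6 + 1

# Radeon 8500: up to eleven accesses per pass, so the whole path fits in one.
print(passes_needed(doom_accesses, 11))  # 1
# GF3/GF4: four texture units per pass forces at least a second pass.
print(passes_needed(doom_accesses, 4))   # 2
```

Every extra pass means re-transforming and re-rasterizing the same geometry, which is why fitting the light/surface interaction in a single pass was such a good match for Doom.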
CaptainRAVE Posted February 12, 2002 GeForce4 Launched nVidia have decided to show the suitors to the graphics throne exactly who the boss is. For the first time since 3DFX was around, nVidia seems worried. That much can be assumed from the extensive attempt to reach all market layers. Gone are the days when a major chip manufacturer could hold off the release of the cheaper models in order to target the hardcore gamers and their big bucks. So the GeForce4 line-up will be: the GeForce4 Ti 4600 and 4400; the GeForce4 MX 460, 440 and 420; and the mobile GeForce4 440 Go and 420 Go. The MX's are already shipping in OEM systems, the mobile chips will be available within February, and the Ti's will hit stores in about 60 days. Now if you were an ATI exec and saw that coming at you, wouldn't you be running for cover? nVidia are certainly pulling out all the stops and have managed to secure support from all sectors of the industry. "Video games used to be a kind of step-child to film and live TV," said David DeMartini, Executive Producer for Golfing Simulations at Electronic Arts. "Today, by delivering high-resolution, high frame-rate, full scene antialiasing in a complete family of GPUs, nVidia has provided us with the tools to deliver breakthrough performance and image quality for the gaming industry. This helps Tiger Woods PGA Tour 2002 achieve a level of realism that rivals film and video and gives the player the sense they're in the game." More support is evident in the overwhelming number of PC makers that have added the GeForce4 to their line-ups, including Apple, Compaq, Gateway, HP, MicronPC and Toshiba. Board manufacturers include: ASUSTeK, eVGA.com, Gainward, Leadtek, MSI, PNY and Visiontek.
darthfergie Posted February 15, 2002 http://www.gamespy.com has benchmarked the new GeForce 4...check it out here http://www.gamespy.com/hardware/february02/geforce4/
This topic is now archived and is closed to further replies.