
ATI Radeon 9700 & 9000


ZeroXcape

Recommended Posts

Thoughts?

 

AnandTech 9700

 

AnandTech 9000

 

GameSpy Interview

 

In Q3A at 1600x1200, 37% better performance than a GeForce4 Ti 4600.

In Unreal Tourney 2003 #1 at 1600x1200, 54% better performance than a GeForce4 Ti 4600.

In Unreal Tourney 2003 #2 at 1600x1200, 27% better performance than a GeForce4 Ti 4600.

 

Those are some pretty sweet improvements to me ;)

Link to comment
Share on other sites

Quake 3 engine games:

 

640x480x16 FOV: 120

 

/cg_fov 120

 

16 bit color in Q3 engines with Nvidia cards = very high FPS with almost no noticeable visual changes.

 

Other games (depending on the engine):

 

640x480x32 FOV: 120

 

/whatever command changes it to 120

 

Notice - for both of the above, keep the textures at 32 bit; it's only the color depth you lower.
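For Q3-engine games, the settings above can be collected into a config that runs at startup. A minimal sketch of an autoexec.cfg - `r_mode 3` is the standard Quake 3 cvar value for 640x480, but cvar names vary between Q3-engine titles, so treat this as illustrative:

```
// autoexec.cfg -- illustrative; cvar names vary between Q3-engine games
seta r_mode 3            // 640x480 (Quake 3's resolution mode table)
seta r_colorbits 16      // 16 bit framebuffer color for speed
seta r_texturebits 32    // keep textures at 32 bit, per the note above
seta cg_fov 120          // wider field of view
```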

 

I own a P4 2.4, 1 gig of DDR, a GF4 Ti 4600, and an SB Audigy, with 1.5 Mbps DSL.

 

I run my games with those settings.

 

I see more than most people do at 'higher resolutions' (thanks to FOV 120), and I always get extremely high FPS - which means fast gameplay and super accurate aim.

 

I can set all other settings to max with no slow-down.

 

FOV = Field Of View - the default is 90 or less in most games.

 

Do this and you won't worry about upgrading too much.

Link to comment
Share on other sites

Newer video cards' GPUs don't support 640 x 480. At that resolution you're running the game solely off your CPU. That's why CPU benchmarks are done at 640 x 480 or less. If you're not going to play at 800 x 600 minimum, a GF4 is a moot point anyway.

Link to comment
Share on other sites

That is incorrect.

 

What they are telling you is that it isn't using 'a lot' of the GPU for processing (it's still being used, just not taxed as heavily as it would be at higher resolutions), but the polygons and special pixel effects are still being generated by the video card - for instance, a game with pixel shaders will NOT run at 640x480 without a GF3 or higher. Why? Because it still needs the video card for the 'special features'.

 

640x480 is used for CPU tests because you aren't 'taxing the GPU', so you get a close approximation of what is going on with the 'CPU'.

 

Otherwise anyone could use just about any 3D card at 640x480 - which isn't possible.

 

How else does it get to your screen?

 

FYI - Carmack mentions running Doom 3 at 640x480 on some cards. The X-Box runs at a resolution lower than 640x480 and uses a GF 4 GPU.

 

Also, notice why they use 640x480 here:

 

"To round things off we'll conclude with a bit of gaming performance using Quake III Arena, Return to Castle Wolfenstein and Serious Sam. All of the games were run at 640 x 480 to prevent being limited by the graphics cards although with a GeForce3 Ti 500 (and especially its successor) these games will be almost as CPU dependent at 1024 x 768 as they are at 640 x 480."

 

http://www.anandtech.com/showdoc.html?i=1574&p=12

 

Prevent being limited by the graphics card ... that means they are trying not to tax it or rely on its performance as much.
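The benchmarking logic being quoted can be sketched as a toy model: each frame has a roughly fixed CPU cost and a GPU cost that scales with the number of pixels drawn, and the frame takes as long as the slower of the two. All numbers below are invented for illustration, not measurements of any real card:

```python
def frame_ms(cpu_ms, gpu_ns_per_pixel, width, height):
    """Toy model of per-frame time: a fixed CPU cost (game logic,
    driver work) plus a GPU cost that scales with pixels drawn; the
    frame takes as long as the slower of the two."""
    gpu_ms = gpu_ns_per_pixel * width * height / 1e6
    return max(cpu_ms, gpu_ms)

CPU_MS = 5.0         # assumed CPU cost per frame
NS_PER_PIXEL = 10.0  # assumed GPU fill cost per pixel

low = frame_ms(CPU_MS, NS_PER_PIXEL, 640, 480)     # 5.0 ms: CPU-bound
high = frame_ms(CPU_MS, NS_PER_PIXEL, 1600, 1200)  # 19.2 ms: GPU-bound
print(f"640x480:   {1000 / low:.0f} fps (CPU-limited -> measures the CPU)")
print(f"1600x1200: {1000 / high:.0f} fps (GPU-limited -> measures the card)")
```

At 640x480 the GPU finishes its share faster than the CPU, so the score tracks the CPU; at 1600x1200 the pixel cost dominates and the card sets the framerate.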

 

More here:

 

http://www.lucasforums.com/showthread.php?s=&threadid=70114

Link to comment
Share on other sites

Okay, I guess you're right then. I just can't see the logic in running all the "special options" of a high-dollar vid card at 640 x 480, putting the major load on the CPU and not effectively using your vid card's memory and full capabilities. I know for a FACT the game runs best on my machine (1.4 GHz CPU and GF4 Ti 4200) at 1024 x 768. That's where I get the best video quality and frame rate.

 

But to each their own....

 

 

Kaan

Link to comment
Share on other sites

Uh.. you still don't get it.

 

You are STILL using the full capabilities of the video card.

 

It IS NOT using more CPU at a lower resolution.

 

It IS NOT using more video card at a lower resolution.

 

If you use a lower resolution, the video card isn't being worked as hard, so the benchmark doesn't fluctuate with the card's own performance limits.

 

There is no way your video card is running better at a higher resolution - change the max FPS to something higher and resolution to 640x480 and you will see.

 

Try this:

 

/cg_drawfps 1

 

/com_maxfps 120

 

Now try 640x480.

 

Wow, higher FPS!

 

Try this:

 

/cg_fov 120

 

Wow, you see more than you did at a higher resolution!

 

Do you understand now? :rolleyes:

Link to comment
Share on other sites

Originally posted by ZeroXcape

Thoughts?

In Unreal Tourney 2003 #1 at 1600x1200, 54% better performance than a GeForce4 Ti 4600.

In Unreal Tourney 2003 #2 at 1600x1200, 27% better performance than a GeForce4 Ti 4600.

 

Those are some pretty sweet improvements to me ;)

 

Has anyone seen the box of the 3D Prophet by Hercules?

It says that it beats a GF2 MX by 71% in Q3 at 1024x768.

So my cousin decided to buy this, because I upgrade his PC and he needed a cheap video card. So we bought it... and it's crap. It lagged like crazy in JO, and EVEN CRAZIER in WC3. Then we went back, paid 10 bucks more for a GF2 MX 200, and it runs almost as good as my GeForce 3... at 800x600.

I don't know about that, guys, but those claims about cards "IMPROVING" by huge percentages in games are a bunch of BS, I think. Either they ran the game with their card on a much better PC, or it's just BS...

Link to comment
Share on other sites

Originally posted by Absurd

Uh.. you still don't get it.

 

You are STILL using the full capabilities of the video card.

 

It IS NOT using more CPU at a lower resolution.

 

It IS NOT using more video card at a lower resolution.

 

If you use a lower resolution, the video card isn't being worked as hard, so the benchmark doesn't fluctuate with the card's own performance limits.

 

There is no way your video card is running better at a higher resolution - change the max FPS to something higher and resolution to 640x480 and you will see.

 

Try this:

 

/cg_drawfps 1

 

/com_maxfps 120

 

Now try 640x480.

 

Wow, higher FPS!

 

Try this:

 

/cg_fov 120

 

Wow, you see more than you did at a higher resolution!

 

Do you understand now? :rolleyes:

 

No, YOU don't get it.

 

Being a veteran of FLIGHT SIMS, which are much more taxing than FPS games, I am fully aware of what my computer does at what settings.

 

I don't get any performance increase at 640 x 480 and games look horrible in that resolution. I bought a GF4 for a reason....to use it to its capabilities.

 

As far as using /cg_fov, all that does is increase the field of view by moving the camera farther back in the third-person view of your player model, allowing you to see more - but that works at any resolution.

 

*shrugs*

 

I'll stick with 1024 x 768, everything maxed and 75 fps.

Link to comment
Share on other sites

FOV stands for field of view.

 

The default FOV in most FPS games is around 70-90, based on the camera's view from the character. Raising the FOV distorts the view into a fishbowl effect, as more of the scene is put onscreen. This is how the effect was achieved for the Alien's view in Aliens vs. Predator. Personally, I find it annoying as hell because of the distortion.
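The fishbowl effect has simple geometry behind it: under a standard perspective projection, widening the FOV shrinks the share of the screen a centered target covers. A rough sketch (illustrative math; real engines differ in detail):

```python
import math

def center_size_fraction(target_deg, fov_deg):
    """Fraction of horizontal screen width that a target of the given
    angular size covers at screen center, under a plain perspective
    projection (illustrative model, not engine-exact)."""
    half_target = math.radians(target_deg / 2)
    half_fov = math.radians(fov_deg / 2)
    return math.tan(half_target) / math.tan(half_fov)

# Going from FOV 90 to 120 shrinks a centered target by roughly 40%:
for fov in (90, 110, 120):
    pct = center_size_fraction(5, fov)
    print(f"FOV {fov:3d}: a 5-degree target spans {pct:.1%} of screen width")
```

So you see more of the scene, but every individual target gets smaller on screen - the trade-off both sides of this argument are describing.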

Link to comment
Share on other sites

Originally posted by Darth Kaan

 

No, YOU don't get it.

 

Being a veteran of FLIGHT SIMS, which are much more taxing than FPS games, I am fully aware of what my computer does at what settings.

 

I don't get any performance increase at 640 x 480 and games look horrible in that resolution. I bought a GF4 for a reason....to use it to its capabilities.

 

As far as using /cg_fov, all that does is increase the field of view by moving the camera farther back in the third-person view of your player model, allowing you to see more - but that works at any resolution.

 

*shrugs*

 

I'll stick with 1024 x 768, everything maxed and 75 fps.

 

Field of view gives you a much wider visual range than any resolution can.

 

And if you don't get any performance increase at 640x480, then somewhere in your setup your CPU is being bottlenecked.

Link to comment
Share on other sites

Originally posted by HellFyre69

 

Has anyone seen the box of the 3D Prophet by Hercules?

It says that it beats a GF2 MX by 71% in Q3 at 1024x768.

So my cousin decided to buy this, because I upgrade his PC and he needed a cheap video card. So we bought it... and it's crap. It lagged like crazy in JO, and EVEN CRAZIER in WC3. Then we went back, paid 10 bucks more for a GF2 MX 200, and it runs almost as good as my GeForce 3... at 800x600.

I don't know about that, guys, but those claims about cards "IMPROVING" by huge percentages in games are a bunch of BS, I think. Either they ran the game with their card on a much better PC, or it's just BS...

 

Do you mean the Kyro 2?

 

If so, it uses tile-based rendering, which is useless for most Q3 engine games, where Carmack has perfected visible surface determination (VSD) in code.

Link to comment
Share on other sites

Absurd: the first rule when you talk about something is actually knowing what the hell you are talking about. Sorry to sound so blunt, but you show that your knowledge is quite limited, so try not to give advice to others - you are just giving uninformed, misleading advice.

 

I am not the most knowledgeable person around when it comes to HW, but almost anyone who reads reviews and likes being informed knows these basic things:

 

- 640x480 is a crappy resolution by today's standards. Period. The only reason you see it listed in benchmarks is because games are not fillrate limited at that res, so you can compare different CPUs and see how they scale. Even with a 15" monitor, 800x600 is the absolute minimum I'd use to play a game, given the HUGE visual quality difference over 640x480. For a 17" monitor, 1024x768 is usually the best resolution, unless you have a kickass system that allows you to play at 1280x1024 with similar framerates.

 

- 16 bit color is NOT anywhere near the quality of 32 bit color. Period. You say that Nvidia cards render the same quality at 16 bit as at 32 bit? Nope. The only card that looks pretty good with 16 bit color is the Kyro II, which renders internally at true color and only converts to 16 bit when it writes the frame to the frame buffer - of course at the expense of not being any faster than 32 bit. Do you want to see the difference color depth makes when it comes to realistic 3D rendering? Take a look at this picture: it compares a 32 bit color render with a 128 bit color render.

 

http://www.tomshardware.com/graphic/02q3/020718/radeon9700-06.html#floating_point_precision_color

 

So, if 32 bit color is limited when it comes to realistic lighting effects, think of the shortcomings of 16 bit color.

 

- If you have a P4 2.4, 1 GB of RAM, and a GeForce4 Ti 4600 to run games at 640x480 in 16 bit, then I am sorry to tell you that you have spent 10x more than you needed. A simple Duron with 64 MB and a GeForce2 MX would have been more than enough to give you 60+ fps. You don't believe me? Fine:

 

http://www.anandtech.com/showdoc.html?i=1608&p=13

 

A GeForce2 MX coupled with an almighty Duron :p delivers 68 fps at 1024x768 (32 bit, high quality settings).

 

With a system like yours, you could have it running at 1600x1200 (32 bit) with HQ settings, a smooth framerate, etc.

 

- FOV: if you change the normal setting to a higher value, you are just making it look like you are seeing things through a fisheye lens. You see more, but in a very unnatural way, with figures shrinking very rapidly as they move away from you. I don't know about you, but I prefer a smaller field of view rendered in a way that resembles the human eye, not a fish's.

 

 

And about the Kyro II: you are dead wrong that TBR is useless with Quake 3 engines. Good software VSD techniques (z-buffering, BSP trees...) help immediate-mode renderers (all but PowerVR cards) render a scene, but they don't negate the benefits of TBR at all. So, by your reasoning, ATI's occlusion culling technology (HyperZ) and Nvidia's Lightspeed Memory Architecture (techniques borrowed from the deferred rendering approach) are useless? Yeah, right.

 

Anyway, I get a stable 85 fps with a Kyro II at 1024x768 with all settings at maximum in most games using the Q3 engine. IF the 3D Prophet that HellFyre was talking about was a 3D Prophet 4500 (all Hercules video cards are named 3D Prophet plus the chip model), then he probably hadn't done enough research before buying the card, and didn't have a powerful enough CPU to use it. Kyro II cards don't have a T&L unit, so the CPU performs all vertex operations prior to rendering the pixels, and thus you need a powerful CPU to take advantage of the card. A Duron 800 is about the minimum recommended for it. With an Athlon 1200 (266 FSB) you take full advantage of it, and have a huge edge over any GeForce2 MX. Still, I wouldn't recommend either card as a new buy even for an entry-level system, simply because there are much more powerful cards on the market for a few more bucks (the GeForce4 MX comes to mind, the upcoming Radeon 9000, etc.).

 

So, to recap: don't open your mouth if you don't know what you are talking about. Giving misleading advice is cheap for you, but it can be very costly for the people who follow it.

 

HellFyre (and anyone in a similar situation): just as you wouldn't buy a car without knowing its basic specs (fuel consumption, horsepower, etc.), you don't buy a new video card without knowing its minimum requirements and reading some reviews that give a good idea of what its performance is like.

Link to comment
Share on other sites

Why are you comparing a 32 bit image to a 128 bit one? Uhm, last time I checked, the max in most games is 32 bits...

 

You obviously haven't read much about the Quake 3 engine:

 

#1: 16 bit color rendering is MUCH faster, and the visual difference is barely noticeable in twitch-type games - you literally have to be standing still, staring at a corner (or at smoke going by), to notice:

 

16 bit color:

 

http://ucguides.savagehelp.com/Quake3/images/visual4_l.jpg

 

32 bit color:

 

http://ucguides.savagehelp.com/Quake3/images/visual3_l.jpg

 

Do you see a big difference?

 

Benchmark:

 

http://www.anandtech.com/showdoc.html?i=1435&p=17

 

#2: VSD makes enough of a difference that hardware occlusion culling doesn't have a large impact - especially with the Quake 3 engine, where raw speed is what counts:

 

http://www.bluesnews.com/abrash/chap64.shtml

 

Hardware occlusion culling only really helps when software is badly programmed and needs it.

 

And I'm sorry, but if you have that setup and are playing JK2 online, you are not getting the consistently high FPS you claim at max settings. It just won't happen, especially rendering outdoor scenes with about 16 players.

 

Anyhow, this is my last comment to you.

 

Instead of making a case you chose to flame - which ends the conversation after my first response.

Link to comment
Share on other sites

And I bet you haven't read/understood what I posted, let alone what YOU posted ^^ :cool:

 

You just selected a couple of pictures rendered with a Kyro II - the card I already posted about above ^^ It's probably one of the best-looking cards in 16 bit, because IT RENDERS EVERYTHING INTERNALLY IN 32 BIT. Just read the paragraph right above the pics:

 

"As the screen shots below show, full 32-bit still looks better since the 16-bit image is dithered down from the internal 32-bit. This is especially apparent where fine gradients in color appear on screen. Note that there is a significant reduction in dithering for the 16-bit image of the Kyro compared to most cards 16-bit rendering."

 

And I can corroborate that. A Kyro II works and looks almost the same in 16 or 32 bit color (almost no performance gain from using 16 bit, though it looks almost as good). When I compare on my old Nvidia card, it runs almost 40% faster in 16 bit but looks quite ****ty in comparison. But if you have a new GeForce4 Ti 4600 like you do, then you are wasting its huge horsepower by rendering in 16 bit color, not to mention setting the display to 640x480. Hey, if you do things like that, you probably drive your Ferrari to work in 1st gear - who cares if the car offers 5 more? ;)
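The quality gap being argued over comes down to per-channel precision: a 16 bit (5-6-5) framebuffer keeps only 32 levels of red and blue (64 of green), so smooth gradients band. A small sketch of the round-trip loss (illustrative model; real cards also dither, which is the reduction the quote below mentions):

```python
def quantize_channel(value, bits):
    """Round-trip an 8-bit channel value (0-255) through a channel with
    fewer bits, as happens when a scene rendered at 32 bit is written
    to a 16 bit (5-6-5) framebuffer. Illustrative; no dithering."""
    levels = (1 << bits) - 1
    return round(round(value / 255 * levels) / levels * 255)

# A smooth 256-step red gradient collapses to 32 distinct levels at 5 bits,
# which is what shows up as banding on gradients, smoke, and fog:
red_5bit = sorted({quantize_channel(v, 5) for v in range(256)})
print(len(red_5bit))  # -> 32
```

Whether those 32 levels are noticeable mid-firefight is exactly the point the two posters disagree on.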

 

As for the benchmarks, they say exactly what I told you. I cannot believe you didn't read/understand the comments on the page you just posted:

 

"As a result of the Kyro II's internal 32-bit rendering, we were not surprised to find that this card actually had the least performance increase when going from 32-bit to 16-bit. Doing this comparison, however, is a bit unfair, since the 16-bit quality of the Kyro II is noticeably better than the 16-bit quality of competing cards. It is unfortunate, however, that almost no speed can be gained by falling back on 16-bit color mode"

 

It may all come down to you actually thinking that an ungodly number like 280 on your framerate counter makes the game any cooler. I am sorry to tell you that you are not seeing the number of fps it says there: you are limited by the refresh rate of your monitor. Even in crappy 640x480, your monitor may top out at 120 Hz, so anything above that is just dreamland. Anyway, your eyes can't tell the difference between 60 fps and 120 fps (let alone 280, if you could EVER see a display with such an ungodly refresh rate). If watching that number makes you feel cool, cheers. The rest of us mortals will be quite content to see the frame counter never dip below 60 fps while enjoying maximum detail settings (yes, MAXIMUM ;) ) at 1024 or 1280 in 32 bit color, paying 1/4 of what you paid.
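The refresh-rate point fits in one line of code: however high the frame counter reads, the monitor shows at most its refresh rate in distinct frames each second. A toy sketch (simplified; without vsync, tearing shows slices of extra frames, but never more whole frames):

```python
def visible_fps(rendered_fps, refresh_hz):
    """Distinct frames per second the monitor can actually display.
    Frames rendered above the refresh rate are overwritten before
    they are fully shown (simplified model, ignoring tearing)."""
    return min(rendered_fps, refresh_hz)

# The counter may read 280, but on a 120 Hz monitor you see at most 120:
print(visible_fps(280, 120))  # -> 120
print(visible_fps(75, 120))   # -> 75
```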

 

Hardware occlusion culling only helps when software is badly programmed and needs it.

 

Hmmm, I may have missed that part, though I am quite certain I never saw it. Anyway, I'll just send a few quick e-mails to Nvidia and ATI so they can save a few million transistors in their new parts, since they are investing millions of dollars in R&D to perfect their hardware occlusion culling technologies. If you want to know just how 'useless' HW occlusion culling is, read this:

 

http://www.anandtech.com/video/showdoc.html?i=1656&p=13

 

------------------------------------------------

 

In this case, this is very appropriate:

 

"Better to remain silent and be thought of as an idiot...than to speak up and remove all doubt..."

Link to comment
Share on other sites

The pictures I posted were not from a Kyro 2.

 

High Peak FPS = overall consistent FPS under rigorous situations (16 players, outdoor areas) = less fluctuation = smoother gameplay.

 

16 bit color is an advantage on Nvidia-based cards for performance, and the visual quality difference is minimal - as can be seen from the benchmark and pictures linked (notice the links are to different sites).

Link to comment
Share on other sites

Originally posted by Chastan

I can maintain 115 fps with max settings at 1024x768 with my GeForce4. (And yes, that's a steady 115 fps.) No need to run 640x480 or 16-bit color. Oh yeah, I set my FOV to 100.

 

16-player MP, in outdoor areas?

Link to comment
Share on other sites

YAY! We're off topic!

 

Welp, I'm excited for ATI. They've finally put out a product that is a generation ahead of their competitors'. But it also comes down to drivers: Nvidia has perfected the art, ATI has not. A card is worthless, no matter the specs, if it has poor drivers. Only the retail version, and the few months after release, will tell.

 

As for AnandTech's benchmarks, it is incredible that the 9700 pulls that far ahead of the GF4. Nvidia has to be trembling in their boots. As for the Quake tests... bleh. Quake 3 is an old engine and really doesn't do anything to test a vid card except for AA. Now, if they benchmarked with Tribes 2, that would be something different...

 

All in all, ATI's outdone Nvidia, for now. If ATI can keep ahead, even by a little bit, then the title of King goes to them (not just because of one release of one card).

Link to comment
Share on other sites

ATI has already lost. Look at how many have already gone out and bought GF4s - everyone but the ATI fans, those who have a GF3, and the few who chose to hold off upgrading. Sure, ATI has a remarkable card, but its release is just badly timed: the majority of people *don't* need to upgrade right now, and $400 is a lot of money. Maybe if it had been released around the same time as the GF4 line...

 

And in case you're wondering, I have a 32 MB GeForce2 MX400, and I'm waiting to upgrade to the new nVidia card (or the .13 RV300 with DDR-II, which according to AnandTech should compete head-to-head with it).

Link to comment
Share on other sites

After upgrading to an 8500 from a GF2 GTS, I'll never buy Nvidia again - price being the issue. Detonators are no better than Catalysts anymore: too many leaked drivers, and newer doesn't always mean better. There are lots of posts on gaming forums where folks have problems and need to roll back their Detonators, just like with the Catalysts.

 

Besides, the early scores show the 9700 beating the 4600 so badly that drivers aren't an issue - there's brute force to spare here.

 

My 8500 posts nominally equivalent scores to a Ti 4200, at a cheaper price. ATI is number two, so they'll price competitively to gain market share, like AMD versus Intel, and like Nvidia did against Voodoo back in the day.

 

In the end, no matter what camp you're in, we all win. Hopefully Matrox will also make a run, and bring back true competition (and better pricing) in the market.

Link to comment
Share on other sites

Originally posted by Absurd

 

16-player MP, in outdoor areas?

 

Yeah, unless I'm hosting the game, then it's not quite as great :)

 

edit: I don't think there will ever be one on top; it will just go back and forth (as long as they both continue to be innovative companies)... ATI releases a killer... then NVIDIA releases a killer... etc.

Link to comment
Share on other sites

Well, now it's a tough decision... whether to buy a GeForce4 Ti 4600 or wait until the new ATI video card comes out... hmmm... Well, from what all you intelligent people have said, it's a tough call. However, to that one guy Absurd... what's wrong with you? You have a great comp and video card, and you play at the crappiest settings ever... Anyways... which card is worth the money?

Link to comment
Share on other sites

Archived

This topic is now archived and is closed to further replies.
