Tweak JK II to its MAX PERFORMANCE! L@@K This!


hvydarktrooper


800x600

16 bit color (human eyes can't distinguish much more than 16,500 colors)

(16 bit is about 16,000 colors and 32 bit is overkill:rolleyes: , go figure)

dynamic lights

trilinear blah blah blah... you get the picture

I get ~40 fps

(my Game Boy Advance can do that with Doom:D :rolleyes: )

 

comp =

Dell PIII L series (L550r @ 550 megahertz):cool:

256 megabytes RAM:cool:

GeForce2 MX200 "Xtasy" 32 megabyte (PCI):cool:

Altec Lansing speakers with a big box (I don't really care about sound:rolleyes: )

Windows 98:D

 

Can I get higher FPS?


Yeah, it's better to set maxfps to a value slightly higher than your average fps. Set it to a large number and it'll take harder hits when there's a lot of action; set it to something more conservative and it'll be a lot more consistent. In some situations it can even help program stability, but not all.

 

-Kaotic


com_maxfps is used to cap your framerate so that your cpu is freed up to do more important work. Setting this to anything higher than 80 is totally pointless as the human eye perceives true motion at only 60 fps. People setting their com_maxfps to 1000 are just going to force their processor to do a lot of extra number crunching that doesn't have any perceptible difference on-screen. If you're getting over 80fps consistently, then up your detail levels until your framerate just barely dips below 80 during heavy fight sequences. That way, you're getting the best of both worlds.

 

I think people would benefit more from setting their displayRefresh property to the maximum value for their monitor in that resolution and color depth.
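As a sketch, both of these settings are ordinary engine cvars, so they could be set in a config file like the one below (cvar names as in the Quake 3 engine JK2 is built on; the specific values are examples, not recommendations):

```
// example autoexec.cfg fragment -- values are illustrative only
seta com_maxFPS "80"          // cap the framerate to free up the CPU
seta r_displayRefresh "85"    // request an 85 Hz refresh rate for this video mode
```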


ArtifeX is exactly right here.

There is no point in setting the maxfps value to over 100.

The human eye cannot perceive framerates higher than 27 frames per second, so it makes no difference whether it is 35 fps or 150 fps. The CPU will just have to jump from the 120 fps range down to maybe 30 fps in heavy fighting, and that will make your performance worse.

This is just something people do because it maybe feels good to get out an extra 5-10 fps, but your eye won't catch it anyway.


Originally posted by CreeP_303

is there any way to tweak a really HORRIBLE system?

 

specs:

Celeron 375A

320 megs of RAM

ASUS Riva TNT2 M64

OH MY GOD THAT IS THE WORST COMPUTER IN THE WORLD!!! HOW THE HELL DID YOU EVEN INSTALL JK2???


About the color thing... there is a difference. On computers you can see it in lights, effects, and textures with "banding" in them. There is a difference; if you really take the time to examine your game, you will see it.

When I had a P3 450 MHz and was playing Max Payne I THOUGHT there was no difference between 32 bit and 16 bit, until I noticed heavy banding in the subway lights and fog areas. When I got a faster machine Max Payne ran and looked so much better.

I think people should put more effort into either playing the game at high resolutions with 32 bit color, or at 800x600 or 1024x768 with anti-aliasing... it looks sharp!


AnabolicJedi:

I'm not too sure whether the human eye can regularly catch anything more than 27 frames per second, but in computer games you sure as hell notice a HUGE (and I mean HUGE) difference between 27 fps and 85.

Mobius47:

I don't know where you got that info, but it's all wrong.

The human eye can see up to something over 5 million (!) colors. 16 bit depth is a bit over 65,000 colors and 32 bit is 16.7 million* colors, so you're a lot better off using 32 bit. It's not such a huge performance hit, though of course it depends on your hardware.

*The reason this number is so much higher than what the eye can see is that it's easier for the computer to calculate it (or something).
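The arithmetic above checks out; here's a quick sketch (note that "32-bit" display modes actually carry 24 bits of color, 8 per RGB channel, with the remaining 8 bits used for alpha or padding, which is why they still give 16.7 million colors):

```python
# Color counts for common framebuffer depths.
# "32-bit" modes use 24 bits of actual color (8 per RGB channel);
# the extra 8 bits are alpha or padding, not more colors.
colors_16bit = 2 ** 16   # 65,536 colors (typically packed as R5 G6 B5)
colors_24bit = 2 ** 24   # 16,777,216 colors -- the "16.7 million" above

print(f"16-bit: {colors_16bit:,} colors")
print(f"32-bit: {colors_24bit:,} colors")
```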


OK, I've been trying to practice for that Gateway tournament next month. When I load a game with their predetermined settings, my FPS drops from 85 fps to less than 30 fps. I assume it's because there are 12 other bots in the game, but damn, that much? I have my video set to 800x600 32 bit, bilinear, basically all my settings are middle of the road... Anyone have any suggestions?

I'm running a GeForce2 MX 400


Originally posted by noaxark

[...] The human eye can see up to something over 5 million (!) colors. 16 bit depth is a bit over 65,000 colors and 32 bit is 16.7 million colors, so you're a lot better off using 32 bit. [...]

 

In regard to fps: in psychological testing the human eye can detect differences in frame rate up to around 80 fps, though at a stable 27 fps a difference of 5 fps either way is not too noticeable. The difference between 27 and 85 fps is noticeable, mostly because a change in speed is registered very quickly by the human eye when the rate drops (not so much when it increases). As to the colour question: yes, the human eye can distinguish a spectrum of over 5 million colours.


In regards to framerate-

 

The reason that computer games need higher frame rates than television (30 FPS) is that cameras blur the action, so your brain 'fills in the blanks' more easily.

 

Video games create perfectly crisp images so any skipping of movement between frames is much more noticeable.

 

Ever notice that Saving Private Ryan effect that was popular for a couple of years in action movies? They used it in the action sequences in Gladiator too. They shoot that with a high-speed shutter, so each frame is a crisp shot with little motion blur. That is what gives those action scenes a slight strobe effect. You really notice 30 FPS with a fast shutter speed.

 

When video cards can do motion blur reliably the frame rates can drop considerably and still look good.

 

I have a P3 750, 512 PC133, GF3 TI200 and the game runs great at high detail. The best of any new game I've played. It does chug on big battles but duels run flawlessly.


guys..

the reason you need a fast framerate is because of something called "persistence of vision"

I'm surprised this hasn't been mentioned.

(I hear a lot of talk about the performance limitations inherent to the human eye)

((but... consider for a moment... the performance limitations inherent to your display hardware))

your eyes don't respond nearly as quickly to stimulation as the phosphor coating on the inside of a CRT responds

as a result:

1) your retina will "hold" an image a lot longer than the phosphor coating on the inside of a CRT

(therefore: the retina will retain 'afterimages')

((which is why you often see 'spots' before your eyes immediately after the policeman shines that damn flashlight in your face))

2) it takes a bit longer for the eye to register what it is seeing

((which is why the eye actually needs to be exposed to the image for a longer period of time than the phosphor))

((which is also why people ~think~ that "the human eye can't see more than 27fps"))

consider this:

the CRT in your television refreshes the picture at a rate of 60 Hz (right?)

but the television signal is actually filmed/broadcast at a framerate of only 30 fps

(hmm)

this means that every second "image" being displayed on your TV is identical.

(why?)

because an individual frame on the CRT would always fade too quickly from the phosphor, before your eye could fully register what's being displayed.

therefore each image needs to be displayed twice

(ie: the phosphor coating on the inside of the CRT cannot "hold" the image for quite as long as the human retina requires in order to fully register the image)

((otherwise, the individual frames would fade too quickly for you to see them; your eye would only register a transparent 'ghost' image, with blurry motion))

ALSO consider:

the CRT in your computer monitor has a phosphor coating which reacts MUCH MORE QUICKLY to electrical stimulation than the phosphor coating in a common television

this means that the image burned into the phosphor will fade even MORE QUICKLY on a computer screen than it would on your TV set

(in fact, studies have shown that electrical stimulation at 60 Hz is NOT ENOUGH to create a crisp lifelike image in VGA)

((and that's why 60 Hz flickers, it looks 'transparent', and it gives you a headache))

((you need to have a refresh rate of at least 85 Hz (100 is nice) in order to see a crisp lifelike image of your Windows desktop))

conclusion: what's the perfect framerate?

(it depends on your monitor; not your "human eye")

the perfect framerate for 3D games is an integer factor of the monitor's refresh rate

1/2 of the refresh rate is adequate, as is the case with common "television"

((but 1/1 of your refresh rate is preferred))

so then: if your monitor is set to a refresh rate of 100 Hz, then your Human Eye will be "adequately" stimulated by a framerate of 50 fps

((but a framerate of 100 fps is preferred))

anything less is NOT ENOUGH to create the illusion of actual motion

(given the performance limitations inherent to your monitor's CRT)

(regardless of the performance limitations inherent to your "human eye")
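The "integer factor" rule above can be sketched as follows (the helper name is made up for illustration; the idea is just that every game frame should sit on-screen for a whole number of monitor refreshes):

```python
def even_framerates(refresh_hz):
    """Framerates that divide the refresh rate evenly, so each game
    frame is displayed for the same whole number of refreshes."""
    return [refresh_hz // n for n in range(1, refresh_hz + 1)
            if refresh_hz % n == 0]

# A 100 Hz monitor: 100 fps shows each frame once, 50 fps twice, etc.
print(even_framerates(100))   # [100, 50, 25, 20, 10, 5, 4, 2, 1]

# TV's 30 fps on a 60 Hz screen is the "each image displayed twice" case.
print(30 in even_framerates(60))   # True
```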


Originally posted by Cobalt60

the reason you need a fast framerate is because of something called "persistence of vision" [...] the perfect framerate for 3D games is an integer factor of the monitor's refresh rate [...] if your monitor is set to a refresh rate of 100 Hz, then your "Human Eye" will be adequately stimulated by a framerate of 50 fps

 

Good explanation Jedi......Yoda would be proud


-cheers-

and btw, Chastan is correct about MP play.

here's a quote from the man (in this case Kenn Hoekstra):

http://www.webdog.org/plans/173/

"Thursday, April 25th, 2002 - For those of you having some ping/performance issues in Jedi multiplayer, I can offer the following advice...

- If you have a really fast machine, you can cap your frames per second (defaults to 85) using the com_maxFPS command. I'd recommend capping it at 50 or 60."

 

 

 

((personally I'd recommend capping at 50 fps for MP if your monitor is refreshing at 100 Hz; see explanation above))


I don't know; even above any supposed "limitations of FPS" that the eye can see, I do notice a difference in FPS. It just looks a lot smoother. I don't really think there's much need to go higher than 100 FPS though, it looks pretty good there. I think it really has more to do with the timing of when the frame updates and when your brain catches it. I don't really know, I'm no expert :D

As for refresh rates and such, I always notice when people have their monitor set at 60 Hz... it really bugs me. It's the first thing I notice when looking at a monitor... I always go "fix your refresh rate" but it seems like other people can't even tell... weird..


Archived

This topic is now archived and is closed to further replies.
