Lord_FinnSon Posted December 29, 2001
You mean with MAX quality settings? I guess you are right about that, but these games have so many options you can turn down if you don't get enough frames with your graphics card; you might also want to use (if you haven't already done so) applications like NVmax and/or PowerStrip to do some additional tweaking of your card's settings. Of course, I have to admit that you should always have the possibility of playing every game at its highest quality settings (without too many frame drops), the way it was meant to be played, but that would simply mean buying a new Nvidia/ATI graphics card every year. As a sidenote, I'm also going to upgrade my PC quite soon, because after almost three years my good old Riva TNT just can't do the job anymore; I was only able to extend its lifespan by playing with max performance settings, while dreaming about a newer, bigger card that could some day show me every possible little detail.
Sherack Nhar Posted December 29, 2001
Originally posted by Lord_FinnSon: Of course I have to admit that you should always have a possibility to play every game with their highest quality settings (without too many frame drops) like they were meant to be, but that would simply mean you have to buy a new Nvidia/Ati graphics card every year.
That's not entirely correct. John Carmack's engines have always been known to be incredibly demanding. If you have a graphics card that runs Carmack's engines at their highest settings (with a 60+ framerate), then you can run just about anything that's on the market right now. BTW, nice avatar
Nob Akimoto Posted December 29, 2001
Originally posted by Sherack Nhar: EDIT: If you guys want to get a GF3 Ti500 or a RADEON 8500, I sure hope you don't want to play DooM3 or Quake 4... those cards are hardly able to run the new DooM engine.
What do you base your claim about D3/Q4 on? The tech demos for Doom3 were run on a GeForce3; I would hardly say that's "hardly" running a game's engine, especially considering the unoptimised state such a demo would be in. While I don't question id Software's ability to create new game engines that continuously push the envelope in terms of performance stress, it should be pointed out that the main selling point of these games is exactly that: being a game. No developer, not even one with the clout of id and Carmack, would release something that would obsolete the mid-end (as of H2 2002), never mind the rest of the generations behind it. Considering that D3 itself is coded to take advantage of the NV20 architecture, I really can't see how it wouldn't run on said architecture at a playable framerate at a medium or higher resolution. The only real bottleneck is texture memory, and unless DooM3 uses ungodly sized textures per scene and pushes more than 50,000 polys, I really can't see how it would slow the card to a crawl. For all intents and purposes, while id engines do show off a great deal of flash, they're hardly the most demanding things on the market. If you really want demanding, check out professional 3D apps; now THOSE are demanding.
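The texture-memory bottleneck mentioned above can be ballparked with simple arithmetic. Here is a minimal Python sketch; the texture count and sizes are made-up illustrative assumptions, not actual DooM3 figures:

```python
# Back-of-the-envelope texture memory budget. The numbers below are
# illustrative guesses -- actual DooM3 asset sizes were not public.

def texture_bytes(width: int, height: int, bytes_per_texel: int = 4,
                  mipmaps: bool = True) -> int:
    """Memory for one texture; a full mip chain adds roughly 1/3 extra."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# Say a scene keeps fifty 512x512 32-bit textures resident at once:
scene = 50 * texture_bytes(512, 512)
print(f"{scene / 2**20:.0f} MB")  # ~67 MB -- already most of a 64MB card
```

Under those assumptions a single scene's working set would overflow a 64MB card, which is why texture compression and residency management matter more than raw polygon counts.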
Sherack Nhar Posted December 29, 2001
Sorry for being so imprecise; here is the full story. Take this link and go see "The Carmack - Part 2". Here are various quotes by The Man himself:
"The low end of our supported platforms will be a GF1 / 64 bit GF2Go / Radeon, and it is expected to chug a bit there, even with everything cut down."
This one shows that Carmack is clearly not aiming at introducing the new DooM to the mainstream crowd. However, keep in mind that DooM3 is still a long way from being released. Even so, it shows that mass-market accessibility is just not a concern to good ol' John.
"We are aiming to have a GF3 run Doom with all features enabled at 30 fps. We expect the high end cards at the time of release to run it at 60+ fps with improved quality."
The "high end" cards he's referring to are probably going to be released in spring. The reason the new DooM engine is so demanding is that the lighting model Carmack has created is just so darned complex that it brings "older" videocards like the GF2 to their knees. You should check out that Carmack interview on Gamespot under their DooM3 coverage. I hope I've been clear enough this time
ed_silvergun Posted December 30, 2001
Bear in mind that the DooM 3 demos that were exhibited were probably running on high-end PCs which might well not be available to the mainstream market yet. The GeForce3 will not run the new engine from id very well at all. Yes, you'll get a playable framerate with some graphics cut down, but if you want to play it as it was intended you will need a GeForce4 or equivalent. Remember that the new DooM game is not going to be out for about another year, maybe more, and that the GeForce3 will be nearly two-year-old technology by then. That's pretty old by computer standards! Yes, AGP 8x will run on a 4x bus; you just won't get the extra speed benefits.
Darth_Lando Posted December 31, 2001
LOL. Even though DOOM 3 will only play at 60fps on the fastest cards on the market when it does come out, everyone will be saying to wait another 6 months because the next generation of 3D cards will be out and will be able to play the game at 100 fps. Then when those come out, people will say "wait 6 more months, then you can play it at 130 fps on the next generation of cards..." etc. :D
Vagabond Posted December 31, 2001
Well, regardless of what card will run Doom 3 acceptably, I doubt I'll be buying any "high-end" video cards now or in the future. Spending over $200 on a video card, while easily within my budget, just doesn't offer a significant ROI from my point of view. For example, I recently purchased a VisionTek Xtasy GeForce 3 Ti-200 for the computer I'm building, for only $199. That card offers the functionality of the higher-end GeForce 3 cards and excellent speed for a relatively reasonable price. Sure, I could have spent an extra $150 on the Ti-500, but for what? An extra 20 frames per second? That's just not worth it to me. If Doom 3 can't run on anything but a GeForce 3 and above, then it is likely that Mr. Carmack will be somewhat disappointed in his sales figures.
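The ROI argument above can be put in numbers. In this quick Python sketch the prices come from the post, but the framerates are hypothetical placeholders, not benchmarks:

```python
# Price/performance comparison. Prices ($199 Ti-200, ~$349 Ti-500) are from
# the post; the fps figures are assumed for illustration only.

def dollars_per_fps(price: float, fps: float) -> float:
    """Cost of each frame per second of performance."""
    return price / fps

ti200 = dollars_per_fps(199, 100)   # assume 100 fps in some benchmark
ti500 = dollars_per_fps(349, 120)   # same benchmark, ~20 fps faster

print(f"Ti-200: ${ti200:.2f}/fps, Ti-500: ${ti500:.2f}/fps")
# The marginal cost of just the extra 20 fps is far steeper:
print(f"marginal: ${(349 - 199) / (120 - 100):.2f}/fps")  # $7.50/fps
```

Under these assumed numbers, the extra frames of the Ti-500 cost several times as much per fps as the baseline card, which is the heart of the diminishing-returns point.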
psycoglass Posted December 31, 2001
What's the difference between an AGP 4x and an AGP Pro slot? I don't believe any of the GeForce 4 news until I see it on a news site or in Nvidia's press release.
The_Phantasm Posted January 2, 2002
"GeForce 4 info...........
GeForce4 Ti 1000. This is the fastest graphics card built on the GeForce4 chip, working at about a 300MHz frequency. The card will have an AGP 8x interface and 128MB of DDR SDRAM memory working at 700MHz.
GeForce4 Ti 500. This is a bit slower solution, with around a 275MHz chip frequency and a 600MHz memory frequency. Although it will have an AGP 4x interface, the card will still come with 128MB of graphics memory.
GeForce4 MX 460. This is the top representative of the GeForce4 MX family. It will probably have 4 rendering pipelines and a DirectX 8-compliant T&L unit. The amount of DDR graphics memory used (with a 128-bit bus) will be cut down to 64MB, and its working frequency will be reduced to 550MHz. The core will work at 300MHz.
GeForce4 MX 440. These cards will come with 64MB of DDR SDRAM on a 128-bit access bus. The memory working frequency will be 400MHz and the core frequency 275MHz.
GeForce4 MX 420. According to the available data, this GeForce4 MX version will be targeted at the low-end market, which is why the cards built on it will have 64MB of SDR SDRAM memory working at 166MHz. The core will work at 250MHz. Besides, it looks as if there were only two rendering pipelines in this modification."
Again, this is speculation; I've also seen this on the IGN forums.
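If we assume the rumored memory frequencies above are effective (post-DDR) data rates on a 128-bit bus, the peak memory bandwidth of each card falls out of simple arithmetic. A Python sketch of that calculation (the specs themselves are the thread's speculation, not confirmed figures):

```python
# Rough bandwidth math for the rumored GeForce4 line-up. Specs are the
# speculation quoted above; "working frequency" is taken as the effective
# data rate. Peak bandwidth = data rate * bus width in bytes.

RUMORED_CARDS = {
    # name: (effective memory clock in MHz, bus width in bits)
    "GeForce4 Ti 1000": (700, 128),
    "GeForce4 Ti 500":  (600, 128),
    "GeForce4 MX 460":  (550, 128),
    "GeForce4 MX 440":  (400, 128),
    "GeForce4 MX 420":  (166, 128),  # SDR, so 166MHz is already the data rate
}

def peak_bandwidth_gb_s(mem_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return mem_mhz * 1e6 * (bus_bits / 8) / 1e9

for name, (mhz, bits) in RUMORED_CARDS.items():
    print(f"{name}: {peak_bandwidth_gb_s(mhz, bits):.1f} GB/s")
```

By this math the rumored Ti 1000 would peak around 11.2 GB/s while the MX 420's SDR memory would manage only about 2.7 GB/s, which would explain the wide price spread across the family.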
Nob Akimoto Posted January 2, 2002
The originator of this speculation is Xbit hardware (at least to my knowledge). It'd be nice if people would actually post the SOURCE of news clips from here on out. It's the very least one can do. It's just a matter of decency.
digl Posted January 2, 2002
I also read it at Xbit hardware long ago; they possibly originated this. It's a rumor; that's why there is no source. If the source was someone working at Nvidia, he would be fired at once if his name were published all around the web with the specs of the unannounced cards
StephenG Posted January 2, 2002
Originally posted by Vagabond: If Doom 3 can't run on anything but a GeForce 3 and above, then it is likely that Mr. Carmack will be somewhat disappointed in his sales figures.
When is Doom3 coming out? By the time D3 comes, the GF4 will be out, or maybe the GF5????? Who knows. The point I'm trying to make is that when Doom3 comes, just about every PC gamer will have a GF3. Who still uses a TNT1, or the ones before it?
digl Posted January 2, 2002
Originally posted by StephenG: the point i'm trying to make is when Doom3 comes just about every PC gamer will have a gf3
I'm not sure of that, but things move so fast that it's possible
Nob Akimoto Posted January 2, 2002
Originally posted by StephenG: when is Doom3 coming out? by the time D3 comes the gf4 will be out or maybe the gf5????? who knows. the point i'm trying to make is when Doom3 comes just about every PC gamer will have a gf3. who still uses TNT1 or the ones b4 it?
I see you're not very well versed in the PC industry, then. While the product cycle of a particular graphics manufacturer is 6-10 months (horrible, horrible cycle, btw...), the turnaround time for the AVERAGE PC user is on the order of 3-5 years. Quake3, for all intents and purposes, sold well (considering it had much more reasonable minimum specs than its next available successor, D3). While I don't know anyone using the Riva128 chipset (it was a crappy piece of **** card to begin with), I know quite a few casual gamers who still use TNTs. Right now I'd say a GF256 (yes, that old card) as a minimum would be the most reasonable for ANY game that wants to have mass market appeal. The truth of the matter is that Q3 and its brethren probably sold very FEW copies overall compared to titles such as The Sims or the Tycoon series, which do not require $400 ridiculously overpriced, overhyped pieces of silicon to function. Games sell very few copies when they set outrageous minimum requirements, at least during their most profitable period, which is why publishers will jack up the price to compensate. Sorry, I'm not paying $400 and then an additional $20 over the standard $40 to play some overhyped no-game-depth flash fest...
StephenG Posted January 2, 2002
Originally posted by Nob Akimoto: Sorry, I'm not paying $400 then an additional $20 over the standard $40 to play some overhyped no-game-depth flash fest...
I'm not saying you have to; just that when new stuff comes out, prices of the old stuff go down in time. Not even I'm going to put AU$900 into a vid card. I got my GF2 Ultra cheap, very cheap, about 3 months after the GF3 release. D3 is very far from release; I still reckon when it comes out most PC gamers will have something like a GF3. The 3D graphics card industry is moving very, very fast.
digl Posted January 4, 2002
http://www.msnbc.com/news/681639.asp
Those guys were the source of the GF4 info
Archived
This topic is now archived and is closed to further replies.