The AGEIA Extreme Physics mod pack?


  • replied
    Originally posted by madafaka View Post
It has already been proven that software physics, using a CPU core or a GPU core in SLI/Crossfire, is way more efficient than hardware physics such as Ageia's.
Havok, the most popular software physics engine, is far more widely used than Ageia's SDK: more than 200 games use software physics, compared to fewer than 30 games supporting Ageia.
Games such as Half-Life 2 use software physics. If you are familiar with Half-Life 2, you'll notice it has more advanced physics than any game supported by Ageia right now (except maybe CellFactor).
Furthermore, you don't even experience lag in a software-physics-based game such as Half-Life 2.

By the way, the Sony PS3 uses the PhysX SDK, so those Ageia maps should logically be compatible with the PS3 as far as physics goes.
You should also note that NO ONE is doing physics on the GPU, period. The capability is there, but no one has programmed for it. At least not yet.

On TOP of that, GPU physics is only EFFECT physics, meaning it can ONLY handle physics that is just for show and cannot affect gameplay (sure, you can open a hole in the wall, but the flying debris itself can't hurt you).

And how much more efficient can a CPU really be if (a) it has to share all its cores and processing time with other threads, and (b) it has nowhere near the parallelism of a PPU?
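To make that parallelism point concrete, here's a minimal sketch of the per-body update a physics engine runs every frame (made-up struct and numbers, not any real PhysX or Havok API). Every iteration is independent, so a massively parallel PPU can run hundreds of them at once, while a CPU core grinds through them a few at a time, and only when the scheduler isn't handing its time to other threads:

```cpp
#include <vector>

// Hypothetical rigid body: just enough state for one integration step.
struct Body {
    float px, py, pz;  // position
    float vx, vy, vz;  // velocity
};

// One explicit-Euler step over every body in the scene. No iteration
// reads another body's data, so all of them could run simultaneously
// on sufficiently parallel hardware.
void integrate(std::vector<Body>& bodies, float dt, float gravity) {
    for (Body& b : bodies) {
        b.vy -= gravity * dt;  // apply gravity to vertical velocity
        b.px += b.vx * dt;     // advance position
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
    }
}
```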

Unfortunately for the PPU, the main problem is that people only believe what they see (literally), so developers have to concentrate on heavy graphics. Doing physics for those same people means you have to SHOW a lot of physics (stuff flying around), and that means you need a BEEFIER GPU. Yet people assume the PPU is supposed to "alleviate" the GPU, which it isn't; it actually puts MORE stress on the GPU.

So why would you want to sacrifice any part of a GPU (even in SLI/Crossfire)?

As for "you don't even experience lag in a software-physics-based game such as Half-Life 2": OF COURSE NOT. DUH, those games are programmed to be PLAYABLE, which means there's only enough physics to be noticed, but not enough to really shine. HL2 does NOT support ANY kind of physics hardware (including your "efficient GPUs"), so why would they implement enough physics to bog down a system? The stock UT3 content is designed similarly, so it doesn't bog down a system without a PPU. Would you HONESTLY think they would do otherwise?

Think of a car. Why do we drive cars with less than 200 hp? Are they the only cars companies can build? Hell no; they could easily build cars with over 1000 hp (at least one or two companies can). So why don't we? A 1000 hp car needs a stronger body. It also needs to be heavier (so it doesn't fly off the ground), balanced so it doesn't skid, and it burns bucketloads more gas; and as a consumer, you're only willing to pay about $30k. Putting an engine that can dump 1000 hp into a body built for about 250 hp is like putting FULLY accurate physics (INTERACTING raindrops, wind, etc.) that REQUIRES the latest SUPERCOMPUTERS (Big Blue, Cray, etc.) into your desktop PC.

So they have to scale it down to work on desktop PCs. The goal of the PPU is to get you some more of it. Is the physics in HL2 enough for you? Apparently it is, but it's not enough for me.

I want the kind of physics where I go "hey, empty 50 gallon drum, let's kick it" and it moves a little, because even though it's empty, it's still heavy, and I'm only human. At best it topples over and maybe starts rolling. And when I kick a FULL 50 gallon drum, it moves only slightly or just makes a noise, and only really moves if it's hit by something with a bit more "bang", like a car (or a rocket, assuming it doesn't explode). And if I kick a cardboard box, it doesn't go flying 20 feet or explode; my foot just makes a nice little hole in it.
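That intuition is just the impulse-momentum relation: the same kick (impulse J) changes an object's velocity by delta-v = J / m, so mass alone decides how much it budges. A toy calculation with assumed masses (illustrative numbers, not from any game):

```cpp
#include <cstdio>

int main() {
    // Assumed values, for illustration only.
    const double kickImpulse = 60.0;  // a solid kick, N*s
    const double emptyDrumKg = 20.0;  // empty 50 gallon steel drum
    const double fullDrumKg = 200.0;  // same drum, full of liquid

    // Impulse-momentum theorem: delta-v = J / m.
    std::printf("empty drum: %.2f m/s\n", kickImpulse / emptyDrumKg); // ~3 m/s: topples, rolls
    std::printf("full drum:  %.2f m/s\n", kickImpulse / fullDrumKg);  // ~0.3 m/s: barely budges
    return 0;
}
```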

BTW, you should call it by its correct NAME, "physics", not "physx". PhysX is just software that tries to mimic physics; more specifically, PhysX is specific software, proprietary to Ageia.



  • replied
    Try playing CTF-Tornado on the PhysX server online as a preview

    Originally posted by D-Hunter View Post
    I want to try it, but that file is HULKING. I've got 5 minutes remaining on the download, and that's long for my connection speed.
Try playing CTF-Tornado on the PhysX server online as a preview. I have heard some say they can play it without the card, but the FPS is really bad. CTF-Lighthouse becomes unplayable shortly after things get going, as your FPS will drop to single digits without the card. Even with the card, your FPS will be lower on these maps than on non-PhysX-intensive maps. You really need a pretty high-end machine in addition to the card, as FPS will drop pretty low at times. I consider my box a pretty high-end older setup.

My very obsolete specs:
    Opty 180 @ 2.6GHz (socket 939)
    DFI Infinity NF4 SLI (nforce 6.85 driver)
    2GB DDR400 (value ram)
    2 X 7950GT 570MHz 512MB SLI (169.04 driver, AFR2)
    BFG Ageia PhysX (1.1.1.14 driver)(~40dBA - LOUD FAN)
    SB Live 24 bit (1.004.0055 driver)
    XP64 pro SP2 until Linux port arrives (AMD Dual-Core Optimizer)
1600x1200, 3-3 quality, 38-85 FPS (I set an 85 FPS cap to match the refresh rate)
1600x1200, 3-3 quality, 26-70 FPS on PhysX maps
    UT3 ver. 1.1



  • replied
I also read a rumour (so don't expect it to happen; it's still a rumour without any proof) that nVidia is considering getting into the desktop processor market on its own, definitely without Intel.

But even if that's true, they don't have the experience, and it would take them several years to advance to the level Intel and AMD are at now, and another few years to catch up; still, they would drive AMD straight out of business without any problems.

AMD were able to compete with Intel before they bought ATI. Now they've inherited ATI's problems, and ATI already had trouble keeping up with nVidia anyway. Buying them was really the wrong decision. They should have left ATI on its own and finished their own products, like the Phenom, without any errors; then maybe they could have shown Intel who's boss, and they would still have had the chance to buy ATI later, when they couldn't resist it.



  • replied
How cool would that be?! Nvidia are worth around $19 billion, which is about 1/10th of what Intel is worth.

AMD are in trouble: the whole company is now worth $400 million LESS than what it paid for ATI in July 2006. That makes Intel roughly 32 times the size of AMD (in monetary terms), and nVidia about 4 times AMD's size.

Another nail in the AMD coffin (much like the nail Warner Bros hammered into the HD DVD coffin). AMD failed.



  • replied
    Originally posted by hasol View Post
    That must not happen!!
Won't happen. NVIDIA is far too successful on its own to want to sell unless they were WAY overpaid. Their stock is doing extremely well and there's no reason to expect them to do any worse... The kind of money we're talking about is probably more than NVIDIA will ever be worth, particularly since Intel intends to compete more in the video card market shortly.



  • replied
    he he....



  • replied
    Originally posted by MTL View Post
I heard a rumor that Intel might buy out Nvidia... which would... you know... wipe AMD/ATI off the map
    That must not happen!!



  • replied
    Originally posted by Medu Salem View Post
Havok finally got bought by Intel,
Oh snap, really? Guess I'm behind on my news.
I heard a rumor that Intel might buy out Nvidia... which would... you know... wipe AMD/ATI off the map



  • replied
    Originally posted by hasol View Post
Interesting...
Could I use one of the cores in my CPU to run the physics with Havok, and the other core plus the graphics card to run the game itself? You can see my specs in my signature.

Would that improve my FPS?
I had this same idea. In the Windows Control Panel, the Ageia PhysX software component has NO configuration options unless you are using their hardware card. I tried to bog down the interactive demos, which are the only thing you can mess with in the Control Panel. The FPS never dropped at all, even after firing about 1000 bouncing balls into a cloth-sim environment.
At times I have set UT3's affinity in Task Manager to only two cores. It runs just as fast at max graphics (1920x1200) as on 4 cores. I had hoped the PhysX jive could be assigned to specific processors, but no.
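For the curious: what Task Manager does there boils down to a single Win32 call. A minimal sketch, assuming you look up the game's PID in Task Manager first; note it pins the whole process, and there's no way to pin just the physics threads from outside:

```cpp
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }
    DWORD pid = static_cast<DWORD>(std::strtoul(argv[1], nullptr, 10));

    // PROCESS_SET_INFORMATION is required to change another process's affinity.
    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION, FALSE, pid);
    if (proc == nullptr) {
        std::fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    // Bitmask of allowed logical CPUs: 0x3 = cores 0 and 1 only.
    if (!SetProcessAffinityMask(proc, 0x3)) {
        std::fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());
    }
    CloseHandle(proc);
    return 0;
}
```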

This is intentional on Ageia's part, no doubt. I had similar skepticism after seeing how well HL2:Ep2 ran during the final Strider battle, with thousands of physics objects simulated in a huge outdoor environment, plus Hunters, NPC morons, the awesome muscle car, Gravity Gun fighting... So much **** going on, all in software physics.

On my first UT3 install, the ragdolls and other physics looked terrible: quivering, shuddering, and often penetrating completely inside the level geometry.

I just did a fresh install on my seldom-used XP64 OS. After updating the XP64 drivers for my 8800 GTXs (though I use dual monitors, so no SLI) and installing the patch before I even played UT3, it seems a great deal of my lag and packet-loss issues have been solved... and the ragdoll motion looks, well, good now. No wild, bouncy, flinging, erratic shudders. The game seems to run smoother overall. I'm not sure if it's just cleaner because it was patched "virgin", as opposed to my original install, where I tweaked the hell out of the INIs, made maps, messed with the editor, crashed a lot, and still had the UT3 demo installed... But XP64 is also addressing all 4 GB of RAM in my system, so maybe UT3 is just flexing now that it has the room it wanted.

Either way, I'm happier. I'd been wondering if I'd ever find another use for XP64 after being turned off by its bad QuickTime implementation... (I loooooove QuickTime; it's the 'Deemer of movie players!)



  • replied
    Originally posted by madafaka View Post
It has already been proven that software physics, using a CPU core or a GPU core in SLI/Crossfire, is way more efficient than hardware physics such as Ageia's.
Havok, the most popular software physics engine, is far more widely used than Ageia's SDK: more than 200 games use software physics, compared to fewer than 30 games supporting Ageia.
Games such as Half-Life 2 use software physics. If you are familiar with Half-Life 2, you'll notice it has more advanced physics than any game supported by Ageia right now (except maybe CellFactor).
Furthermore, you don't even experience lag in a software-physics-based game such as Half-Life 2.
Interesting...
Could I use one of the cores in my CPU to run the physics with Havok, and the other core plus the graphics card to run the game itself? You can see my specs in my signature.

    Would that improve my FPS?



  • replied
Man, I loved CellFactor. If that had become a full game with good online play, my PhysX card would have been completely worthwhile.



  • replied
Havok finally got bought by Intel, so it will be worthless within the next few years; I think they bought it just to have one less competitor on the market. Havok had planned to extend their physics engine to run on GPUs, and then Intel wouldn't sell as many processors anymore, because any GPU could do physics better than a CPU, and Havok would most likely have partnered with nVidia or AMD's ATI rather than with Intel's out-of-the-running graphics solutions. Now Intel can force people to use either their processor architecture or their graphics solution, making their hardware more powerful at a price I wouldn't accept.

Since Ageia's engine is free, and it scales to every available configuration (single processor, multi-processor, and their hardware PhysX card), I would also choose them over Havok.
That map pack was never meant to run well on anything other than a system with the PhysX card, but that's stated on their page and in the ReadMe:

    Playing without a PhysX card: This mod-pack is available for free and can be installed and played without a PhysX card in the system. However, the minimum system requirements anticipate a PhysX Processor being present and it is likely that non-PhysX Processor systems will experience severe performance degradation at times of high physics load (action in the game). This degradation will not be present at all moments, but should be clearly evident during standard play.
Even though they wrote that, they don't force anyone to pay extra for the card. It's stated that they make it for gaming enthusiasts who don't mind paying an extra $90, since they usually have a rig worth about $2500-3000 anyway.

For example, CellFactor: somebody tell me how to make a game like that with Havok. It seems impossible. It's even stated right here that the CPU isn't that good at physics calculations. The CPU may be faster than hardware-accelerated physics when there are only about 25 boxes in the level, mostly never colliding with each other because they are so sparse; in that case software physics wins once you account for things like the CPU time spent managing the PhysX card versus the small number of calculations actually needed. That's the main reason Source's integrated Havok is fast enough without any physics hardware. But use some 5,000 objects, as in CellFactor, where they are needed for the key gameplay, and Half-Life 2 with the same features would look more like a slideshow than a game.
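The scaling claim is easy to sanity-check: a naive collision broad phase considers every pair of objects, and pairs grow quadratically with object count. A back-of-the-envelope sketch (the object counts are the ones from the paragraph above; the formula is just n(n-1)/2):

```cpp
#include <cstdio>

// Unordered pairs among n objects: n * (n - 1) / 2.
long long pairs(long long n) { return n * (n - 1) / 2; }

int main() {
    std::printf("25 boxes     -> %lld candidate pairs\n", pairs(25));   // 300
    std::printf("5000 objects -> %lld candidate pairs\n", pairs(5000)); // 12497500
    return 0;
}
```

Real engines prune most of those pairs with spatial partitioning, but the workload still explodes with object count, which is exactly where dedicated parallel hardware pays off.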

In my opinion, I don't even like how Valve implemented Havok in the Source engine anyway. Everybody thought "yeah, wow, nice features", but when I kicked boxes around, I didn't like that they flew 50 m or so (okay, that's a bit exaggerated, but everyone who played it knows what I mean), because the engine doesn't seem to know anything like inertia. Every game with Havok's engine felt as if an object's mass made no difference (well, I don't know what it's like in BioShock; maybe it's better there, but even though I own it, I haven't had time to play it). Kicking a small melon around feels the same as throwing a two-ton container. Hell, even Doom 3's physics engine had inertia, so it felt more realistic, and so did "The Chronicles of Riddick: Escape from Butcher Bay", which hit the stores a good six months before HL2. Why can't Half-Life 2 do it?
And Half-Life 2 was really nothing more than a physics demo anyway, and they just continue that line with Episode 1 and 2, and maybe even in 3, even though I think "well, we ALL KNOW about your physics engine by now; can you do something else too?"
It's getting boring to solve weird riddles that would never happen in real life, like stacking bricks on a seesaw to advance, where coincidentally the needed number of bricks is lying right next to it. Or like pushing a 169-litre barrel filled with O2 under water, which nobody, not even the mightiest guy, could possibly manage, because of the buoyant force. But well, if someone likes that kind of unrealistic physics-enabled game, those people should avoid Ageia and not buy a PhysX card. :>
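The barrel complaint checks out numerically, by the way. Archimedes' principle says a fully submerged body feels an upthrust of F = rho * V * g. A quick sketch using the 169 L figure from above (the empty barrel mass is an assumed round number):

```cpp
#include <cstdio>

int main() {
    const double rho = 1000.0;       // density of water, kg/m^3
    const double volume = 0.169;     // 169 litres, in m^3
    const double g = 9.81;           // gravitational acceleration, m/s^2
    const double barrelMass = 20.0;  // assumed mass of the empty barrel, kg

    const double buoyancy = rho * volume * g;  // ~1658 N of upthrust
    const double weight = barrelMass * g;      // ~196 N pulling down
    const double netUp = buoyancy - weight;    // ~1462 N you must overcome

    // ~1462 N is like holding down roughly 150 kg: no human manages that underwater.
    std::printf("net upward force: %.0f N (~%.0f kg-force)\n", netUp, netUp / g);
    return 0;
}
```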

And Crysis is a whole different discussion, given its minimum system specs; what you'd need for maximum settings they don't even know themselves. (The minimum specs aren't even printed on the game's packaging, nor in the booklet, which refers you back to the packaging. :> They're just nowhere; I only know them from the official statement.) And their engine isn't that good-looking either: too many polygons and shader effects make the game look like you're playing with plastic dolls. Even though it's DX10, and I've seen plenty of screenshots pushed to the maximum, I still think Gears of War and UT3 look much better relative to their system requirements. And as the Crysis community has noticed, there's mostly no difference between DX10 and DX9 at maximum anyway, if you tweak it right, except that it runs much better on DX9 than on DX10, which suggests they did something wrong with that game. :x

And these maps are fully PhysX-enabled for online play, just to state it. UT3 does that on DX9, while Crysis manages it only on DX10. As for "who can play these maps online anyway?": I'd be glad if we could fill even a 5-on-5 server with these maps, but thinking about Crysis, it's in pretty much the same boat.



  • replied
    Originally posted by madafaka View Post
I'm not sure if Ageia works online. Suppose you have the hardware while others don't.
I know that in HL2 and Crysis, in which the physics is software, your input to the environment will be visible to others.
Source physics is helluva laggy online, and Crysis doesn't allow physics in online play (except on "DX10 servers", for some ****ing weird reason).

Something like the PhysX PPU should be integrated into motherboards rather than sold as separate cards. It's a lot better at physics than our processors; a CPU really isn't a vector calculation machine. That's the reason we have dedicated GPUs too.
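"Not a vector machine" is relative: CPUs do have SIMD units (SSE processes 4 floats per instruction), but a GPU or PPU runs hundreds of such lanes in parallel. A minimal sketch of a single 4-wide SSE add, just to show how much one CPU vector instruction covers:

```cpp
#include <xmmintrin.h>  // SSE intrinsics
#include <cstdio>

int main() {
    // Four velocity components updated by ONE instruction.
    alignas(16) float v[4]  = {1.0f, 2.0f, 3.0f, 4.0f};
    alignas(16) float dv[4] = {0.5f, 0.5f, 0.5f, 0.5f};
    alignas(16) float out[4];

    __m128 a = _mm_load_ps(v);
    __m128 b = _mm_load_ps(dv);
    _mm_store_ps(out, _mm_add_ps(a, b));  // 4 additions at once

    std::printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```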



  • replied
They shouldn't be bought out; instead, they should let video card companies add the PhysX chip to their video cards. That way no extra card is required.



  • replied
I wonder how that company is doing with so few games using their tech. Maybe AMD/ATI should buy AGEIA? Then they could make some fancy video cards. *shrug*

