New Tim Sweeney interview, mostly about tech specs

  • replied
    Originally posted by Drintion View Post

    If anybody can actually explain WHY, instead of just passing the word on like many forums before this that it simply is "bad", that'll be nice. (Don't just mention it's a game port T_T.)

    Well, that's one of the main reasons. It's a poorly coded port of Xbox 360 code over to the PC world. The game was written to run at a maximum of 30 frames a second, because that's what an Xbox 360 outputs. Granted, with all the extra horsepower the new cards afford over the 48-pipe Xenos, you'll get over 30... but the game is always optimizing for 30 frames.
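
    To make the "30 frames" point concrete, here's a rough sketch (my own illustration, not UE3 or the actual R6 Vegas code; every name in it is invented) of the fixed 33 ms frame pacing a 30 fps console target bakes into a game loop. A PC port can raise the cap, but the simulation is still tuned around that tick:

    ```cpp
    #include <chrono>
    #include <thread>

    // Hypothetical stand-ins for the real per-frame work.
    void update_simulation() { /* game logic tuned to a 33 ms tick */ }
    void render_frame()      { /* submit draw calls */ }

    // A fixed 30 fps limiter of the kind a console port often carries over:
    // however fast the GPU finishes, the loop idles until the full 33.3 ms
    // slot has elapsed, so extra PC horsepower mostly goes to waste.
    void run_game_loop() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::microseconds(33333); // 1/30 s

        auto next_frame = clock::now() + frame_budget;
        while (true) {
            update_simulation();
            render_frame();
            std::this_thread::sleep_until(next_frame); // burn leftover time
            next_frame += frame_budget;
        }
    }
    ```
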

    Further, AA + HDR is only allowed in UE3 under DirectX 10. DX10 will allow a lot of speed increase, as Epic is using the extra shader depth not to up the visuals but to increase performance.

    You won't be able to get a straight-up PC benchmark of UE3 in action until UT3 drops. That will be the first time we see this awesome engine truly unleashed. I'm expecting great things here, seeing as a high-end PC could wipe the floor with any console out there. Hell, my PC could whip all three consoles at once.

  • replied
    Originally posted by Spoudazo View Post
    You are comparing a Ubisoft game port, a horrible one at that? Go read what Tim Sweeney said, and did you not catch that they all just started using Dell XPS 8800 GTX-powered PCs?

    I don't know where it started (nor can I confirm that it's true), but every forum I come across that discusses using Rainbow Six Vegas as a benchmark for UE3 has one thing to say: it's a terrible way to benchmark UE3.

    If anybody can actually explain WHY, instead of just passing the word on like many forums before this that it simply is "bad", that'll be nice. (Don't just mention it's a game port T_T)

    But from what I've seen of the 2900 XT, its drivers are really immature, and I believe they chose a superscalar architecture that relies on high power efficiency to deliver better results (please correct me if I'm wrong [don't cite Wikipedia]). That would mean only improvements in its performance from here on out.

    But ATI isn't competing against Nvidia in the high-end market right now; they're aiming mid-range, and have stated that 400 dollars is the most anyone would actually want to spend on a video card, and that's where they're aiming.

  • replied
    Originally posted by Benfica View Post
    It is a waste. The X2900XT may be lame on some games with 4xAA, but it is much better on the existing UE3 game. The GTS 320 is OK because it's 280€ vs 390€. But forget about the GTX. You will feel ripped off unless NVidia improves UE3 performance.

    It's 66 fps for the X2900 and 53 for the 8800 GTX at 1280x1024: http://www.legitreviews.com/article/503/8/
    Here it's 57 vs 52: http://www.legitreviews.com/article/503/8/

    You are comparing a Ubisoft game port, a horrible one at that? Go read what Tim Sweeney said, and did you not catch that they all just started using Dell XPS 8800 GTX-powered PCs?
    Tim Sweeney: The relative performance scores between NVidia’s and ATI’s best cards vary from day to day as we implement new optimizations. But, for the past year, NVidia hardware has been ahead fairly consistently, and a few months ago we standardized on Dell XPS machines with GeForce 8800 GTX’s for all of our development machines at Epic.

  • replied
    Originally posted by roadrash View Post
    But using your car analogy, if you don't have the money to join the car club, stay in the Power Wheels club... buy the game on console!
    And that, sir, is why you might want to stay out of the PC gaming industry. You would lose sales for your company and be out of work until you realize that you have to market for a realistic return. Upper-midrange/high-end systems are only a small percentage of the return on any game.

    But let's look at your statement. The only console that you can really play an FPS on is the PS3, as it allows for a keyboard and mouse, assuming the game is coded to accept them. So after the PS3, the game, the internet subscription (being pay-to-play), and the peripherals, you are right back up to the price of an 8800 again, maybe more, since you have to keep paying for the online subscription.
    How's that a solution?

    Before calling me kid, you might want to be over 70, and before replying to my posts, you might want to drop the "I don't care about the community, I got mine" sentiments. The reason I post is that I'm concerned about the player population, which you do not seem to be.

  • replied
    Originally posted by roadrash
    R6 Vegas is a horrible game to ever use as a benchmark
    It is as good or bad a benchmark as any other. You can benchmark correctly if you use lots of them.

    Originally posted by roadrash
    and you should get a swift kick in the *** for using it. It's junk and really doesn't reflect much. It's also a horrible console part and was made for ATi hardware.
    Sorry, but it's currently the only junk that can evaluate UE3.

    Originally posted by roadrash
    If you can't see how that makes it a bad benchmark...
    That's not my point. A card that costs 70% of the price and is 20 to 40% faster on a game that uses the same engine as the game I want would always make me feel ripped off. Being a bad benchmark doesn't invalidate the fact that you may really get that price/performance ratio in UT3. I just posted this so that peeps don't rush.
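
    Rough arithmetic on those two figures alone (the 70% price and the 20-40% speed gap above, nothing else assumed): frames per euro come out to roughly 1.2/0.7 ≈ 1.7x up to 1.4/0.7 = 2x what the GTX delivers, if the R6 Vegas numbers carry over to UT3 at all.
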
    But I see where this thread is going, so let me tell you that I have only NVidia cards.

    Originally posted by roadrash
    if you don't have the money to ... buy the game on console!
    This argument is flawed. If you don't have the money, you don't throw it in the garbage. 600€ for a PS3, or 420€ plus a Live! subscription, is a waste. Then you may want a mouse and keyboard. Soon it will be an external HDD. So why would you dump 700€ on a device just to play a few games?

  • replied
    It seems like some of you have no clue how SLI works and have never used it, which explains why your opinions about it are not only wrong, but not based in reality.

    Get out the textbooks, folks, it's time to go to Video Cards 101 class.

    That's my point, of course, about using two 6800 Ultras in SLI. I can only see the use of SLI with dual 8800 GTSes or dual 8800 GTX/Ultras. Two 7900 GTXs are about the same as one GTS or GTX, so it's also pointless to go SLI with even that.

    Anybody using SLI is going to be upgrading most video card cycles. You sell the old cards as soon as they can't keep up, or you give the old rig to mom and dad to browse the web.

    So really, the only people currently using dual 6800s in SLI are still doing it because that's all the horsepower they need, and that's what was available to buy at the time.

    Let's also make one thing clear: two 7900 GTXs are not going to be as fast as a single 8800 GTS in almost any situation. For starters, SLI doesn't double the power of whatever GPU you're using. It really only matters at extremely high resolutions with high AA and maxed-out details. So unless you have a monitor capable of spitting out 2048x1536 or 2560x1600, you're really not going to see much of a gain. In fact, since SLI makes your system all the more CPU-bound, SLI will often give you fewer FPS at the lower resolutions people commonly run.
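
    A rough way to see why (a simplified model assumed purely for illustration, not measured data): per frame the CPU needs some time C and a single GPU needs G; SLI splits only the GPU work, so a frame takes roughly max(C, G/2) plus a little synchronization overhead, and it only pays off when G dominates. A toy calculation:

    ```cpp
    #include <algorithm>
    #include <cstdio>

    // Assumed toy model: frame time is whichever of CPU work or per-GPU work
    // takes longer, plus a small SLI synchronization cost. Numbers below are
    // hypothetical, picked only to show the CPU-bound vs GPU-bound cases.
    double frame_ms(double cpu_ms, double gpu_ms, int num_gpus) {
        const double sli_overhead_ms = (num_gpus > 1) ? 1.0 : 0.0; // assumed
        return std::max(cpu_ms, gpu_ms / num_gpus) + sli_overhead_ms;
    }

    int main() {
        // Low res: light GPU load (6 ms) vs 8 ms of CPU work -> CPU-bound,
        // so the second card gains nothing (the overhead even costs a bit).
        std::printf("low res:  1 GPU %.0f fps, SLI %.0f fps\n",
                    1000.0 / frame_ms(8.0, 6.0, 1), 1000.0 / frame_ms(8.0, 6.0, 2));
        // High res: heavy GPU load (30 ms) -> GPU-bound, SLI nearly doubles fps.
        std::printf("high res: 1 GPU %.0f fps, SLI %.0f fps\n",
                    1000.0 / frame_ms(8.0, 30.0, 1), 1000.0 / frame_ms(8.0, 30.0, 2));
        return 0;
    }
    ```
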

    Add the hefty price tag of two brand-name 8800s and I LMAO at people who get them when they simply aren't needed. It's like a fat, bald guy buying a Porsche. Even sadder because it's online gaming.

    Sour grapes, ***** envy, I could go on. Keep pillow-biting there, kid.

    But using your car analogy, if you don't have the money to join the car club, stay in the Power Wheels club... buy the game on console!

    You guys can post the power and capabilities of this and that. In two years, when that power is fully utilized, and the GF9800 is out and your current 8800s are midrange, another thousand or so dollars in unneeded upgrades will happen again, and the same 10% of gamers will happily fork out the cash for high-end systems and make the same arguments.

    Or they can eBay the cards and recover half the money, or not sell the cards and keep playing games, or give the old computer to the kid.

    Graphics and graphic quality aren't everything. IMO, they hinder more than they help in terms of sales and player population, which reflect entertainment value, continuing support, player-made content, and community interest.

    Graphics aren't everything, but they help sales; in fact, they drive sales. Other than to UT fans, the game will sell on its graphics. If they make the graphics behind the times for the sake of supporting older hardware, it won't sell, and the game will fail.

  • replied
    Originally posted by Benfica View Post
    It is a waste. The X2900XT may be lame on some games with 4xAA, but it is much better on the existing UE3 game. The GTS 320 is OK because it's 280€ vs 390€. But forget about the GTX. You will feel ripped off unless NVidia improves UE3 performance.

    It's 66 fps for the X2900 and 53 for the 8800 GTX at 1280x1024: http://www.legitreviews.com/article/503/8/
    Here it's 57 vs 52: http://www.legitreviews.com/article/503/8/

    R6 Vegas is a horrible game to ever use as a benchmark, and you should get a swift kick in the *** for using it. It's junk and really doesn't reflect much. It's also a horrible console part and was made for ATi hardware.

    If you can't see how that makes it a bad benchmark...

  • replied
    It is a waste. The X2900XT may be lame on some games with 4xAA, but it is much better on the existing UE3 game. The GTS 320 is OK because it's 280€ vs 390€. But forget about the GTX. You will feel ripped off unless NVidia improves UE3 performance.

    It's 66 fps for the X2900 and 53 for the 8800 GTX at 1280x1024: http://www.legitreviews.com/article/503/8/
    Here it's 57 vs 52: http://www.legitreviews.com/article/503/8/

  • replied
    You can get an 8800 GTS for under $300 after rebate from Newegg, and an 8800 GTX for a little less than $500, not to mention buying used on eBay.

    I paid around $600 for my 8800 GTX, and it's an awesome card and not nearly a waste. The 8-series is still DX10-capable and also has better anisotropic filtering, so either way the 8-series comes out ahead.

  • replied
    You know, that's not exactly true. You guys are basing this on high-end games, of which there are currently only a few, not on the mainstream. If someone can get two 7600s for $300, or an 8800 for $500, what do you think they are going to choose? With the exception of very few games, the dual 7600s are going to be the better bet, because most of the power of a single 8800 is a waste for current games. Add the hefty price tag of two brand-name 8800s and I LMAO at people who get them when they simply aren't needed. It's like a fat, bald guy buying a Porsche. Even sadder because it's online gaming.

    First off, you can get a single 7600 GT at Newegg for $90 after rebates. That makes it $180 for two, not $300. Secondly, two 7600 GTs are about on par with a single 7900 GT, which can be bought for $180 after rebates. Now if you look at it... exactly why would you purchase two video cards instead of one to achieve the same thing, IF you are going to purchase something right now? I can understand if you already have a 7600 GT, but both cards came out around the same time. In the long run you'll be spending more on power for the two cards vs. the one-card solution, and taking up two slots vs. one.

    Also, two 7600 GTs or a single 7900 GT is not adequate to play today's games with eye candy, unless you are in fact happy playing modern games at low quality. I personally am not; therefore I need something with more power.

    Look...
    http://www.anandtech.com/video/showdoc.aspx?i=2975&p=4
    http://www.firingsquad.com/hardware/...ance/page8.asp

    The difference is obviously there when there is eye candy.

    Also, the survey is NOT representative of the majority of online gamers, since the majority is in a case like WoW: 8 million players vs. Steam's 300k, and Steam consists of more than just one title. The millions that play on the WoW servers are playing just one game. Huge difference there, pal. Therefore the survey is obviously not pointed towards the "majority".

    The only thing I'm looking at is compatibility for the masses, which will in turn add to my entertainment. If UT3 doesn't run well, I'm off to another game, and the masses will join me.

    Again, you are still considering Steam to be the majority, as in the "masses", when it's not. Yes, CS, being old, still retains the title of most-played online FPS among the masses who play FPS games, but that in no way means the masses of PC gamers in total, since obviously 68k is not even a fraction of the 8 million WoW users.

    WoW is more demanding than the original CS, btw.

  • replied
    Bud, the cards you listed wouldn't reside in any system I would ever build, nor would I advise anyone to get them. I don't look at "Joe Bob's Discount Hardware" releases and say "Hey, that's a good deal," because it isn't. **** like that is highly bastardized and will not give you half the performance of a BFG, ATi, or PNY card. Sure it's cheap, but you get what you pay for.

    Of course Steam is the majority.
    Right now, there are 64,444 servers up and 122,534 players online between HL and HL2 mods (Counter-Strike). The next closest is BF2, with 5,392 servers and 13,334 players.

    Steam - 122,534 players
    BF2 - 13,334 players

    Do you not see that over 100,000 more players are playing those games? I don't see how you can look at the numbers and come up with a different view. The only wild card in that list is WoW, for which there aren't any stats on GameSpy, but which also allows for systems that are low-midrange and above. Even if it's not 50% accurate, the Steam engine holds the majority of online players, and if WoW beats that, it just proves the point more...

    How many of those 3.5 million Chinese players on WoW do you think have an Intel quad-core, dual 8800s, 4 GB of RAM, and DVD drives?

    Graphics and graphic quality aren't everything. IMO, they hinder more than they help in terms of sales and player population, which reflect entertainment value, continuing support, player-made content, and community interest.

  • replied
    You know, that's not exactly true. You guys are basing this on high-end games, of which there are currently only a few, not on the mainstream. If someone can get two 7600s for $300, or an 8800 for $500, what do you think they are going to choose? With the exception of very few games, the dual 7600s are going to be the better bet, because most of the power of a single 8800 is a waste for current games. Add the hefty price tag of two brand-name 8800s and I LMAO at people who get them when they simply aren't needed. It's like a fat, bald guy buying a Porsche. Even sadder because it's online gaming.

    First off, you can get a single 7600 GT at Newegg for $90 after rebates. That makes it $180 for two, not $300. Secondly, two 7600 GTs are about on par with a single 7900 GT, which can be bought for $180 after rebates. Now if you look at it... exactly why would you purchase two video cards instead of one to achieve the same thing, IF you are going to purchase something right now? I can understand if you already have a 7600 GT, but both cards came out around the same time. In the long run you'll be spending more on power for the two cards vs. the one-card solution, and taking up two slots vs. one.

    Also, two 7600 GTs or a single 7900 GT is not adequate to play today's games with eye candy, unless you are in fact happy playing modern games at low quality. If not, then go play on a console.

    Look...
    http://www.anandtech.com/video/showdoc.aspx?i=2975&p=4
    http://www.firingsquad.com/hardware/...ance/page8.asp

    The difference is obviously there when there is eye candy.

    Also, the survey is NOT representative of the majority of online gamers, since the majority is in a case like WoW: 8 million players vs. Steam's 300k, and Steam consists of more than just one title, whereas WoW's millions are playing a single game. Therefore the survey is obviously not pointed towards the "majority".

    The only thing I'm looking at is compatibility for the masses, which will in turn add to my entertainment. If UT3 doesn't run well, I'm off to another game, and the masses will join me.

    Again, you are still considering Steam to be the majority, as in the "masses", when it's not. Yes, CS, being old, still retains the title of most-played online FPS among the masses, but that in no way means the masses of PC gamers in total, since obviously 68k is not even a fraction of the 8 million WoW users. WoW is more demanding than the original CS, btw.

  • replied
    You know, that's not exactly true. You guys are basing this on high-end games, of which there are currently only a few, not on the mainstream. If someone can get two 7600s for $300, or an 8800 for $500, what do you think they are going to choose? With the exception of very few games, the dual 7600s are going to be the better bet, because most of the power of a single 8800 is a waste for current games. Add the hefty price tag of two brand-name 8800s and I LMAO at people who get them when they simply aren't needed. It's like a fat, bald guy buying a Porsche. Even sadder because it's online gaming.

    You guys can post the power and capabilities of this and that. In two years, when that power is fully utilized, and the GF9800 is out and your current 8800s are midrange, another thousand or so dollars in unneeded upgrades will happen again, and the same 10% of gamers will happily fork out the cash for high-end systems and make the same arguments.

    I know all hobbies are expensive. However, this isn't life-and-death ****.

    The only thing I'm looking at is compatibility for the masses, which will in turn add to my entertainment. If UT3 doesn't run well, I'm off to another game, and the masses will join me, because the masses have systems only a little above, or a range below, mine.

  • replied
    Most people with SLI buy two top-of-the-line, or near top-of-the-line, cards at release to take advantage of it right then... because everybody knows a couple of generations later you can get a single card that spits out better performance and uses less power.

    The point of SLI is to drive extremely high resolutions in some demanding games, because LCDs have a native res.

    Once your cards are not capable of pushing that res, it's off to eBay, and then time to upgrade and buy two more cards.


    I agree with you totally here; however, utilizing SLI with old hardware isn't going to be of any benefit at all. As it is, you can have dual 8500s in SLI, but for what reason, when you can have a single 8600 GTS at the least? That's my point, of course, about using two 6800 Ultras in SLI. I can only see the use of SLI with dual 8800 GTSes or dual 8800 GTX/Ultras. Two 7900 GTXs are about the same as one GTS or GTX, so it's also pointless to go SLI with even that.

    Wait one second there, it depends on the type of game. Quite often the most "hardcore" FPS players throttle down details and resolution even though they have a PC capable of maxing out the game, simply because it offers an inherent in-game advantage of being less distracting.

    This is definitely a subjective matter of how distracted someone can become by the surrounding objects. I can only see going to a lower resolution to gain faster frame rates, not because it helps at all with distraction.

  • replied
    Originally posted by roadrash View Post
    Ahh, a sidetrack derail into DRM fiascos.

    If you don't like it, then don't buy it. Nobody is forcing you to buy HD-DVD, Blu-ray, music, or any of that. Just don't; why bother, it's all **** anyways.

    And before you fire off a "well, that god**** MS, they are causing it," think again. Blame the MPAA, blame the artists, blame the people who are buying it, and then blame the likes of Apple and MS, who have to play ball with a system that other people created. It's not like DRM doesn't already exist outside of Vista on a draconian level.

    It's not like you can't get around all of that anyways.
    I'm not sidetracking. I brought it up because when I said I did not like Vista and would not use it, people started accusing me of outrageous things: not wanting to move forward, not wanting to pay for it, being too lazy to upgrade, etc. And that's simply not true. I am not placing the blame entirely on MS, because you're right, the RIAA and MPAA put a lot of pressure on them, but that does not mean MS gets off easy in my book.

    Apple is NOT pushing DRM the way MS is. They are openly against it, and just a few days ago created iTunes Plus, where the songs cost about $0.20 more but are DRM-free and offered at higher bit rates. Same thing with their OS: the latest, Tiger, has no DRM, no activation, and no HD content restrictions like that, and neither will their upcoming OS, whose name I forget right now. And please don't say "well then why don't you stfu and switch to Apple." You (everybody here) know **** well why (for the thick-skulled: not enough game support and a pretty newbie-type interface), so don't even bother.

  • replied
    Yes, it most definitely is bragging rights to say that you have SLI when you're using two OLD cards and one card of today can equal or exceed their performance. It's utterly foolish to buy another of the same old card when a single card can exceed the performance, especially when you can save money on power in the long run.
    Most people with SLI buy two top-of-the-line, or near top-of-the-line, cards at release to take advantage of it right then... because everybody knows a couple of generations later you can get a single card that spits out better performance and uses less power.

    The point of SLI is to drive extremely high resolutions in some demanding games, because LCDs have a native res.

    Once your cards are not capable of pushing that res, it's off to eBay, and then time to upgrade and buy two more cards.

    If you feel content playing a current game at the lowest graphics settings, then why are you playing on the PC? I can understand medium at the least, but how exactly can you consider yourself a PC gamer when you are playing today's games at the lowest settings and at low res? It's OK if you are interested in just one or two games, like I am, and don't want to upgrade and so play at the lowest graphics settings, since it's not worth upgrading when you do other things with your PC besides gaming. However, if you are a true PC gamer... I have yet to meet one person who plays hardcore at the lowest graphics settings. Even occasional gamers I've yet to see play at the lowest settings.
    Wait one second there, it depends on the type of game. Quite often the most "hardcore" FPS players throttle down details and resolution even though they have a PC capable of maxing out the game, simply because it offers an inherent in-game advantage of being less distracting.
