What texture/character detail setting is the PS3 running at, do you think?


    #16
    Fair enough. Given that large LCDs are just an extension of computer monitor production (hence 768-line panels being common rather than 720), you'd have thought most would be fine with VGA.

    You can always use PowerStrip to alter the screen size and position.
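
    A quick back-of-envelope sketch of why that matters (my own illustration, not anything from PowerStrip itself): a 1280x720 signal landing on a typical 1366x768 panel gets stretched by a non-integer factor in both axes, so nothing maps 1:1.

    Code:
    // Why a 720p signal rarely maps 1:1 onto a PC-derived "768" panel:
    // the panel has more pixels in both axes, so the TV's scaler has to
    // stretch the image by a non-integer factor. Resolutions are the
    // common panel/signal values; the rest is plain arithmetic.
    #include <cstdio>

    int main() {
        const int sig_w = 1280, sig_h = 720;    // 720p input signal
        const int pan_w = 1366, pan_h = 768;    // typical "HD Ready" LCD panel
        std::printf("horizontal scale: %.4f\n", (double)pan_w / sig_w); // ~1.0672
        std::printf("vertical scale:   %.4f\n", (double)pan_h / sig_h); // ~1.0667
        // Neither factor is 1.0 (or any integer), so every output pixel is
        // a resampled blend -- one reason to use a tool like PowerStrip to
        // feed the panel its true native timing over VGA instead.
        return 0;
    }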



      #17
      There's a lot of confusion in this thread; hopefully I can help:

      (1) Consoles are usually played on TVs that sit several feet from the player, so anti-aliasing and anisotropic filtering aren't big concerns for developers; at that distance most people won't notice the difference.
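
      To put a rough number on that, here's a back-of-envelope check (my own illustration; the 50" screen, the viewing distances, and the ~60 pixels-per-degree acuity figure are assumptions, the last being the usual 1-arcminute rule of thumb for the eye):

      Code:
      // Back-of-envelope: pixels per degree of visual angle at various
      // viewing distances, vs. the ~60 px/deg (1 arcminute) the eye can
      // resolve. Screen size, resolution, and distances are illustrative.
      #include <cmath>
      #include <cstdio>

      int main() {
          const double pi = 3.14159265358979323846;
          const double diagonal_in = 50.0;    // 50" 16:9 TV
          const double horiz_px    = 1280.0;  // 720p horizontal resolution
          // Width of a 16:9 panel: diagonal * 16 / sqrt(16^2 + 9^2)
          const double width_in =
              diagonal_in * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);

          const double dists_ft[] = {2.0, 6.0, 10.0};
          for (double dist_ft : dists_ft) {
              double dist_in = dist_ft * 12.0;
              // Horizontal angle the screen subtends, in degrees
              double fov_deg =
                  2.0 * std::atan((width_in / 2.0) / dist_in) * 180.0 / pi;
              double px_per_deg = horiz_px / fov_deg;
              std::printf("%4.0f ft: %5.1f deg FOV, %4.1f px/deg %s\n",
                          dist_ft, fov_deg, px_per_deg,
                          px_per_deg >= 60.0 ? "(jaggies hard to see)"
                                             : "(aliasing visible)");
          }
          return 0;
      }

      At couch distance the screen only fills about 20 degrees of your view, so 720p is already brushing up against the eye's resolving limit; on a desk-distance monitor it isn't, which is why AA matters so much more there.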

      (2) The PS3 renders at 1280x720 and then scales that up to 1080 if so desired (unless this has changed). I have often played games with my PC hooked up to a 50" Sony LCD HDTV, and they look great at 1920x1080; I've also played them at higher resolutions (2048x1536, etc.). The PS3 and Xbox 360 would have had to be more expensive to have the power to push these kinds of games at 1920x1080 (though some do).

      Developers have to decide whether they want a 30Hz game or a 60Hz game, and now whether their native resolution will be 1280x720 or 1920x1080; you can have great graphics at 1280x720 that the console simply couldn't do at 1920x1080 (the pixel math below makes the tradeoff concrete). As Cliffy B said when someone asked whether Gears of War would run at 60fps, he plainly put it that it just isn't possible on the hardware. Gears of War looks great, and several things would have had to be toned down or removed to get it running natively at 1080p on a console; instead you can just scale it to 1080p, which still looks great.
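
      Here's the raw arithmetic (the resolutions and frame rates are the standard values; the fixed-shading-budget framing is my own simplification):

      Code:
      // Pixel throughput for the common console modes. 1080p is 2.25x the
      // pixels of 720p, so at a fixed GPU budget each 1080p pixel gets
      // ~2.25x less shading time -- hence rendering at 720p and upscaling.
      #include <cstdio>

      int main() {
          struct Mode { const char* name; int w, h, fps; };
          const Mode modes[] = {
              {"720p @ 30",  1280,  720, 30},
              {"720p @ 60",  1280,  720, 60},
              {"1080p @ 30", 1920, 1080, 30},
              {"1080p @ 60", 1920, 1080, 60},
          };
          for (const Mode& m : modes) {
              long long px_frame = (long long)m.w * m.h;
              long long px_sec   = px_frame * m.fps;
              std::printf("%-10s %9lld px/frame %12lld px/s\n",
                          m.name, px_frame, px_sec);
          }
          return 0;
      }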

      (3) The GPU in the PS3 is similar to Nvidia's 7800 GTX. However, a console is a closed environment, so the developer can focus on one fixed set of hardware and squeeze as much out of it as possible; the more time a developer has with a console, the better he can make his games look. The Cell processor in the PS3 is powerful, but it's complicated: you throw one main set of instructions at it, but then you have to break the rest of the work into small chunks for the rest of the Cell (see the sketch after the quote), so it's not the easiest thing to deal with. Tim Sweeney has commented on it several times in the past:
      Yeah, well, programming for multicore architectures is hard. If programmers had our way, we’d just program single-threaded applications forever, because it’s much easier. But it’s also clear that there’s an irreversible hardware trend towards multicore, because that’s the only way to deliver maximum power economically. You just can’t make ever-higher gigahertz rates—at some point your CPU becomes a microwave and melts your computer. Multicore is here to stay.

      What we’ve found in this generation—and here are some scary numbers—is that writing an engine system designed for multicores, that can scale to multiple threads efficiently, takes about double the effort of single-threaded. It takes double the design effort, implementation effort, lifetime support effort, debugging…all the cost metrics multiplied by a factor of two for multicores. That’s pretty expensive, but it ends up being bearable.

      Whereas, some of the other hardware trends are even worse than that. Programming for Cell, we found, had a [five-times] productivity divisor. It’s five times harder and that really starts to hit…you have to question whether it’s economically viable for mainstream developers to put real effort into it at that point. And then there are GPUs: trying to take a non-graphics algorithm and run it on the graphics processor. Given the limitations of those languages, we found that the multiplier there is 10x or more, which is well out of the realm of economic viability.
      http://ve3d.ign.com/articles/news/37...Talks-Xbox-360
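
      To make the chunking point concrete, here's a conceptual sketch (my own, using ordinary std::thread as a stand-in for SPE jobs; real Cell code would use the SPE SDK and explicit DMA into each SPE's 256 KB local store): one control thread in the PPE role carves a big job into small pieces and hands them to six workers, since the PS3 exposes six SPEs to games.

      Code:
      // Conceptual sketch of the "break the work into small chunks" model
      // the Cell forces on you: a control thread (the PPE role) splits a
      // big job into fixed-size pieces for workers (the SPE role). Plain
      // std::thread here; real Cell code uses the SPE runtime and DMA.
      #include <algorithm>
      #include <cstdio>
      #include <numeric>
      #include <thread>
      #include <vector>

      int main() {
          std::vector<float> data(1 << 20, 1.0f);  // the "big job"
          const size_t chunk = 16 * 1024;          // small, local-store-sized pieces
          const unsigned workers = 6;              // the PS3 gives games six SPEs

          std::vector<double> partial(workers, 0.0);
          std::vector<std::thread> pool;
          for (unsigned w = 0; w < workers; ++w) {
              pool.emplace_back([&, w] {
                  // Each worker strides over the chunks assigned to it.
                  for (size_t c = w * chunk; c < data.size(); c += workers * chunk) {
                      size_t end = std::min(c + chunk, data.size());
                      for (size_t i = c; i < end; ++i)
                          partial[w] += data[i];   // stand-in for real per-chunk work
                  }
              });
          }
          for (auto& t : pool) t.join();

          double total = std::accumulate(partial.begin(), partial.end(), 0.0);
          std::printf("sum = %.0f\n", total);
          return 0;
      }

      Even in this toy form you can see Sweeney's cost multiplier: the single-threaded version is a three-line loop, while the chunked version has to manage workers, partitioning, and a reduction step.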

      Hopefully this clarifies some things.
