HardwareBanter

HardwareBanter (http://www.hardwarebanter.com/index.php)
-   Nvidia Videocards (http://www.hardwarebanter.com/forumdisplay.php?f=16)
-   -   The next Unreal engine... (http://www.hardwarebanter.com/showthread.php?t=53435)

John February 26th 04 06:46 PM

The next Unreal engine...
 
A good interview with Tim Sweeney on the development of the upcoming
Unreal 3 engine:

http://www.beyond3d.com/interviews/sweeney04/


He says "...we're going to make a game that brings today's GeForce
FX's and Radeon 9700+'s to their knees at 640x480! :-) We are
targetting next-generation consoles and the kinds of PC's that will be
typical on the market in 2006, and today's high end graphics cards are
going to be somewhat low end then, similar to a GeForce4MX or a Radeon
7500 for today's games".

I also like the part where he says he wishes the Intel integrated
graphics chip would just "go away."

faster_framerates February 26th 04 07:07 PM

I'm sorry, but what is the benefit of excluding a market of consumers with
average video cards?

How about an engine that runs great and looks beautiful on a large range of
systems? I'm all for progress and a more cinematic look, but Joe Consumer
shouldn't have to upgrade his computer every six months and stay on top of
hardware issues just because he wants to play the latest game release.

This is why people settle for consoles.

- f_f



"John" wrote in message
om...
A good interview with Tim Sweeney on the development of the upcoming
Unreal 3 engine:

http://www.beyond3d.com/interviews/sweeney04/


He says "...we're going to make a game that brings today's GeForce
FX's and Radeon 9700+'s to their knees at 640x480! :-) We are
targetting next-generation consoles and the kinds of PC's that will be
typical on the market in 2006, and today's high end graphics cards are
going to be somewhat low end then, similar to a GeForce4MX or a Radeon
7500 for today's games".

I also like the part where he says he wishes the Intel integrated
graphics chip would just "go away."




Frank February 26th 04 07:44 PM

I'd love to see the gameplay evolve as much as they plan to evolve the
graphics. I'm not going to buy one of those cards just to play another
bunny-hopping fragfest. The jaw drops quickly at flashy graphics and FX,
but the urge to play a game again and again has as much or more to do
with decent gameplay.

cu


frank



[email protected] February 26th 04 08:04 PM

In alt.comp.periphs.videocards.ati faster_framerates wrote:
I'm sorry, but what is the benefit of excluding a market of consumers
with average video cards?


They are targeting average video cards; it's just that they're targeting
the average cards of 2006. It will be closer to 2006 when the engine is
finished, and targeting even the high-end cards of 2003 in a project that
starts in 2004 is just a waste of time and money. A 24-month upgrade
cycle is not completely unreasonable for videogames.

This is why people settle for consoles.


I take issue with the "settle", but that's another argument for another
time.

-a

John Lewis February 26th 04 09:11 PM

On Thu, 26 Feb 2004 19:44:59 +0100, "Frank" wrote:

I'd love to see the gameplay evolve as much as they plan to evolve the
graphics. I'm not going to buy one of those cards just to play another
bunny-hopping fragfest. The jaw drops quickly at flashy graphics and FX,
but the urge to play a game again and again has as much or more to do
with decent gameplay.


And in 3D open-space games such as H&D2, Far Cry, etc., a whole lot
has to do with the AI as well. Replaying a level with scripted AI,
knowing exactly where the enemy will appear and how they will react,
gives zero replay value. So a vote for "intelligent" AI, such as that
in Far Cry; H&D2 also seems to have elements of it.

And, of course, there is no substitute for clever and interesting
level design with unexpected gameplay, plot twists and multiple
endings, such as Deus Ex 1.
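
(Purely to illustrate the scripted-versus-unscripted point, here is a toy
sketch in C++. It is not code from any real engine, and every name in it
is made up: scripted spawn points come out identical on every restart,
while picking positions inside designer-defined patrol zones gives a
slightly different layout each run, which is where the replay uncertainty
comes from.)

// Toy sketch only (made-up, not from any real engine): the scripted spawns
// print the same points on every run; the patrol-zone spawns do not.
#include <cstdio>
#include <random>
#include <vector>

struct Vec2 { float x, y; };

// A designer-defined rectangle that an enemy is allowed to start inside.
struct PatrolZone { Vec2 min, max; };

// Pick a random starting point inside one patrol zone.
Vec2 RandomPointIn(const PatrolZone& z, std::mt19937& rng) {
    std::uniform_real_distribution<float> dx(z.min.x, z.max.x);
    std::uniform_real_distribution<float> dy(z.min.y, z.max.y);
    return { dx(rng), dy(rng) };
}

int main() {
    // Scripted: these exact points load every single time you restart.
    const std::vector<Vec2> scripted = { {10.0f, 5.0f}, {42.0f, 17.0f} };
    for (const Vec2& p : scripted)
        std::printf("scripted enemy at (%.1f, %.1f)\n", p.x, p.y);

    // Unscripted: reseeded each launch, so the layout differs every run.
    std::mt19937 rng(std::random_device{}());
    const std::vector<PatrolZone> zones = { {{ 8.0f,  3.0f}, {14.0f,  9.0f}},
                                            {{38.0f, 12.0f}, {46.0f, 22.0f}} };
    for (const PatrolZone& z : zones) {
        const Vec2 p = RandomPointIn(z, rng);
        std::printf("randomized enemy at (%.1f, %.1f)\n", p.x, p.y);
    }
    return 0;
}

Every run prints the same scripted coordinates but different randomized
ones, which is exactly the difference being argued about above.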

John Lewis





Kevin C. February 27th 04 12:41 AM


wrote in message ...
In alt.comp.periphs.videocards.ati faster_framerates wrote:
I'm sorry, but what is the benefit of excluding a market of consumers
with average video cards?


They are targeting average video cards; it's just that they're targeting
the average cards of 2006.


That somewhat contradicts his statement that he wishes the Intel video
chips would go away. Whether you choose to believe it or not, most people
do not own high-end GPUs today, nor will they tomorrow. Even above the
casual gamer, there are many folks still running GF2-era devices. In 2006
I imagine just as many people will still be running the GF4s and Radeons
that are in their computers today, the same cards that Mr. Sweeney has
chosen to exclude.



Bora Ugurlu February 27th 04 12:53 AM

On Thu, 26 Feb 2004 23:41:36 GMT, "Kevin C." wrote:

That somewhat contradicts his statement that he wishes the Intel video
chips would go away. Whether you choose to believe it or not, most people
do not own high-end GPUs today, nor will they tomorrow. Even above the
casual gamer, there are many folks still running GF2-era devices. In 2006
I imagine just as many people will still be running the GF4s and Radeons
that are in their computers today, the same cards that Mr. Sweeney has
chosen to exclude.



He meant 'at maximum detail level'. Then even the high-end cards would
slow to a crawl. Not many people play with details maxed out, so they get
decent framerates in, say, UT2k4 with a Ti4200 (which I have). If I turn
on all the details with 4xAA and 8x anisotropic filtering, it's a slide
show. That's what he meant.

K February 27th 04 02:08 AM


"Kevin C." wrote in message
om...

That somewhat contradicts his statement that he wishes the Intel video
chips would go away. Whether you choose to believe it or not, most people
do not own high-end GPUs today, nor will they tomorrow. Even above the
casual gamer, there are many folks still running GF2-era devices. In 2006
I imagine just as many people will still be running the GF4s and Radeons
that are in their computers today, the same cards that Mr. Sweeney has
chosen to exclude.


Well, sucks to be them. It's about time software started pushing the
limits of hardware again. There was a time when people were very happy to
get 30 fps from Quake 2. Now all you see is people concerned that they are
only getting 90 fps in UT2003, etc. If in 2006 people still choose to hold
on to their GF4s and Radeons, they are going to be left out of new titles,
and they only have themselves to blame. You cannot expect software
developers to stand still for the benefit of those who are unwilling to
upgrade.

There have only been two occasions when I've installed a graphics card
and said 'wow' to myself. The first was playing Unreal and Q2 on a
Voodoo 2; the other was after I got a GF3 and saw all the Q3-engined
games in high-res with all the candy. Every card since then has only done
what the GF3 did, just faster. In other words, there has been little in
the way of innovation. What has been long overdue in the graphics
industry is the next 'wow' card.


K



rms February 27th 04 02:57 AM

And in 3D open-space games such as H&D2, Far Cry, etc., a whole lot
has to do with the AI as well. Replaying a level with scripted AI,
knowing exactly where the enemy will appear and how they will react,
gives zero replay value.


Playing Vietcong, I'm continually astounded by the variety of AI
placement and behavior this game offers. For instance, I'm now trying to
complete one of the quickfights (Arroyo). It's quite difficult, and I've
restarted the level literally dozens of times; each time the initial AI
placement and behavior is slightly different. Very impressive.

rms



Andrew February 27th 04 07:59 AM

On Fri, 27 Feb 2004 01:08:25 -0000, "K" wrote:

There have only been two occasions when I've installed a graphics card
and said 'wow' to myself. The first was playing Unreal and Q2 on a
Voodoo 2; the other was after I got a GF3 and saw all the Q3-engined
games in high-res with all the candy. Every card since then has only done
what the GF3 did, just faster. In other words, there has been little in
the way of innovation. What has been long overdue in the graphics
industry is the next 'wow' card.


Far Cry on a 9700 Pro graphics card gave me a "wow". Even seeing the
rain on water in Morrowind on a GF4 was a "wow" moment for me. There has
been a lot of innovation in hardware and software since the GF3.
--
Andrew. To email unscramble & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.

