Staredit Network -> Games -> Gaming Consoles
Posted by Deathawk on 2006-11-28 at 16:49:54
I don't know what the three or so interested members would do, and personally I don't think it alone would have enough activity to warrant one, although I would be interested in helping where I can ~.~. I think Games should be merged into "Games and Technology" or something, because as of right now this is really the only thread..:\
Posted by dumbducky on 2006-11-28 at 17:01:05
I disagree. Programming should be expanded to Computers General, to include tech and software stuff.

I'd sign the petition for a tech forum.
Posted by Cole on 2006-11-28 at 17:01:53
QUOTE
Maybe the Xenos can handle AA better, but RSX has an advantage with shaders! ~75 billion operations/sec compared to ~48 billion operations/sec. What I cannot understand though is how the RSX achieves that number with less shader operations per clock when the clock speeds are equivalent. Maybe I'm just missing something... Also, the Xenos is a custom chip by ATI while RSX is based on the (not G70 as I believed) G71 architecture. Generally ATI chips are better at shader performance while NVidia chips are better at AA, but this blows that conception away (with the places switched)!

I was just reading that article, and it says the Xenos can achieve 96 billion shader operations/sec, but it's actually half that at 48 billion currently. If you want to, though, you can research that and find out why there are differences. Hmmm, this also says 48 billion. The wiki says the theoretical max is 96 billion. I bet that the site you posted just counted the theoretical max for the 360 and not the PS3.

Deathawk, thanks for correcting me on that resolution issue. I had just heard that the 360 originally could not do 1080i/p, but obviously that's outdated now. I don't know if I said it before, but at high resolutions, shader performance is more important to image quality than AA. No one will dispute that it's better to have both though. However, can we agree that both consoles look kickass at top resolutions (even though they're not remotely comparable to 8800GTX in SLI)?

*Edit* Guys, can we have more technology discussions like this? Lol.

He went through how he got his shader numbers. I'll quote them.

QUOTE
What I found interesting was Microsoft said the 360’s GPU could perform 48 billion shader operations per second back in 2005. However Bob Feldstein, VP of engineering for ATI, made it very clear that the 360’s GPU can perform 2 of those shaders per cycle so the 360’s GPU is actually capable of 96 billion shader operations per second.


The architecture is simply more efficient, like an Athlon 64 vs. a P4. While the P4 could push its clock speed up to 4 GHz, the Athlon 64 line was just so much more efficient that it could get the same performance at 2.6-2.8 GHz by doing more per cycle. The same applies here.
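That per-cycle point can be sketched with a toy calculation. The ops-per-cycle figures below are purely illustrative stand-ins, not measured IPC numbers for either CPU:

```python
# Peak throughput = (work per cycle) * (cycles per second).
# The ops-per-cycle values are hypothetical, chosen only to show how a
# lower-clocked but wider design can come out ahead.
def peak_throughput(ops_per_cycle, clock_hz):
    return ops_per_cycle * clock_hz

deep_pipeline = peak_throughput(1.0, 4.0e9)  # P4-style: 4 GHz, less work per cycle
wide_core = peak_throughput(1.5, 2.8e9)      # Athlon 64-style: 2.8 GHz, more per cycle

print(wide_core > deep_pipeline)  # True: the 2.8 GHz design wins on throughput
```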

For actual math here it is for the 360:
QUOTE
"On chip, the shaders are organized in three SIMD engines with 16 processors per unit, for a total of 48 shaders. Each of these shaders is comprised of four ALUs that can execute a single operation per cycle, so that each shader unit can execute four floating-point ops per cycle."

48 shader units * 4 ops per cycle = 192 shader ops per clock
Xenos is clocked at 500 MHz: 500,000,000 cycles/sec * 192 shader ops per clock = 96 billion shader ops per second.


here it is for the Ps3:
QUOTE
# The RSX has 24 pixel pipes, each of which performs 5.7 ops: 5.7 ops * 24 pixel pipelines = 136.8 shader ops per clock.

# The RSX is clocked at 550 MHz: 550,000,000 cycles/sec * 136.8 shader ops per clock ≈ 75.24 billion shader ops per second (the often-quoted 74.8 billion comes from truncating 136.8 down to 136).

Both numbers represent peak performance.
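Both derivations reduce to the same formula, units × ops per unit per cycle × clock speed. A quick Python check of the figures quoted above:

```python
def peak_shader_ops(units, ops_per_unit_per_cycle, clock_hz):
    """Peak shader throughput = units * ops per unit per cycle * clock speed."""
    return units * ops_per_unit_per_cycle * clock_hz

xenos = peak_shader_ops(48, 4, 500e6)   # 360: 48 units * 4 ALU ops * 500 MHz
rsx = peak_shader_ops(24, 5.7, 550e6)   # PS3: 24 pipes * 5.7 ops * 550 MHz

print(f"Xenos: {xenos / 1e9:.2f} billion ops/sec")  # 96.00
print(f"RSX:   {rsx / 1e9:.2f} billion ops/sec")    # 75.24
```

The 74.8 billion figure in the quote drops the 0.8 (136 instead of 136.8 ops per clock); either way, both numbers are theoretical peaks, not measured throughput.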

Also keep in mind that the 360 GPU uses a unified system for pixel and vertex shaders; the RSX does not. Because the RSX cannot reassign units between vertex and pixel work, the 360 GPU ends up being much more efficient, and unified shaders are the future of GPU computing.
(Nvidia's new card has it, and ATI's upcoming R600, which is very similar to the 360 GPU, will also have it.)

QUOTE
However, can we agree that both consoles look kickass at top resolutions (even though they're not remotely comparable to 8800GTX in SLI)?

Definitely. In fact, they're not even comparable to one 8800GTX (which is faster than two 7800GTXs in SLI).
Although ATI's R600 has really got me going... smile.gif
I'm going to skip this generation of graphics cards. It's the first iteration of DX10, and they're mostly being judged on DX9 performance rather than DX10 anyway. Plus, my 7800GTX will last me for some time.

The R700 is moving to a multi-core architecture (I believe 3 cores). It's an overall more efficient and cheaper design.
Posted by Felagund on 2006-11-28 at 17:10:03
Then why does the official Xbox website list it as being able to do only 48 billion shader operations per second? I'll be honest: I'm not nearly as interested in the whole PS3 vs. Xbox debate as I am in R600 vs. G80. How can the Xenos do 2 shader (mini-)operations per actual operation? I don't understand heh heh.
Posted by Deathawk on 2006-11-28 at 17:10:11
There is really never a good time to buy technology; I don't think you'd be that bad off buying a G80 or R600 now. If you wait for the R800, you'll just end up waiting for whatever comes after that too, so you might as well buy now if you want high performance .. ~.~... Hardware is getting more powerful while software lags behind, so you'll still be fine with a G80 right now; it will be able to do just about everything for a long time.. tongue.gif
Posted by Cole on 2006-11-28 at 17:24:41
QUOTE
Then why does the official XBox website list it as being able to only do 48 billion shader operations per second? I'll be honest - I'm not that interested in the whole PS3 vs. XBox debate nearly as much as I am in R600 vs. G80. How can the Xenos do 2 shader (mini-operations) per actual operation? I don't understand heh heh.

It can do 2 per clock. How? I don't actually know the extremely in-depth, GPU-designer-level answer, as I'm more of a networker\programmer\computer builder. However, it's nothing new; it's simply making things more efficient. AMD did this to completely annihilate the P4 processors until Intel got off their ass and made the Core 2 Duo (which runs at clock speeds similar to AMD's and goes for efficiency).

http://www.beyond3d.com/articles/xenos/ is one of his sources that goes fairly in depth. I remember reading it a long time ago but nothing recent.
But here are all his sources: http://dpad.gotfrag.com/portal/story/35372/?spage=10

EDIT:
I will try to explain it.
You have 48 shaders, and each shader contains 4 ALUs. Each ALU does one shader operation every cycle, so to get the number of shader operations per cycle we do 48 * 4 = 192.
Now, how many cycles per second? 500,000,000 (500 MHz).
500,000,000 * 192 = 96,000,000,000
(end edit)

Microsoft did state it did 48 billion, although ATI stated differently. This site gave a hypothesis on why Microsoft didn't go with ATI's numbers... (This would also explain why the official Xbox magazine prints the lower figure.)
QUOTE
Did Microsoft just make a mistake or did they purposely misrepresent their GPU to lead Sony on? The 360’s



This GPU will be very similar to the R600 and works basically the same way (different clocks, no EDRAM on the R600, but the architecture is mostly the same). The R600 should be able to beat the 8800GTX. So take the R600, underclock it, give it half the memory on a 128-bit interface, add EDRAM, and you'll have a card similar to the 360's GPU. It will definitely be able to outclass a 7800GTX.

This is what ATI said about the GPU. ATI would be the one to know.
"On chip, the shaders are organized in three SIMD engines with 16 processors per unit, for a total of 48 shaders. Each of these shaders is comprised of four ALUs that can execute a single operation per cycle, so that each shader unit can execute four floating-point ops per cycle."
Posted by Felagund on 2006-11-28 at 18:24:12
Heh, the R600 is 512-bit. It has half the unified shaders of the G80, but it can do 128 shader operations per cycle (same as the G80). I've heard that it may have up to 2 GB (though most rumors point to "only" 1 GB) of GDDR4 memory, which is certainly better than the 8800GTX's 768 MB of GDDR3 memory. I'm placing high hopes on the R600, so maybe I can "survive" with a single R600 as opposed to an SLI setup with G80s (you'd be surprised at how much $650 is still worth).
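For what it's worth, the bandwidth implied by those bus widths is straightforward to compute: bytes per transfer times effective transfer rate. The 8800 GTX figures below (384-bit, 1800 MT/s effective GDDR3) match its published 86.4 GB/s; the R600 data rate is just an assumed round number for a GDDR4 part, since nothing is confirmed:

```python
def mem_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Bandwidth (GB/s) = bytes per transfer * millions of transfers/sec / 1000."""
    return bus_width_bits / 8 * effective_mt_s / 1000

gtx_8800 = mem_bandwidth_gb_s(384, 1800)      # 86.4 GB/s, published spec
r600_rumored = mem_bandwidth_gb_s(512, 2000)  # 128.0 GB/s at an assumed 2000 MT/s
```

Even at a conservative data rate, the rumored 512-bit bus would put the R600 well ahead of the 8800 GTX on raw bandwidth.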
Posted by Deathawk on 2006-11-28 at 19:01:07
Yeah, the Inquirer did say that the R600 would have GDDR4 and a 512-bit memory bus, but it IS the Inquirer after all; they've been wrong a bunch of times before. Who knows right now, wait till it's out tongue.gif
Posted by Felagund on 2006-11-28 at 19:07:01
The X1950XTX used GDDR4, so what in Hell would make DAAMIT (ATI+AMD tongue.gif ) go back to GDDR3 except on mainstream cards?
Posted by Cole on 2006-11-28 at 19:44:48
QUOTE
Heh, the R600 is 512-bit. It has half the unified shaders of the G80, but it can do 128 shader operations per cycle (same as G80). I've heard that it may have up to 2 GB (though most rumors point to "only" 1 GB) of GDDR4 memory, which is certainly better than the 8800GTX's 768 MB of GDDR3 memory. I'm placing high hopes on the R600 so maybe I can "survive" with a single R600 as opposed to an SLI set up with G80 (you'd be surprised at how much $650 is still worth).

I believe it's able to support up to 2 gigs. However, that will be reserved for heavy workstation use, and they may be looking to utilize this architecture for future cards, maybe carrying it over to ATI's OpenGL workstation line.. FireGL????

Currently, what, three games tops use 512 megs of memory? (Although with Crysis and Alan Wake coming along....) Doom 3 at maximum texture quality... maybe Quake 4?? And maybe FEAR???? So I'm not sure ATI would go straight to 1 gig; I think they'll go to 768 just to stay even with the 8800 and keep costs down, because 1 gig at this point is just.....unneeded.

What I would really like to see is a massively improved SLI\CrossFire system. Rather than mirroring the same data in both cards' memory, I would like to see a system where both graphics cards can access each other's memory. That way you could effectively double the memory with an SLI\CrossFire setup.
Posted by Deathawk on 2006-11-28 at 20:03:07
Yeah, it's FireGL.

Yeah, as I've said software is lagging behind hardware.

I think having the choice between 768 megs and 1 gig would be the better option. A few cards offer that kind of choice now, 256 or 512 MB, and it's a nice thing to have.


I would like to see a more streamlined CrossFire or something... some cards require master cards, some don't, etc. They should make it so you just buy two of the same card, with no need for a master card, like how SLI works. I'm not sure if they can actually do that, but it would be nice if they could..
Posted by Felagund on 2006-11-28 at 20:37:18
In SLI, the cards render separate but equally sized parts of the frame, while in CrossFire you have a bunch of choices (the SLI-type approach, splitting the screen up into rectangles to render, or one card rendering odd frames while the other renders even frames). I can see where shared memory would be useful in multi-GPU solutions, except really in alternate frame rendering. However, SLI is a more mature technology than CrossFire, with far more support, so it's likely to receive more funding and improvements. Right now ATI can't compete with the nForce 680 northbridge chipset.
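The three load-splitting schemes listed above can be sketched as toy partitioning functions (illustrative logic only, nothing resembling a real driver):

```python
def split_frame(height, n_gpus):
    """SLI-style split-frame rendering: each GPU takes a horizontal band of scanlines."""
    band = height // n_gpus
    return [(g * band, height if g == n_gpus - 1 else (g + 1) * band)
            for g in range(n_gpus)]

def tile_frame(width, height, tile):
    """CrossFire-style supertiling: cut the frame into small squares to distribute."""
    return [(x, y) for y in range(0, height, tile) for x in range(0, width, tile)]

def alternate_frames(n_frames, n_gpus):
    """Alternate-frame rendering: frame i goes to GPU i mod n_gpus."""
    return {f: f % n_gpus for f in range(n_frames)}

print(split_frame(1200, 2))    # [(0, 600), (600, 1200)]
print(alternate_frames(4, 2))  # {0: 0, 1: 1, 2: 0, 3: 1}
```

Note that under alternate-frame rendering each GPU produces complete frames on its own, so each needs a full copy of the scene data, which is why shared memory would buy the least there.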
Posted by Cole on 2006-11-28 at 21:33:54
QUOTE
would like to see a more defined Crossfire or something... some cards require mastercards, some don't etc.... should make it so you just get two of the same cards, and make it no need for a mastercard, like how SLI works, but I'm not sure if they can actually even do that, would be nice if they could..

With the R600, I believe CrossFire has finally matured to that stage. I believe they were finally able to get rid of all that nonsense. At least that's what I read over at the Inq.


QUOTE
However, SLI is a more mature technology than Crossfire, and with far more support, likely to receive more funding and improvements. Right now ATI can't compete with the nForce 680 northbridge chipset.

I bet Nvidia is pretty happy that they bought up 3DFX.

Too bad 3DFX didn't survive. I mean, they had the first real accelerated graphics API, FSAA(?), and SLI.

I also wonder if S3 will ever be able to make a mainstream card.
Posted by Deathawk on 2006-11-28 at 21:39:31
Good, it's really a pain the way it is now.



And yeah, nVidia buying up the SLI technology was a good move for them. But personally, I don't think SLI is that good. You'll really never need more than one of the higher-end cards at a time. When new cards come out, just keep getting one of them at a time...
Posted by Felagund on 2006-11-28 at 21:48:48
SLI really shows its strength at 2560x1600, even with the 8800 GTX. biggrin.gif 8800 GTX in SLI = able to handle anything in the foreseeable future.
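The reason SLI pays off at 2560x1600 is simple pixel arithmetic: the cards have to fill roughly two to four times the pixels of the other resolutions in this thread:

```python
def megapixels(width, height):
    """Pixels per frame, in millions."""
    return width * height / 1e6

print(megapixels(2560, 1600))  # 4.096
print(megapixels(1600, 1200))  # 1.92
print(megapixels(1280, 800))   # 1.024
```

At 2560x1600 there are exactly 4x the pixels of 1280x800, so fill-rate and shader demands scale accordingly.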
Posted by Deathawk on 2006-11-28 at 21:56:17
I know it's better at higher resolutions, but who actually runs their monitor at that, or even has a monitor that supports those resolutions? Not many people. Also, a lot of people get SLI even though they don't really run at high resolutions.
Posted by Mp)7-7 on 2006-11-28 at 22:49:17
I think my computer is fine at 1280x800, and my friends all tell me they have trouble seeing things at that size. It's a 15.4" widescreen laptop. I think it's the best viewing size possible; any other size looks weird to me now.
Posted by Cole on 2006-11-28 at 22:55:34
1280x800???? That's just a weird resolution. Try 1280x1024. That's much more common and will give you some more vertical space on your desktop.

I use an odd 1600x1024, but only because my monitor won't support 1600x1200; 1600x1024 is simply the highest I can push it. I need a new monitor..
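The aspect ratios being debated here reduce cleanly with a gcd, which makes it easy to see which of these resolutions are widescreen and which aren't:

```python
from math import gcd

def aspect(width, height):
    """Reduce a resolution to its simplest aspect ratio, e.g. 1600x1200 -> '4:3'."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect(1280, 800))   # 8:5, i.e. 16:10 widescreen
print(aspect(1600, 1200))  # 4:3, the classic CRT shape
print(aspect(1280, 1024))  # 5:4
print(aspect(1600, 1024))  # 25:16, the odd in-between ratio
```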
Posted by Mp)7-7 on 2006-11-28 at 22:59:37
It's because it's widescreen, like I said before. Anything else looks weird to me. Yeah, I know it's different, but with the widescreen it looks really nice.
Posted by Deathawk on 2006-11-28 at 23:17:51
I have this monitor running at 1600x1200 @ 80Hz. It's a pretty sweet monitor, a CRT built for CAD work; I think it's worth like 800 dollars new / 400 dollars used online. Its image quality is awesome, and it's 21 inches, I think.

I have two, so it's going to be pretty awesome, but I'm going to have to pick up a DVI-to-VGA adapter to replace the one I lost, which hasn't allowed me to actually use my monitors >.< They're only like 50 cents, I'm just lazy.
Posted by Cole on 2006-11-28 at 23:27:42
QUOTE
I have two, it's going to be pretty awesome, but I'm going to have to pick up a DVI to VGA adapter which I lost, which hasn't allowed me to actually use my monitors >.< but they're only like 50 cents, I'm just lazy.

Send me one!!!
I used to run 2 monitors, but my desk space was somewhat limited (I actually just fixed that today). It worked really well for programming: I could have a file format I'm working with on one screen and the IDE on the other, as well as a bunch of other neat stuff.
Posted by Deathawk on 2006-11-28 at 23:37:05
Well, these things are seriously huge, so shipping would cost a ton; I get them for free, though. They're CRTs, so what do you expect tongue.gif


Here is a pretty in depth review on the monitors.

http://www.gamepc.com/labs/view_content.as...cookie%5Ftest=1

Read a few paragraphs down.. "The overall size of the monitor is, for the lack of a better word, huge"

This article was written in 1999, but this is still one kickass monitor tongue.gif
Posted by JoJo. on 2006-11-30 at 15:29:16
I really couldn't care less about graphics. The Xbox and PS3 are both just like computers, but the Wii, on the other hand, is something we haven't really seen before, and the price is well worth it. The games are also cheaper than on the Xbox and PS3. I don't see why people would ever say that PS3 or Xbox graphics make one system better than another; comparing them by that factor makes it seem like graphics are all a console is about. I've played Zelda: Twilight Princess for the Wii, and those are damn good graphics. Maybe you're right that the PS3 and Xbox have better graphics, but I don't have some graphics fetish. Anyway, you guys should compare things like what games come out for the console, price, and quality, not just graphics, please.
Posted by Cole on 2006-11-30 at 18:23:54
QUOTE
Anyway you guys should compare things like what games come out for the console, price, quality, and not just graphics please.

We stopped comparing consoles quite early on, actually. For the most part we were comparing the technology in the consoles' GPUs.

The argument leading up to that was about the CPU/GPU technology and which one was faster. We weren't talking about anything else, and you shouldn't take it as if we were. Of course it's the games that mean everything; that's why you buy a console.

Simply put, when it comes to which console is best, there's no right answer. It's an opinion based on the games you like for that console, so it would be a dumb debate. The only thing you could try to play up is that right now the 360 has a larger library, but a counter-argument could be that no 360 games interest you. There's simply too much opinion in such a debate.
Posted by Neiji on 2006-12-01 at 12:34:35
Wait, I can only get ONE console... I'm trying to decide between the 360 and the Wii... WHICH ONE?!