Author Topic: ATI vs nVidia  (Read 15980 times)

-Archmage-

  • Moderator
  • Dragon
  • ********
  • Posts: 5,887
  • Make it so.
    • View Profile
    • My Website
Re: ATI vs nVidia
« Reply #50 on: 12 December 2010, 21:48:34 »
Quote from: Omega on Today at 01:52:21
Best I tried was an over-maxed Oblivion... Did you know the INI values can exceed the in-game settings?

I do, and I bet Arch does too. Crysis can have some huge graphic changes from using the INI or file (I don't remember what it is called), but you can change specularity, normal map density, strength, water reflection quality, and so on.
Hunter mod is best, beautiful scenery and all.
Yea, I've been modding up Crysis a ton, working on ragdoll properties now. :P I really don't get why Crysis was released with such overdone ragdoll physics; you shoot someone with the SCAR and they take flight!
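This kind of INI tweaking can be scripted instead of hand-edited. A minimal Python sketch; the file name, section, and key in the example call are illustrative, so check your game's actual INI before using it:

```python
import configparser

def bump_ini_setting(path, section, key, value):
    """Push an INI value past what the in-game sliders expose.
    Works on any key=value style INI file."""
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # game INI keys are case-sensitive; don't lowercase them
    cfg.read(path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    cfg.set(section, key, str(value))  # configparser stores values as strings
    with open(path, "w") as f:
        cfg.write(f)

# Illustrative call -- the often-cited Oblivion draw-distance tweak:
# bump_ini_setting("Oblivion.ini", "General", "uGridsToLoad", 9)
```

Setting `optionxform = str` matters here: by default `configparser` lowercases keys, which would silently break case-sensitive game settings.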


Just don't plan on gaming on one of these.  I actually got to play around with one the other day, as my landlord just got one for a not-very-tech-savvy friend.



Ouch.....
Egypt Remastered!

Proof: Owner of glest@mail.com

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #51 on: 12 December 2010, 21:50:54 »
Just don't plan on gaming on one of these.  I actually got to play around with one the other day, as my landlord just got one for a not-very-tech-savvy friend.


Hey, that looked awesome! Is that the Millennium version of W2k? Oh man, so nostalgic, I want it!

John.d.h

  • Moderator
  • Airship
  • ********
  • Posts: 3,757
  • I have to go now. My planet needs me.
    • View Profile
Re: ATI vs nVidia
« Reply #52 on: 12 December 2010, 22:03:31 »
Hey, that looked awesome! Is that the Millennium version of W2k? Oh man, so nostalgic, I want it!
No, that's Windows ME you're thinking of.  CE is like a mobile version of Windows 98.  It actually runs pretty fast, which should be obvious when you think about the fact that computers with 128MB of RAM were just as fast as the computers we have now, before operating systems got so bloated that you need a gig just to boot up. :|  For what most people use their computers for on a daily basis (email, facebook, news, kitty pictures), it does it fairly well.  Of course, it's not compatible with much modern software, and it's got like a 2GB solid state drive, so you're basically stuck with what comes with it, including Internet Explorer, unless you find some bare-bones *nix that will run on it.

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #53 on: 12 December 2010, 23:22:43 »
I just need to find Mosaic and then I'm good to go, huh? :O

-Archmage-

Re: ATI vs nVidia
« Reply #54 on: 18 December 2010, 09:48:39 »
We have a new 850W gaming power supply for my incoming 5870; so far things are grim... piece of ****... May have to make use of that warranty.

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #55 on: 18 December 2010, 12:12:38 »
The 5870 doesn't consume more than what a 550W can handle anyway :D

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #56 on: 20 December 2010, 07:12:58 »
DX10 games < DX11 games ;)

-Archmage-

Re: ATI vs nVidia
« Reply #57 on: 20 December 2010, 11:17:15 »
Someday, Gabbe... someday, I'm gonna catch up to you! :P

Let me guess: you have 5870X2s in quad CrossFireX? A highly overclocked 6-core processor? That's the most powerful rig possible, to my knowledge...

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #58 on: 20 December 2010, 14:13:53 »
Actually you can have 6890s overclocked with watercooling on all, and you can have TWO 6-core processors on specially made motherboards ;) Meh, you can try, but u won't cuz me got best rig evah!!11! :D

-Archmage-

Re: ATI vs nVidia
« Reply #59 on: 20 December 2010, 20:47:18 »
Dude, I regret to inform you that the 5870X2 is more powerful than the 6890, because it's two cards on top of each other. Most of the stats I find seem to be all over the place, but I'm very sure that the 5870 is still ATI's single-card beast. 8)

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #60 on: 21 December 2010, 10:42:10 »
Sorry, typo: the 6970 is best when overclocked. And yeah, it is out on some markets :angel: :angel: I haven't compared the nVidia 585/580s yet, so unsure there. AND I'm pretty sure that the 5870X2 is just an overclocked 5970, since the 5970 IS two 5870s on top, just downclocked to match two 5850s.

-Archmage-

Re: ATI vs nVidia
« Reply #61 on: 21 December 2010, 11:03:25 »
5870X2: 2 separate cards on top of each other working as one.
5970:    1 card with two 5870 cores, underclocked.

Anyway, look at this: http://unlimiteddetailtechnology.com/home.html

It actually sounds workable and will (hopefully) be amazing. But it'll take all the fun out of computer hardware. :look: Which is why I actually don't want it to happen. All competition will be gone; it may look good, but it will completely destroy the fun of having a good system.

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #62 on: 21 December 2010, 11:35:03 »
Ok.

Germany is getting a new supercomputer with processing power of 3 petaflops, but soon, in 2011, the USA will be getting one that can process 20 petaflops. I want one PETAbyte of HDD!

-Archmage-

Re: ATI vs nVidia
« Reply #63 on: 21 December 2010, 12:20:27 »
Ok.

Germany is getting a new supercomputer with processing power of 3 petaflops, but soon, in 2011, the USA will be getting one that can process 20 petaflops. I want one PETAbyte of HDD!

Awesome!

John.d.h

Re: ATI vs nVidia
« Reply #64 on: 21 December 2010, 13:07:23 »
It actually sounds workable and will (hopefully) be amazing. But it'll take all the fun out of computer hardware. :look: Which is why I actually don't want it to happen. All competition will be gone; it may look good, but it will completely destroy the fun of having a good system.
Because you like having to pay thousands of dollars over time to upgrade your system over and over, so all your old parts end up in a landfill somewhere?  How is there anything at all to like about that?

(I guess if hardware stops becoming obsolete as fast, manufacturers will just have to make it unreliable, so it breaks more often. ::))

-Archmage-

Re: ATI vs nVidia
« Reply #65 on: 21 December 2010, 13:32:27 »
It actually sounds workable and will (hopefully) be amazing. But it'll take all the fun out of computer hardware. :look: Which is why I actually don't want it to happen. All competition will be gone; it may look good, but it will completely destroy the fun of having a good system.
Because you like having to pay thousands of dollars over time to upgrade your system over and over, so all your old parts end up in a landfill somewhere?  How is there anything at all to like about that?

(I guess if hardware stops becoming obsolete as fast, manufacturers will just have to make it unreliable, so it breaks more often. ::))

There is a certain joy in having a powerful computer; you have a bunch of weak laptops, so you haven't really experienced it yet. But it's all the graphics competition and the fun of graphics. This new thing is going to piss away all that fun; there will be no more satisfaction in having a powerful computer, because someone on their smartphone will be able to run a game like Crysis no problem. Crysis only has good graphics because it's just about the best there is, but if you made every game with CryEngine2 just like Crysis, how awesome would Crysis graphics be? Not at all. In effect it takes down a major drive for game developers and significantly weakens the gaming industry. The whole point of games is FUN! Universally good graphics won't be fun; it'll be dull and un-innovative. It's anti-creative: no more need for innovative programming, no more need to design for performance, just blow it up with polygons, nobody cares.

Gabbe what's your take on this, you obviously know how fun it is to have a powerful computer?

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #66 on: 21 December 2010, 14:16:31 »
It actually sounds workable and will (hopefully) be amazing. But it'll take all the fun out of computer hardware. :look: Which is why I actually don't want it to happen. All competition will be gone; it may look good, but it will completely destroy the fun of having a good system.
Because you like having to pay thousands of dollars over time to upgrade your system over and over, so all your old parts end up in a landfill somewhere?  How is there anything at all to like about that?

(I guess if hardware stops becoming obsolete as fast, manufacturers will just have to make it unreliable, so it breaks more often. ::))

What is the fun in your art? What is the fun in whatever hobby you have? What is the fun in good-tasting food? What is the fun in wine? What is the fun?

Omega

  • MegaGlest Team
  • Dragon
  • ********
  • Posts: 6,167
  • Professional bug writer
    • View Profile
    • Personal site
Re: ATI vs nVidia
« Reply #67 on: 22 December 2010, 07:21:12 »
Actually, that unlimited detail sounds amazing, Arch! A way to "compress" the rendered graphics in real time to display graphics with poly counts which would otherwise be impossible? Then even the worst graphics cards could do decently on modern games! Of course, there's a lot of work to go, but that could very well be something of the future of gaming graphics... If only I could render that sample screenshot in realtime... :o
Edit the MegaGlest wiki: http://docs.megaglest.org/

My personal projects: http://github.com/KatrinaHoffert

John.d.h

Re: ATI vs nVidia
« Reply #68 on: 22 December 2010, 09:05:25 »
There is a certain joy in having a powerful computer; you have a bunch of weak laptops, so you haven't really experienced it yet. But it's all the graphics competition and the fun of graphics. This new thing is going to piss away all that fun; there will be no more satisfaction in having a powerful computer, because someone on their smartphone will be able to run a game like Crysis no problem. Crysis only has good graphics because it's just about the best there is, but if you made every game with CryEngine2 just like Crysis, how awesome would Crysis graphics be? Not at all. In effect it takes down a major drive for game developers and significantly weakens the gaming industry. The whole point of games is FUN! Universally good graphics won't be fun; it'll be dull and un-innovative. It's anti-creative: no more need for innovative programming, no more need to design for performance, just blow it up with polygons, nobody cares.
So what you're saying is that you want everyone's graphics to suck so that yours can suck slightly less so you can keep up your smug sense of technological superiority?  Also, that developers who make terrible games that merely look good will crash and burn because they have no idea how to make a game that's actually interesting on any dimension other than "Hey, this one looks pretty, so I guess I'll buy it", so developers who actually have good ideas about making interesting story lines, immersing the player in an interesting setting (which, need I remind you, is the entire point of video games) through their interactions with unique characters and plots, and inventing new gameplay mechanics we haven't experienced before (i.e. actually furthering the art form), will thrive and make lots of money?


Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #69 on: 22 December 2010, 12:09:04 »
It would ruin the fun in having a gaming computer! :'( But it would still be awesome... but then any piece of sh*t card could run Crysis 3, and multiplayer on some games would flood over with noob 9-year-olds... *crying*

I think the graphics on the examples they have look 2D-ish though...

http://www.euclideon.com/press.html

"It is our hope that in 12-16 months we will humbly submit our demos to the approval of our fans and customers."


:OOOOOOOO!!! in 2012 then ;)
« Last Edit: 22 December 2010, 12:27:25 by Gabbe »

-Archmage-

Re: ATI vs nVidia
« Reply #70 on: 23 December 2010, 10:05:25 »
There is a certain joy in having a powerful computer; you have a bunch of weak laptops, so you haven't really experienced it yet. But it's all the graphics competition and the fun of graphics. This new thing is going to piss away all that fun; there will be no more satisfaction in having a powerful computer, because someone on their smartphone will be able to run a game like Crysis no problem. Crysis only has good graphics because it's just about the best there is, but if you made every game with CryEngine2 just like Crysis, how awesome would Crysis graphics be? Not at all. In effect it takes down a major drive for game developers and significantly weakens the gaming industry. The whole point of games is FUN! Universally good graphics won't be fun; it'll be dull and un-innovative. It's anti-creative: no more need for innovative programming, no more need to design for performance, just blow it up with polygons, nobody cares.
So what you're saying is that you want everyone's graphics to suck so that yours can suck slightly less so you can keep up your smug sense of technological superiority?  Also, that developers who make terrible games that merely look good will crash and burn because they have no idea how to make a game that's actually interesting on any dimension other than "Hey, this one looks pretty, so I guess I'll buy it", so developers who actually have good ideas about making interesting story lines, immersing the player in an interesting setting (which, need I remind you, is the entire point of video games) through their interactions with unique characters and plots, and inventing new gameplay mechanics we haven't experienced before (i.e. actually furthering the art form), will thrive and make lots of money?


I never said I want everyone else's graphics to suck, but I want them to earn it, not get it through some cheap new system. There's an immense amount of harmless fun in having a powerful gaming system. It also fuels the whole gaming industry. If all the graphics are the same, nobody is going to GIVE A SHIT after 6 months. Why? Because then you might as well read a book. And why pay $100 for the 500th game to use the same exact graphics, when you can get a book that has simply the story, which will become the only innovative part, for about $10? I personally hope they price this new stuff real high so it doesn't completely shut down the gaming industry. This will be horrible for AMD and Nvidia.

You clearly haven't played very many high-detail games. Crysis has a great story! Far better than most movies.
They talk about models, hmm... How about shaders, particles, lighting, textures and other little effects? There must be a cost for all that! Anyway, Zoythrus did say when he told me about it that it takes a lot of RAM. GREAT! RAM is a bit more expensive than graphics cards, and no, the prices will not continue to drop when this comes out.

No more innovation, just throw everything you've got at it. Too much of anything IS BAD. You can have too much sex, bad food, health food, exercise, couch-potato-ness, ANYTHING!

My final comment: why is Crysis amazing? Not because it has good graphics, but because it can do mass quality physics and amazing particles in real-time, efficiently. And at the same time, boast immense lighting/shading that doesn't bog down your computer much.

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #71 on: 23 December 2010, 10:25:46 »
RAM is probably the cheapest sh!t you'll ever find :P I could find RAM modules clocked at 1600MHz without huge latency, and that was 4GB in one module. It cost 400 NOK, which I believe is about 50-80 USD, but I'm not sure. Nowadays most motherboards have at least 4 slots where you can put RAM: 4*4=16GB running at the current max speed of 1600MHz.
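Those numbers roughly check out. A quick back-of-the-envelope sketch (the exchange rate is an assumption; roughly 6 NOK per USD was in the right ballpark for late 2010):

```python
# Back-of-the-envelope check on the RAM numbers above.
NOK_PER_USD = 6.0      # assumed late-2010 exchange rate

module_gb = 4          # one 4GB DDR3-1600 module
module_price_nok = 400 # quoted price per module
slots = 4              # typical consumer board of the era

total_gb = module_gb * slots
total_nok = module_price_nok * slots
total_usd = total_nok / NOK_PER_USD

print(f"{total_gb}GB for {total_nok} NOK (~{total_usd:.0f} USD)")
# -> 16GB for 1600 NOK (~267 USD)
```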

Arch, I don't think the games will have the same graphics, because when he (the maker) was telling us about it in the videos on the site you linked to, he said that the modeller could animate and/or model inside a 3D-modeling program and then export into point cloud data, which is his format. That would make every game different in graphic style. It would only make it so that if a modeller is really good with polygons and has the power to create massive high-poly stuff, then games would probably look real. What I'm also wondering about is shading; as presented in the video, they said they had a problem with their shaders so that flickering occurs. Usually that only happens to bridged GPUs.
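The mesh-to-point-cloud export step described here can be approximated generically by sampling points on the triangles of a polygon mesh. A rough Python sketch of that idea (a standard technique, not Euclideon's actual format or pipeline):

```python
import random

def sample_point_cloud(triangles, n_points):
    """Convert a polygon mesh (list of triangles, each a triple of 3D
    vertices) into a point cloud by random surface sampling."""
    def area(a, b, c):
        # Half the magnitude of the cross product of two edge vectors.
        u = [b[i] - a[i] for i in range(3)]
        v = [c[i] - a[i] for i in range(3)]
        cx = u[1] * v[2] - u[2] * v[1]
        cy = u[2] * v[0] - u[0] * v[2]
        cz = u[0] * v[1] - u[1] * v[0]
        return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5

    # Area-weight triangles so sampling density is uniform over the surface.
    weights = [area(*t) for t in triangles]
    cloud = []
    for _ in range(n_points):
        a, b, c = random.choices(triangles, weights=weights)[0]
        # Uniform barycentric coordinates via the square-root trick.
        r1, r2 = random.random(), random.random()
        s = r1 ** 0.5
        u, v = 1 - s, s * (1 - r2)
        w = 1 - u - v
        cloud.append(tuple(u * a[i] + v * b[i] + w * c[i] for i in range(3)))
    return cloud
```

The square-root trick keeps points evenly spread within each triangle, and the area weighting keeps density even across triangles of different sizes.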

-Archmage-

Re: ATI vs nVidia
« Reply #72 on: 23 December 2010, 10:54:44 »
RAM is probably the cheapest sh!t you'll ever find :P I could find RAM modules clocked at 1600MHz without huge latency, and that was 4GB in one module. It cost 400 NOK, which I believe is about 50-80 USD, but I'm not sure. Nowadays most motherboards have at least 4 slots where you can put RAM: 4*4=16GB running at the current max speed of 1600MHz.

Arch, I don't think the games will have the same graphics, because when he (the maker) was telling us about it in the videos on the site you linked to, he said that the modeller could animate and/or model inside a 3D-modeling program and then export into point cloud data, which is his format. That would make every game different in graphic style. It would only make it so that if a modeller is really good with polygons and has the power to create massive high-poly stuff, then games would probably look real. What I'm also wondering about is shading; as presented in the video, they said they had a problem with their shaders so that flickering occurs. Usually that only happens to bridged GPUs.

Hmm...

About RAM: Of course I don't happen to know how much a lot costs, but 12+GB of RAM gets pretty expensive.

Gabbe

  • Guest
Re: ATI vs nVidia
« Reply #73 on: 23 December 2010, 11:03:37 »
3*400 NOK = 1200 NOK ~ 150-240 USD

John.d.h

Re: ATI vs nVidia
« Reply #74 on: 23 December 2010, 11:08:52 »
I never said I want everyone else's graphics to suck, but I want them to earn it, not get it through some cheap new system.
So having your parents buy you a fancy computer with a nice graphics card entitles you to something? :|
Quote
There's an immense amount of harmless fun in having a powerful gaming system. It also fuels the whole gaming industry. If all the graphics are the same, nobody is going to GIVE A SHIT after 6 months.
As opposed to how things are now, where they just move on to the next game six months later because the new one is prettier?
Quote
Why? Because then you might as well read a book. And why pay $100 for the 500th game to use the same exact graphics, when you can get a book that has simply the story, which will become the only innovative part, for about $10?
OH NO, PEOPLE MIGHT START READING AGAIN! :o :O Sorry, but that was hilarious. :)
Besides, games have so many more tools at their disposal for telling a story.  They can branch out and have many possibilities, things can turn out differently every time you play them, and you get to interact with the characters and environment in a way that you can't through a book.
Quote
I personally hope they price this new stuff real high so it doesn't completely shut down the gaming industry. This will be horrible for AMD and Nvidia.
Because poor people don't deserve nice things, but spoiled kids do?
Quote
You clearly haven't played very many high-detail games. Crysis has a great story! Far better than most movies.
And that has nothing to do with their graphics.  Look at Halo for example.  It's fun, but it's mindless fun.  The characters are completely forgettable, and the plot... wait, what plot?  If you want to play a game where you get to work out your aggression and look at pretty explosions, the Halo series is a good choice, but you're not going to get any intellectual stimulation from it.  I'll take a game with a decent story but outdated graphics over something that's pure graphics porn.
Quote
No more innovation, just throw everything you've got at it.
Wait what?  No more innovation?  This is innovation, real innovation (i.e. coming up with something new that nobody's seen before), as opposed to a very slight tweak by adding more and more polygons every year because now the hardware can handle it.  And you see who's opposing this actual innovation?  Don't complain about a lack of innovation when you're the one opposing it.
Quote
My final comment: why is Crysis amazing? Not because it has good graphics, but because it can do mass quality physics and amazing particles in real-time, efficiently. And at the same time, boast immense lighting/shading that doesn't bog down your computer much.
... and if developers don't have to worry so much about graphics, then they can freely focus on making things like that better, instead of releasing the same damn football game every year with a very slight graphics makeover.  They can explore new gameplay mechanics and ideas that haven't been done before.

I honestly don't see why you're opposing this, unless it's just because you won't get to brag and be smug about the gaming rig your parents bought you. :|