The future is fusion!
I've never written such a thing :| ::) :cheesy: :dead: :look:
Just to toss my hat into the ring (or however the expression goes), this new netbook of mine has a VIA, and it's got no 3D acceleration on Linux, so it's pretty much worthless for gaming. This machine's plenty fast when it doesn't have to render anything, but even Wesnoth runs pretty slowly, and Glest gives me a whopping one frame per second. :thumbdown:

VIA, huh? Hmm... I remember them...
Ouch. :-X

But, better than 0. :|
lol, i know someone with an ATI card in a laptop...
and it sucks horrendously.
(okay, i don't know, but srsly, don't bring laptops into this, we all know laptop GFX sucks anyway, no need to get low).
lol, i know someone with an ATI card in a laptop...
i'm not going to comment on laptop comments. srsly.

No way! My laptop has an ATI HD graphics card with 1GB of graphics memory. I've tried every game I have on my computer at the highest settings. Not even the slightest choppy graphics, and always over 20fps even on supreme graphics! My laptop owns both of my dad's computers put together! Really, how can it not? It even has a higher resolution (1600x900, pretty much the biggest popular widescreen resolution; there's bigger, but not common). GAE normal to GAE DE? Not even the slightest graphical impairment!
Desktop is superior to laptop.
ATI vs. Nvidia is a matter of taste;
sometimes Nvidia is in front, other times it is ATI.
nuff said.
Chupareaper: I'm running a Windows 7 64-bit laptop with an ATI RADEON HD 3200, and it's working great! Whatever problem you have is with your computer, not with ATI.
i hate intel. still, gotta love ATI, 3 displays on one card FTW! go Eyefinity! soon with 3D too. imagine three monitors with 3D.

AMD? You're kidding! Intel has the best processors ever!
I hate Intel's overpriced processors! They have just as much power as the AMD hexa-cores, yet they cost five times the price.
ATI AMD! FUTURE IS FUSION!
My Dad spent hours and hours making nVidia work; ATI came right up, and runs dual-booted Linux and Win7 FLAWLESSLY.
Chipsets always suck no matter what. I've learned my lesson from nVidia: shop ATI dedicated graphics or don't shop at all. :P
ATI is raw awesomeness. nVidia is big bulks of weak power that brag about PhysX. If only AMD hadn't bought ATI, but rather formed a "dual-company"...
Quote from: -Archmage- on December 11, 2010, 01:39:43

Oh... What do I know? :P Thanks. :)
My Dad spent hours and hours making nVidia work; ATI came right up, and runs dual-booted Linux and Win7 FLAWLESSLY.
Chipsets always suck no matter what. I've learned my lesson from nVidia: shop ATI dedicated graphics or don't shop at all. :P
ATI is raw awesomeness. nVidia is big bulks of weak power that brag about PhysX. If only AMD hadn't bought ATI, but rather formed a "dual-company"...
AMD always owned ATI; they just changed the name.
Quote
You can't fairly compare your computer with anyone's; yours is like a $5000 computer, with 4 graphics cards and a 6-core processor, all overclocked heavily. :P
You can swap parts on notebooks for a fair price; it is just harder, and you get less performance.
Laptops don't have to have bad graphics cards; they are simply usually limited in modifying them, which makes it important to get a good laptop when you buy it. My laptop's graphics card can match any desktop, and exceed that of most desktops I've seen anyway. I know that Windows Experience Index is a bad way to compare things, but having no other benchmark, my laptop hits 6.6 in both graphics categories (it's a scale from 1.0 to 7.9), and it's the highest of the categories.
Laptops suffer from too many stereotypes because of some bad laptops. If you keep away from cheap laptops, you'll find plenty of hidden gems... Oh, and don't buy unknown brands, no matter their "stats", because they tend to perform worse than better brands of the same "stats". HP is my official favorite now, after performing so well for me for so long...
It isn't a stereotype; on laptops you have limited power, limited airspace and so on, which means you must have a downclocked card or a fan worse than your local airport when ten jets lift off at the same time crashing into each other. Your laptop's card cannot match my desktop in any possible way, and is a long shot from exceeding it. You should also give all the information the index gives you (RAM, Aero performance, data transfer numbers, processing power and so on); then you can get a good comparable result. Dell delivers power, and Asus delivers power for a fair price. I don't know much about new HPs, but the older ones were just tragic dumps :P Laptops just aren't made for gaming, and that's something someone has to get over.
I use a plug-in USB mouse with my laptop; helps me game when I'm at my Mom's.

It isn't a stereotype, on laptops you have limited power, limited airspace and so on

I can't really argue with that. Desktops aren't nearly so limited by space and compactness of design, so you can have more processing cores, bigger cooling vents, etc., that you just can't fit into a laptop. Laptops tend to be noisy and hot, and have you ever tried to use Blender or the GIMP on one of these?
<snip/>
Laptops just aren't made for gaming, and that's something someone has to get over.
I simply use laptops because wifi, mobility, and space are big priorities for me. I don't have room for a tower, monitor, keyboard, mouse, and all the associated cords and cables...
I use my laptop for school projects. I take it traveling. I go lots of places with it. See why laptops win?

Laptops are nice for mobile stuff (not allowed to take my laptop to LV :-[), but for power and stable network connections (taken care of in my case with a cable from the hub), desktops are MUCH better.
Yuck, Mac... Most horrid OS ever written! You've got a powerful version of everything, dude. :cheesy:

I use my laptop for school projects. I take it traveling. I go lots of places with it. See why laptops win?
No, for you they do, and don't think i haven't got a powerful laptop too ;) for traveling purposes and for other reasons. For school projects i get a free MacBook anyway.
Best I tried was an over-maxed Oblivion... Did you know the INI values can exceed the in-game settings?
My graphics card is an ATI Mobility Radeon HD 4650. Never had any problems; it can play games well without lag on the highest settings (albeit, I admit, I don't have any truly stressing games).
Quote from: Omega on Today at 01:52:21

Yea, I've been modding up Crysis a ton, working on ragdoll properties now. :P I really don't get why Crysis was released with such overdone ragdoll physics; you shoot someone with the SCAR and they take flight!
Best I tried was an over-maxed Oblivion... Did you know the INI values can exceed the in-game settings?
I do, and i bet Arch does too. Crysis can have some huge graphics changes from using the INI file (i don't remember exactly what it is called); you can change specularity, normal map density, strength, water reflection quality, and so on.
Hunter mod is best, beautiful scenery and all.
Just don't plan on gaming on one of these (http://www.sylvanianetbooks.com/). I actually got to play around with one the other day, as my landlord just got one for a not-very-tech-savvy friend.
(http://digitalgadgets.com/images/product1.jpg)
Hey, that looked awesome! Is that the millennium version of W2k? Oh man, so nostalgic, i want it!
No, that's Windows ME you're thinking of. CE is like a mobile version of Windows 98. It actually runs pretty fast, which should be obvious when you think about the fact that computers with 128MB of RAM were just as fast as the computers we have now, before operating systems got so bloated that you need a gig just to boot up. :| For what most people use their computers for on a daily basis (email, facebook, news, kitty pictures), it does fairly well. Of course, it's not compatible with much modern software, and it's got like a 2GB solid-state drive, so you're basically stuck with what comes with it, including Internet Explorer, unless you find some bare-bones *nix that will run on it.
Ok.
Germany is getting a new supercomputer with a processing power of 3 PETAFLOPS, but soon in 2011 the USA will be getting one that can process 20 PETAFLOPS. I want one PETAbyte of HDD!
It actually sounds workable and will (hopefully) be amazing. But it'll take all the fun out of computer hardware. :look: Which is why I actually don't want it to happen. All competition will be gone; it may look good, but it will completely destroy the fun of having a good system.

Because you like having to pay thousands of dollars over time to upgrade your system over and over, so all your old parts end up in a landfill somewhere? How is there anything at all to like about that?
(I guess if hardware stops becoming obsolete as fast, manufacturers will just have to make it unreliable, so it breaks more often. ::))
There is a certain joy in having a powerful computer; you have a bunch of weak laptops, so you haven't really experienced it yet. But it's all the graphics competition and the fun of graphics. This new thing is going to piss away all that fun; there will be no more satisfaction in having a powerful computer, because someone on their smartphone will be able to run a game like Crysis no problem. Crysis only has good graphics because it's just about the best there is, but if you made every game with CryEngine2 just like Crysis, how awesome would Crysis graphics be? Not at all. In effect it takes down a major drive for game developers and significantly weakens the gaming industry. The whole point of games is FUN! Universally good graphics won't be fun; it'll be dull and un-innovative. It's anti-creative: no more need for innovative programming, no more need to design for performance, just blow it up with polygons, nobody cares.

So what you're saying is that you want everyone's graphics to suck so that yours can suck slightly less, so you can keep up your smug sense of technological superiority? Also, that developers who make terrible games that merely look good will crash and burn because they have no idea how to make a game that's actually interesting on any dimension other than "Hey, this one looks pretty, so I guess I'll buy it", so developers who actually have good ideas about making interesting story lines, immersing the player in an interesting setting (which, need I remind you, is the entire point of video games) through their interactions with unique characters and plots, and inventing new gameplay mechanics we haven't experienced before (i.e. actually furthering the art form), will thrive and make lots of money?
RAM is probably the cheapest sh!t you'll ever find :P I could find RAM sticks clocked at 1600MHz without a huge latency, and that was 4GB in one stick. It cost 400 NOK, which i believe is about 50-80 USD, but i'm not sure. Nowadays most motherboards have at least 4 slots where you can put RAM: 4*4 = 16GB running at the current max speed of 1600MHz.
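Just to sanity-check that arithmetic (the slot count, stick size, and NOK price are from the post above; the NOK-to-USD exchange rate is my own rough assumption, not an official figure):

```python
# Sanity-checking the RAM numbers above. Values come from the post;
# the exchange-rate range is an assumption for illustration only.
slots = 4            # typical motherboard DIMM slots
gb_per_stick = 4     # 4GB per 1600MHz stick
price_nok = 400      # quoted price per stick in Norwegian kroner

total_gb = slots * gb_per_stick
print(total_gb)      # 16 (GB total, matching the 4*4 = 16GB claim)

# Assuming roughly 6-8 NOK per USD, 400 NOK lands in this range:
low, high = price_nok / 8, price_nok / 6
print(round(low), "-", round(high), "USD")  # roughly 50 - 67 USD
```

So the 16GB total checks out, and the 50-80 USD guess is in the right ballpark at plausible exchange rates.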
Arch, I don't think the games will have the same graphics, because when he (the maker) was telling us about it in the videos on the site you linked to, he said that the modeller could animate and/or model inside a 3D modeling program and then export into point cloud data, which is his format. That would make every game different in graphic style. It would only mean that if a modeller is really good with polygons and has the power to create massive high-poly stuff, then games would probably look real. What i'm also wondering about is shading: as presented in the video, they said they had a problem with their shaders so that flickering occurs. Usually that only happens with bridged GPUs.
Quote from: -Archmage- on Today at 05:05:25

Nobody bought me a nice new computer. I got a Radeon HD 5870. It's not going to mean anything in a couple of years; AMD and Nvidia will fall, no more awesome thriving graphics competition.
I never said I want everyone else's graphics to suck, but I want them to earn it, not get it through some cheap new system.
So having your parents buy you a fancy computer with a nice graphics card entitles you to something? :|
Quote
You surely haven't played Crysis or Modern Warfare 2, which are both tremendous games. But true, a lot of games aren't really anything new these days.
There's an immense amount of harmless fun through having a powerful gaming system. It also fuels the whole gaming industry. If all the graphics are the same, nobody is going to GIVE A SHIT after 6 months.
As opposed to how things are now, where they just move on to the next game six months later because the new one is prettier?
Quote
You realize that people will still focus on graphics, and it will very likely be much the same as it is, aside from the graphics.
Why? Because then you might as well read a book. And why pay $100 for the 500th game to use the same exact graphics, when you can get a book that has simply the story which will become the only innovative part, for about $10.
OH NO, PEOPLE MIGHT START READING AGAIN! :o :O Sorry, but that was hilarious. :)
Besides, games have so many more tools at their disposal for telling a story. They can branch out and have many possibilities, things can turn out differently every time you play them, and you get to interact with the characters and environment in a way that you can't through a book.
Quote
I'm not spoiled at all; in fact my family is divorced and rather poor. We do one-time things and do them right; that doesn't make us rich. It feels like socialism to me, everybody gets stuff they didn't pay for, even though in this case it's due to technological advancement, and of course they ARE getting what they paid for, it just doesn't feel right. I would support it, I just think it's going to spoil graphics... even more.
I personally hope they price this new stuff real high so it doesn't completely shut down the gaming industry. This will be horrible for AMD and Nvidia.
Because poor people don't deserve nice things, but spoiled kids do?
Quote
Don't even try to compare Halo and MW2; MW2 has a tremendous storyline, and Crysis has unbelievable graphics and storyline.
You clearly haven't played very many high-detail games. Crysis has a great story! Far better than most movies.
And that has nothing to do with their graphics. Look at Halo for example. It's fun, but it's mindless fun. The characters are completely forgettable, and the plot... wait, what plot? If you want to play a game where you get to work out your aggression and look at pretty explosions, the Halo series is a good choice, but you're not going to get any intellectual stimulation from it. I'll take a game with a decent story but outdated graphics over something that's pure graphics porn.
Quote
I guess I tend to ignore games like Halo and stupid repetitive crap, because all I play are heavy-duty games that are immensely good. Crysis was and will forever be the greatest innovation of gaming. It used an incredibly bad polygon-based type of rendering and made it unbelievable. I agree this will be a pretty awesome innovation, but I still don't like an iPhone being as powerful as my custom-built computer, which, no, was not randomly bought; it has been custom built and upgraded carefully. I mean all that careful research and hardware, poof, I just want to strangle somebody.
No more innovation, just throw everything you've got at it.
Wait what? No more innovation? This is innovation, real innovation (i.e. coming up with something new that nobody's seen before), as opposed to a very slight tweak by adding more and more polygons every year because now the hardware can handle it. And you see who's opposing this actual innovation? Don't complain about a lack of innovation when you're the one opposing it.
Quote
My final comment: Crysis is amazing. Why? Not because it has good graphics, but because it can do mass-quality physics and amazing particles in real time, efficiently, and at the same time boast immense lighting/shading that doesn't bog your computer much.
... and if developers don't have to worry so much about graphics, then they can freely focus on making things like that better, instead of releasing the same damn football game (https://secure.wikimedia.org/wikipedia/en/wiki/Madden_football) every year with a very slight graphics makeover. They can explore new gameplay mechanics and ideas that haven't been done before.
I honestly don't see why you're opposing this, unless it's just because you won't get to brag and be smug about the gaming rig your parents bought you. :|
AMD & nVidia will buy out this project and shut it down.
ATI & nVidia want $$ more than fun when it comes to gaming :P
I've checked with my local tech store, and this is laptop-only, maybe for poor-performance desktops too, but this will never outman a real full-size GPU/CPU.
That's wrong; it is a chipset, very small, that combines the CPU & GPU + northbridge. My local tech store orders those small things :P
Intel is far better than ATI and NVIDIA! They sell a lot more graphics chips than those two.

Intel is known for bad graphics. Both AMD and NVidia surpass it in the graphics department.
10 billion flies can't be wrong, eat shit!

Double-u T Effffh, titi.
Intel is far better than ATI and NVIDIA! They sell a lot more graphic chips than those two.
10 billion flies can't be wrong, eat shit!
(http://t1.gstatic.com/images?q=tbn:ANd9GcQIXaSMFacAE6d1ywME3EW415WyGQ3jvLNubobolygnKkRFgAJU8A)
Is this thing broken?
Hard to determine what is trolling in titi's post when he has facts but uses a language that i don't think he uses frequently.

That's actually incorrect. Unless we're all speaking Deutsch @ the IRC channels... :D
Unlimited detail, using point cloud data = no AA at all. :thumbdown:
Vids?

http://www.atomontage.com/?id=dev_blog
I'd like to see high res texes and shading. :P