Author Topic: [Solved] OpenGL error 1281: invalid value  (Read 8549 times)

daniel.santos

  • Guest
[Solved] OpenGL error 1281: invalid value
« on: 21 February 2008, 03:17:47 »
When building on Windows with Visual C++ 2008 Express Edition, I get the error "OpenGL error: invalid value at file: c:\projects\glest\glest\source\shared_lib\sources\graphics\gl\particle_renderer_gl.cpp, line 133" the first time an AttackParticleSystem is created.  To reproduce it, I start Glest on the magic faction and nuke my initiate with a battle mage.  As soon as the fireball would otherwise appear out of the battle mage's hands, it crashes there.  If I run a debug build and set a breakpoint there, it's actually the 2nd time execution reaches ParticleRendererGl::renderSystem() that the call to glGetError() returns 1281.  I have a weak Mobile Intel 945GM Express video chipset, but I can play the pre-compiled 3.1.0 without this problem.  Searching the net shows that low-end video cards are often the cause of this error in some other games too.  Perhaps Martino is building and/or linking against different OpenGL headers/libs that don't trigger this problem?

Here is what I'm using:
  • Windows XP SP2
  • Visual C++ 2008 Express Edition
  • Ogg/Vorbis win32 sdk v 1.0.1
  • xerces-c 2.8.0
  • SGI's GLsdk with modified glprocs.c & glprocs.h (changing the function declarations to use "const char*", as described in another thread)
  • DirectX SDK Nov2007
  • not using w32api; I left the default include and lib paths, which point at the "Windows SDK v6.0a" that comes with VC++ Express.  BTW, this SDK doesn't have glaux.lib, but after removing it from the linker config I didn't get any unresolved symbols.
  • I am currently producing this error with r125 from svn, but I've reproduced it with a few earlier revisions as well.
Any ideas on what's wrong here?  Would I have better luck using w32api instead of the default Windows SDK v6.0a?  Or a different GL extension-loading library like GLEW?  I don't know the OpenGL or DirectX libraries very well at all.  All help appreciated.
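For anyone decoding this: 1281 decimal is 0x0501, which is GL_INVALID_VALUE. Also note that glGetError() reports one error flag per call and errors accumulate between checks, so the error reported at line 133 may have been raised by an earlier GL call. Below is a hedged sketch of that lookup and "drain the queue" pattern; fakeGlGetError, glErrorName, and drainGlErrors are made-up names, and the stub queue stands in for a real GL context so the snippet can run on its own:

```cpp
#include <deque>
#include <string>

// Stand-in for the real glGetError() from <GL/gl.h>: a queue of pending
// error codes, so this sketch runs without an OpenGL context.
static std::deque<unsigned> g_pendingErrors;
unsigned fakeGlGetError() {
    if (g_pendingErrors.empty()) return 0; // GL_NO_ERROR
    unsigned e = g_pendingErrors.front();
    g_pendingErrors.pop_front();
    return e;
}

// Map the numeric codes glGetError() returns to readable names.
// 1281 decimal == 0x0501 == GL_INVALID_VALUE.
std::string glErrorName(unsigned err) {
    switch (err) {
        case 0x0500: return "GL_INVALID_ENUM";
        case 0x0501: return "GL_INVALID_VALUE";
        case 0x0502: return "GL_INVALID_OPERATION";
        case 0x0505: return "GL_OUT_OF_MEMORY";
        default:     return "unknown";
    }
}

// Drain every pending error: a single glGetError() call only pops one
// flag, so a lone check can attribute an old error to the wrong call site.
std::string drainGlErrors() {
    std::string all;
    for (unsigned e; (e = fakeGlGetError()) != 0; )
        all += glErrorName(e) + " ";
    return all;
}
```

Calling drainGlErrors() right before the suspect GL call, then again right after it, narrows down which call actually raised the flag.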

If it helps at all, this is the entire stack trace:
  • >game.exe!Shared::Graphics::Gl::_assertGl(const char * file=0x00796600, int line=133)  Line 53 + 0x1fc bytes   C++
  • game.exe!Shared::Graphics::Gl::ParticleRendererGl::renderSystem(Shared::Graphics::ParticleSystem * ps=0x23c38690)  Line 133 + 0x13 bytes   C++
  • game.exe!Shared::Graphics::AttackParticleSystem::render(Shared::Graphics::ParticleRenderer * pr=0x08df8d48, Shared::Graphics::ModelRenderer * mr=0x0038f548)  Line 389 + 0x13 bytes   C++
  • game.exe!Shared::Graphics::ParticleManager::render(Shared::Graphics::ParticleRenderer * pr=0x08df8d48, Shared::Graphics::ModelRenderer * mr=0x0038f548)  Line 650 + 0x2d bytes   C++
  • game.exe!Shared::Graphics::Gl::ParticleRendererGl::renderManager(Shared::Graphics::ParticleManager * pm=0x003872b8, Shared::Graphics::ModelRenderer * mr=0x0038f548)  Line 65   C++
  • game.exe!Glest::Game::Renderer::renderParticleManager(Glest::Game::ResourceScope rs=rsGame)  Line 379 + 0x2a bytes   C++
  • game.exe!Glest::Game::Game::render3d()  Line 588   C++
  • game.exe!Glest::Game::Game::render()  Line 245   C++
  • game.exe!Glest::Game::Program::loop()  Line 123 + 0x15 bytes   C++
  • game.exe!Glest::Game::glestMain(int argc=1, char * * argv=0x00386b00)  Line 157   C++
  • game.exe!main(int argc=1, char * * argv=0x00386b00)  Line 172 + 0xd bytes   C++
  • game.exe!__tmainCRTStartup()  Line 582 + 0x19 bytes   C
  • game.exe!mainCRTStartup()  Line 399   C
  • kernel32.dll!7c816fd7()    
  • game.exe!_memchr()  + 0x210c6 bytes   C++
  • f00dbaad()
« Last Edit: 21 February 2008, 05:07:58 by daniel.santos »

daniel.santos

  • Guest
(No subject)
« Reply #1 on: 21 February 2008, 03:51:55 »
OK, this only happens in the debug build.  The underlying error is probably occurring in the pre-compiled version as well, since my video chip apparently doesn't support one of the calls in ParticleRendererGl::renderSystem(), but the check is ignored in the release build.
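That "ignored in release" behavior is consistent with the usual debug-only GL assertion pattern, where the glGetError() check compiles away when NDEBUG is defined. I haven't checked how Glest's actual _assertGl() is written, so the following is only a sketch of that pattern; assertGlSketch, stubGlGetError, and checkedGlErrors are hypothetical names, and the stub replaces a real GL context so the snippet is self-contained:

```cpp
#include <cstdio>
#include <cstdlib>

// Stand-in for glGetError(); a real build would include <GL/gl.h> and
// call the driver instead of reading this variable.
static unsigned g_nextGlError = 0;
unsigned stubGlGetError() {
    unsigned e = g_nextGlError;
    g_nextGlError = 0;
    return e;
}

int checkedGlErrors = 0; // counts how many checks actually executed

// Debug-only GL check in the spirit of Glest's _assertGl(): in release
// builds (NDEBUG defined) the whole body compiles away, so the same
// driver error that aborts a debug build passes silently in release.
void assertGlSketch(const char *file, int line) {
#ifndef NDEBUG
    ++checkedGlErrors;
    unsigned err = stubGlGetError();
    if (err != 0) {
        std::fprintf(stderr, "OpenGL error %u at %s:%d\n", err, file, line);
        std::abort();
    }
#else
    (void)file; (void)line; // silence unused-parameter warnings in release
#endif
}
```

In other words, the release binary isn't avoiding the bad GL call; it just never asks the driver whether the call failed.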