Wouldn't this defeat the purpose of defining with 1 and 0?
Hmm, I hadn't thought of that. So defining them as ones or zeros is just so you have them all in the build script/project, huh? That's indeed a thought. (Was that your point, silnarm?) But it also makes it easier to mix the pre-processor flags with regular C++ code, like this:
if (DEBUG_WORLD) {
    // do some stuff
}
And when the build is optimized (even minimally), the block is removed as dead code. You can also set debug levels:
if (DEBUG_WORLD > 2) {
    // do even more stuff
}
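Here's a minimal self-contained sketch of that pattern for anyone following along (not actual GAE code; updateWorld() and the printf messages are made up), assuming the macro defaults to 0 as in the header further down:

#ifndef DEBUG_WORLD
# define DEBUG_WORLD 0
#endif

#include <cstdio>

void updateWorld() {
    if (DEBUG_WORLD) {
        // compiled away as dead code when DEBUG_WORLD is 0 and optimization is on
        std::printf("world update\n");
    }
    if (DEBUG_WORLD > 2) {
        // only reached at the higher debug levels
        std::printf("verbose world state dump\n");
    }
}

Build with -DDEBUG_WORLD=3 (or set it in the project file) to turn the extra output on without touching the source.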
And yeah, sub-options are a good idea too. I had to rename the header gae_features.h (which I wasn't too happy about) because it was conflicting with gcc libstdc++'s "features.h" for some stupid reason.
So here's what mine looks like thus far (and I'm still working out all of my compile errors after this last singleton massacre):
#ifndef USE_PTHREAD
# define USE_PTHREAD 0
#endif
#ifndef USE_SDL
# define USE_SDL 0
#endif
#ifndef USE_OPENAL
# define USE_OPENAL 0
#endif
#ifndef USE_DS8
# define USE_DS8 0
#endif
#ifndef USE_POSIX_SOCKETS
# define USE_POSIX_SOCKETS 0
#endif
#ifndef SL_LEAK_DUMP
# define SL_LEAK_DUMP 0
#endif
#ifndef USE_SSE_INTRINSICS
# define USE_SSE_INTRINSICS 0
#endif
#ifndef ALIGN_12BYTE_VECTORS
# define ALIGN_12BYTE_VECTORS USE_SSE_INTRINSICS
#endif
#ifndef ALIGN_16BYTE_VECTORS
# define ALIGN_16BYTE_VECTORS USE_SSE_INTRINSICS
#endif
#ifndef DEBUG_NETWORK
# define DEBUG_NETWORK 0
#endif
#ifndef DEBUG_NETWORK_DELAY
# define DEBUG_NETWORK_DELAY 0
#endif
#ifndef DEBUG_NETWORK_DELAY_VAR
# define DEBUG_NETWORK_DELAY_VAR 0
#endif
#ifndef DEBUG_WORLD
# define DEBUG_WORLD 0
#endif
#ifndef DEBUG_PATHFINDER
# define DEBUG_PATHFINDER 0
#endif
#ifndef DEBUG_TEXTURES
# define DEBUG_TEXTURES 0
#endif
#ifndef USE_xxxx
# define USE_xxxx 0
#endif
// Sanity checks
#if defined(DEBUG) && defined(NDEBUG)
# error "Don't set both DEBUG and NDEBUG in the same build please."
#endif
#if DEBUG_NETWORK_DELAY && !DEBUG_NETWORK
# error DEBUG_NETWORK_DELAY requires DEBUG_NETWORK to be set
#endif
#if USE_SSE_INTRINSICS && !(ALIGN_12BYTE_VECTORS && ALIGN_16BYTE_VECTORS)
# error if USE_SSE_INTRINSICS is set, then ALIGN_12BYTE_VECTORS and ALIGN_16BYTE_VECTORS must both be set.
#endif
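To show how the defaults get consumed, here's a hedged sketch of code keying off ALIGN_16BYTE_VECTORS (the Vec4 struct and the GAE_VEC4_ALIGN macro are hypothetical, not from the GAE sources; the alignment attribute shown is the gcc spelling, MSVC would use __declspec(align(16))):

#include "gae_features.h"

#if ALIGN_16BYTE_VECTORS
# define GAE_VEC4_ALIGN __attribute__((aligned(16)))
#else
# define GAE_VEC4_ALIGN
#endif

// 16-byte aligned only when the build (or the USE_SSE_INTRINSICS default) asks for it
struct Vec4 {
    float x, y, z, w;
} GAE_VEC4_ALIGN;

The nice part is that the consuming code doesn't care whether the flag came from the compiler command line or from the defaults above.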
Finally, it may already be possible to set USE_SDL and compile on Windows, though I'm not certain. As I've been going back through a lot of this code, I've seen that I took great care (in the past) to prefer the defined(USE_SDL) branch over defined(WIN32) || defined(WIN64), except in cases where the SDL facilities were weaker (threads & timers). Also, I would really like to get it working as a 64-bit Windows executable for Vista & Win7; I have some win64-only code in shared_lib/*/platform/thread.h (the Shared::Platform::Condition class) that I'd want to validate as working.
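To illustrate that selection order only (the millisleep() wrapper and the choice of facility are hypothetical, not from the GAE sources), the pattern is basically:

#include "gae_features.h"

#if USE_SDL
# include <SDL.h>
  inline void millisleep(unsigned int ms) { SDL_Delay(ms); }
#elif defined(WIN32) || defined(WIN64)
# include <windows.h>
  inline void millisleep(unsigned int ms) { Sleep(ms); }
#else
# error "millisleep(): no implementation for this platform"
#endif

Since USE_SDL is tested first, setting it on a Windows build would pull in the SDL path instead of the native one, which is exactly why it might already compile there.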
Lastly, I would also like to get pthread support working. Honestly, I don't think I've even tried to do a test compile yet.
While I suppose it's not terribly important, it would give us some useful capabilities:
- Being able to select on multiple objects, including threads as well as sockets. This isn't a huge deal, but it would make my network thread's main loop slightly cleaner.
- Altering thread priority (a hedged sketch follows below). Whenever we get around to implementing multi-threaded support for core stuff like world updates, rendering, etc., this would be helpful.
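On the thread priority point, a rough pthread sketch (raiseThreadPriority() is a hypothetical helper, and on Linux's default SCHED_OTHER policy the priority range may be degenerate, so don't take this as battle-tested):

#include <pthread.h>
#include <sched.h>

// Try to bump a thread to the maximum priority allowed by its current policy.
bool raiseThreadPriority(pthread_t thread) {
    sched_param param;
    int policy = 0;
    if (pthread_getschedparam(thread, &policy, &param) != 0) {
        return false;
    }
    int maxPriority = sched_get_priority_max(policy);
    if (maxPriority == -1) {
        return false;
    }
    param.sched_priority = maxPriority;
    return pthread_setschedparam(thread, policy, &param) == 0;
}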