Author Topic: How Does Glest's client keep frameCount in sync with server?  (Read 1713 times)

loveheaven

  • Guest
Hi Guys,
     I've been looking through Glest's source code these days; I'm new to Glest. I see that Glest's client checks the frameCount variable sent by the server to verify that it is in sync. But how does the client keep frameCount in sync with the server? The server and the client each increment frameCount in their own loop, and if the client loops once, the server may loop twice. So it seems to me that frameCount is hard to keep in sync. Am I right?

Below is the related code.
void World::update() {
    ++frameCount;
    // ...
}

void ClientInterface::updateKeyframe(int frameCount) {
    // ...
    // check that we are in the right frame
    if (networkMessageCommandList.getFrameCount() != frameCount) {
        throw runtime_error("Network synchronization error, frame counts do not match");
    }
    // ...
}
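
To illustrate my worry with a toy example (this is just the pattern I think I'm seeing, not Glest code):

#include <iostream>

// Toy simulation: server and client each increment their own frameCount
// in independent loops, so the counts drift whenever one loop runs
// faster than the other.
int main() {
    int serverFrame = 0;
    int clientFrame = 0;
    for (int tick = 0; tick < 10; ++tick) {
        serverFrame += 2;  // server happens to loop twice per tick...
        clientFrame += 1;  // ...while the slower client loops once
    }
    std::cout << "server: " << serverFrame
              << ", client: " << clientFrame << std::endl;
    // updateKeyframe() would throw here: the command list carries the
    // server's count (20), but the client's local frameCount is 10
    return 0;
}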

daniel.santos

  • Guest
Re: How Does Glest's client keep frameCount in sync with server?
« Reply #1 on: 12 January 2009, 20:17:16 »
But, how does glest's client keep framecount in sync with the server?
It doesn't.  When I make a debug build, world updates can occur very slowly, at fewer than 40 per second, because the build isn't optimized and I have tons of assertions that slow it further.  This can cause the client to be on a drastically different world frame than the server (an earlier frame, but never a later one).  However, you are talking about Glest, and this forum is for GAE.  GAE uses a completely different networking implementation, in fact one that is in a radical state of flux at the moment and has never (yet) reached a point of stability and functionality.  I'm a wee bit rusty on some of the details of the original Glest networking implementation, but that is not a condition it attempts to address.  Usually, if both machines are fast enough, this condition won't happen or won't be too bad.

In GAE (for 0.2.12, not out yet) my plan is to modify this so that clients wait x amount of time for "key frame" updates, and servers wait another value.  For clients, I'm thinking they will allow themselves to get 25-50 milliseconds (one or two frames at 40 fps) ahead of the server before being forced to effectively pause their game and wait for the server to give them the go-ahead.  For the inverse situation, I'm thinking perhaps 1.5 seconds, or a dynamic amount depending upon ping times -- so that the server will actually pause the game and tell all other clients to pause until the slowest client catches up.  A rough sketch of the client side is below.
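
Roughly like this on the client side (illustrative only -- the names here, like ClientFrameThrottle and MAX_LEAD_FRAMES, are made up for this post and are not actual GAE code):

// How far the client may run ahead of the last key frame the server
// acknowledged: one or two frames at 40 fps is roughly 25-50 ms.
class ClientFrameThrottle {
public:
    static const int MAX_LEAD_FRAMES = 2;

    ClientFrameThrottle() : localFrame(0), serverFrame(0) {}

    // the network layer calls this when a key frame arrives
    void onKeyFrame(int frame) { serverFrame = frame; }

    // the world update loop asks this before advancing; a 'false'
    // answer means the client pauses its game until the server's
    // next key frame gives it the go-ahead
    bool mayAdvanceFrame() const {
        return localFrame - serverFrame < MAX_LEAD_FRAMES;
    }

    void frameAdvanced() { ++localFrame; }

private:
    int localFrame;   // this client's world frame count
    int serverFrame;  // last frame count seen from the server
};

The server side would be the mirror image with a much larger allowance (the 1.5 seconds, or a ping-derived value), except that instead of pausing only itself it broadcasts a pause to every other client.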

I don't like either of those solutions and would prefer a more elegant approach, but I think this change will be enough for now.  I have tentative plans to alter the way "world updates" happen in the game, making things relative to the time the world update started rather than to the actual frame count -- thus eliminating these frame count sync methods altogether and going strictly by server time (in the GAE 0.2.x branch, see the Game::Net::NetworkMessagePing class -- it has a mechanism to share the local times of each machine and calculate the latency both ways, instead of just the round-trip latency).  None of this is in yet; I don't intend to implement it in 0.2.12, and probably not in the 0.2 branch at all.
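
To give a flavor of what sharing local times buys you (again a made-up sketch, not the actual NetworkMessagePing API): the round trip can be computed exactly from same-clock differences, while the two individual legs each carry the unknown clock offset between the machines.

#include <cstdint>
#include <iostream>

struct PingTimes {
    int64_t clientSend;    // t0: client clock when the ping left
    int64_t serverReceive; // t1: server clock when the ping arrived
    int64_t serverSend;    // t2: server clock when the reply left
    int64_t clientReceive; // t3: client clock when the reply arrived
};

// exact: uses only differences taken on the same clock
int64_t roundTripMs(const PingTimes &p) {
    return (p.clientReceive - p.clientSend) - (p.serverSend - p.serverReceive);
}

// each leg mixes the two clocks, so the result is latency plus (or
// minus) the unknown offset between the machines' clocks
int64_t clientToServerMs(const PingTimes &p) { return p.serverReceive - p.clientSend; }
int64_t serverToClientMs(const PingTimes &p) { return p.clientReceive - p.serverSend; }

int main() {
    PingTimes p = {1000, 1530, 1535, 1080}; // server clock ~500 ms ahead
    std::cout << "round trip: " << roundTripMs(p) << " ms\n"        // 75
              << "c->s (incl. offset): " << clientToServerMs(p)     // 530
              << " ms\n"
              << "s->c (incl. offset): " << serverToClientMs(p)     // -455
              << " ms\n";
    return 0;
}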

So if you have an interest, check out the 0.2.x branch of GAE, but keep in mind that I just made radical alterations to the networking that are nowhere near complete, and it is not even functional yet (for network play).  There is a thread, "chop chop chop (network code)", where we've been discussing this.  Ideally, it would be better to start looking at it *after* I'm done with 0.2.12 :)

loveheaven

  • Guest
Re: How Does Glest's client keep frameCount in sync with server?
« Reply #2 on: 13 January 2009, 05:10:00 »
Thank you. You mean Glest's client cannot keep in sync with the server? I had been thinking Glest was a perfect game...