On Jul 16, 2004, at 4:54 AM, Rob Alexander wrote:
> I've written a simulation which, when visualised, should run
> synchronised with wall-clock time. At the moment, I'm using a 1:1 ratio
> between sim and real time.
> Under Windows this works, the frame rate is sustained at very nearly 35
> per second. On a comparably-specced Linux machine, however, the frame
> rate tops out at about 25.8, some 40% slower. If I set the sleep time
> to zero, the Linux box sustains a frame rate in excess of 200.
The console checks for play sleeping with a test that begins:

    if (sleep > 0 && ...
I can see two sources of possible weirdness here. First and most
likely, although it's not mentioned in the documentation, Java's
Thread.sleep(...) method doesn't sleep for exactly the requested number
of milliseconds; the request is usually a lower bound on the actual
sleep length. There's an alternative Thread.sleep(millis, nanos) which
might be more accurate on Linux -- I dunno -- though the default
implementation in the underlying Java source just calls
Thread.sleep(millis), essentially throwing away the nanos.
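One way to see whether this is the problem is to measure the actual
sleep length directly on both machines. Here's a minimal sketch (the
class name and the sample durations are mine, not anything in MASON);
the measured time is typically at or above the requested time, rounded
up to the OS timer granularity, which was historically around 10 ms on
stock Linux kernels:

```java
public class SleepAccuracy {
    // Measures one Thread.sleep(millis) call; returns elapsed wall time in ms.
    static long measureSleepMillis(long millis) {
        long start = System.currentTimeMillis();
        try { Thread.sleep(millis); }
        catch (InterruptedException e) { /* fall through and report what we got */ }
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        for (long request : new long[] { 1, 5, 10, 28, 50 }) {
            long actual = measureSleepMillis(request);
            System.out.println("requested " + request + " ms, slept " + actual + " ms");
        }
    }
}
```

If the small requests (1-5 ms) consistently come back as ~10 ms on the
Linux box but not on Windows, that alone would explain a frame rate
capped well below the target.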
Another strong possibility is that Linux is exceptionally slow in
evaluating Thread.currentThread() or thread.isInterrupted(). Those
would only get called if the sleep time were nonzero.
getThreadShouldStop() should not be an issue -- it'd only be called if
the thread had been interrupted, which is rare.
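If you want to rule out the interrupt check, timing a tight loop of
them only takes a minute. A hypothetical microbenchmark (class and
method names are mine):

```java
public class InterruptCheckCost {
    // Times `iterations` calls to Thread.currentThread().isInterrupted()
    // and returns the elapsed wall time in milliseconds.
    static long timeChecks(int iterations) {
        boolean sink = false;  // consume results so the loop isn't optimized away
        long start = System.currentTimeMillis();
        for (int i = 0; i < iterations; i++)
            sink ^= Thread.currentThread().isInterrupted();
        long elapsed = System.currentTimeMillis() - start;
        if (sink) System.out.println("(thread was interrupted)");
        return elapsed;
    }

    public static void main(String[] args) {
        System.out.println(timeChecks(1000000) + " ms for one million checks");
    }
}
```

If a million checks take only a few milliseconds -- as they should on
any reasonable VM -- one check per frame can't be your 40%.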
Last, could it be a drawing issue? What happens when you close the
visualization window?
I think the next step in your plan of attack should be to test the
model without visualization. In your main loop, before you call
schedule.step(), experiment with (1) a call to
Thread.sleep(MILLISECONDS_PER_TICK), and (2) a bogus call to
Thread.currentThread().isInterrupted(), to see if either is causing
the slowdown.
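Here's a rough sketch of what that headless test could look like.
MILLISECONDS_PER_TICK is a stand-in for whatever your 1:1 ratio works
out to, and the commented-out schedule.step() marks where your model
step would go:

```java
public class HeadlessTimingTest {
    static final long MILLISECONDS_PER_TICK = 28;  // ~35 ticks/sec; adjust to taste

    // Runs `steps` iterations of sleep + interrupt check; returns elapsed ms.
    static long runLoop(int steps, long tickMillis) {
        long start = System.currentTimeMillis();
        for (int i = 0; i < steps; i++) {
            // (1) the sleep under suspicion
            try { Thread.sleep(tickMillis); }
            catch (InterruptedException e) { break; }
            // (2) the bogus interrupt check under suspicion
            Thread.currentThread().isInterrupted();
            // schedule.step(state);  // your model's step would go here
        }
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        int steps = 100;
        long elapsed = runLoop(steps, MILLISECONDS_PER_TICK);
        System.out.println("effective rate: " + (steps * 1000.0 / elapsed)
                           + " ticks/sec (target " + (1000.0 / MILLISECONDS_PER_TICK) + ")");
    }
}
```

Run it on both boxes: if the Linux machine already falls well short of
the target rate with no drawing at all, the sleep (or the check) is
the culprit rather than the display code.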