Hence this discussion: if someone can tell me where I'm wrong, I'd be grateful, because I sure can't find a flaw in this logic myself:
Martin's reply to my question said:
> No, time() is definitely the absolute number of seconds since epoch, which
> is January 1 1970 00:00:00 GMT.
...and so we can use the return value of time() to timestamp an event in GMT. Here's why I don't think this is enough.
If a server's timezone is GMT+0000, then one day after the epoch its time() function
would return 86400. However, at that exact same moment, a server with TZ GMT+0200
would return 86400 + 7200 from time(). This is of course obvious: the GMT+02
server's clock wouldn't be one day (24 hours) past the epoch, but 26 hours.
So how can the number of seconds since epoch be "absolute" on any given machine?
On my home XP system I did a few tests (changing the timezone from the Control Panel), and time() returned different values when I did so. This HAS to indicate it's timezone-dependent, at least under Windows; and why should Linux be any different?
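For anyone who wants to repeat that experiment on a Linux box, here is a sketch of it in Python (used only because its time.time() follows the same seconds-since-epoch convention as PHP's time(); the TZ environment variable plus tzset() make the timezone switch scriptable):

```python
import os
import time

# Take a reading with the process timezone set to UTC.
os.environ["TZ"] = "UTC"
time.tzset()  # apply TZ to this process (Unix-only call)
t_utc = time.time()

# Switch to GMT+0200 (note the inverted POSIX sign convention:
# the zone name "Etc/GMT-2" means two hours EAST of Greenwich).
os.environ["TZ"] = "Etc/GMT-2"
time.tzset()
t_plus2 = time.time()

# If time() were timezone-dependent, these two readings would
# differ by roughly 7200 seconds, not just by the few milliseconds
# it takes to execute the lines in between.
print(abs(t_plus2 - t_utc))
```

Comparing that printed difference against the ~7200 seconds the XP test suggests would settle whether the behaviour is Windows-specific.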
From the PHP docs on date():
date('O') returns "Difference to Greenwich time (GMT) in hours" in a string that looks like "+0200".
Wouldn't the correct solution be this?
$localtime = time();
$diff_from_gmt = intval(date('O', $localtime)); // May be negative
// Parse the number we have... remember that 2.5 hours after GMT
// will not be returned as +0250 but as +0230, so we cannot use
// 3600 * ($diff_from_gmt / 100) to offset the time
$diff_hours = intval($diff_from_gmt / 100);
$diff_mins = $diff_from_gmt % 100;
$gmt_time = $localtime - ($diff_hours * 60 + $diff_mins) * 60;
I've been a bit liberal with the variables for the sake of clarity.
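The offset-parsing step can be sanity-checked in isolation. Here is the same HHMM arithmetic as a Python sketch (Python only because it shares the epoch-seconds convention; gmt_offset_seconds is a made-up helper name), handling the '+0230' case from the comment above:

```python
def gmt_offset_seconds(o: str) -> int:
    """Convert a date('O')-style string such as '+0230' or '-0500'
    into an offset in seconds, honouring the HHMM layout: the last
    two digits are minutes, not a decimal fraction of an hour."""
    sign = -1 if o[0] == "-" else 1
    hours = int(o[1:3])
    mins = int(o[3:5])
    return sign * (hours * 3600 + mins * 60)

# '+0230' is 2.5 hours = 9000 seconds, not 2.3 hours:
print(gmt_offset_seconds("+0230"))  # 9000
print(gmt_offset_seconds("-0500"))  # -18000
```

Subtracting that many seconds from the local reading is exactly what the PHP lines above do with $diff_hours and $diff_mins.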
Where is the flaw in my reasoning?