Hi guys,
In my game I call os.time() when the app starts and compare it to an os.time() stamp saved when the app was last exited.
With that time difference I can simulate that the game keeps running while it's actually closed (it's a farm game where you don't do much besides upgrading stuff and earning money).
Because my automatons, which generate in-game money, have a built-in time to live, I have to calculate an effective time spent: basically, the timeDifference on app init can't be longer than the automaton's remaining time to live at the moment the app was exited.
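Roughly, the capping logic looks like the sketch below (the names exitTime and timeToLive are just placeholders, not my actual field names):

```lua
-- Minimal sketch of the capping idea; exitTime and timeToLive are
-- hypothetical names, not taken from the real game code.
local function effectiveOfflineTime(exitTime, timeToLive)
    local timeDifference = os.time() - exitTime   -- seconds the app was closed
    -- Credit at most the automaton's remaining time to live.
    return math.min(timeDifference, timeToLive)
end

-- Simulated example: the app was exited 10 minutes (600 s) ago,
-- but the automaton only had 60 s of time to live left.
local exitTime = os.time() - 600
print(effectiveOfflineTime(exitTime, 60))   --> 60
```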
I figured out how to implement this and tested the behavior successfully. For example, I refreshed an automaton with a base time to live of 1 minute, exited the app, and came back 10 minutes later. It produced money for only 1 minute, as expected.
But if I do the same thing and come back after several hours, it generates huge amounts of money, as if it had been running the whole time.
I don't understand this... os.time() returns a Unix timestamp, right? So it shouldn't behave strangely even if days pass. Does anyone have an idea what I'm missing?
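Just to double-check my assumption, this is how I understand os.time() to behave (the fixed date is only an arbitrary example):

```lua
-- os.time() with no arguments returns the current time as seconds since the epoch,
-- so the difference of two stamps is a plain number of seconds no matter how long passed.
local earlier = os.time{year = 2024, month = 1, day = 1, hour = 0}  -- stamp for a fixed date
local now     = os.time()
print(now - earlier)   -- elapsed seconds; grows linearly, no wrap-around over days
```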
Comments
I can see something like this:
But thanks for the tip, tkhnoman, I didn't know about os.date() until now.