c# - How to avoid loss of precision in .NET for converting from Unix Timestamp to DateTime and back?


Consider the following snippet:

    var original = new DateTime(635338107839470268);
    var unixTimestamp = (original - new DateTime(1970, 1, 1)).TotalSeconds;
    // unixTimestamp == 1398213983.9470267

    var back = new DateTime(1970, 1, 1).AddSeconds(1398213983.9470267);
    // back.Ticks == 635338107839470000

As you can see, the Ticks value we get back is different from the one we started with.

How can I avoid this loss of precision in C# when converting a date to a Unix timestamp and back?

http://msdn.microsoft.com/en-us/library/system.datetime.addseconds.aspx

Per the documentation, DateTime.AddSeconds() rounds the value to the nearest millisecond (10,000 ticks), so anything below a millisecond is discarded.
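To make that rounding visible in isolation, here is a minimal sketch (the class and variable names are mine, not from the original post):

    using System;

    class AddSecondsRoundingDemo
    {
        static void Main()
        {
            DateTime epoch = new DateTime(1970, 1, 1);

            // 0.9470268 seconds carries sub-millisecond detail (9,470,268 ticks)
            long ticks = epoch.AddSeconds(0.9470268).Ticks - epoch.Ticks;

            Console.WriteLine(ticks); // 9470000 -- rounded to the nearest millisecond
        }
    }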

The fix is to use ticks instead:

    // You have a DateTime in memory
    DateTime original = new DateTime(635338107839470268);

    // Convert it to a Unix timestamp
    double unixTimestamp = (original - new DateTime(1970, 1, 1)).TotalSeconds;

    // ... the unixTimestamp is saved somewhere ...

    // The user needs a 100% precise DateTime back from the Unix timestamp
    DateTime epochInstance = new DateTime(1970, 1, 1);
    DateTime back = epochInstance.AddTicks((long)(unixTimestamp * TimeSpan.TicksPerSecond));
    // back.Ticks == 635338107839470268
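If you control the storage format, an alternative worth considering (not part of the original answer) is to persist the tick offset itself as a long rather than a double number of seconds; a double only carries about 15-17 significant digits, so for timestamps far from the epoch the seconds value may not round-trip at all. A minimal sketch, assuming a 64-bit integer can be stored:

    using System;

    class TickRoundTrip
    {
        static readonly DateTime Epoch = new DateTime(1970, 1, 1);

        // Store the exact number of ticks since the Unix epoch
        static long ToUnixTicks(DateTime value) => value.Ticks - Epoch.Ticks;

        // Reconstruct the DateTime with no rounding at all
        static DateTime FromUnixTicks(long unixTicks) => Epoch.AddTicks(unixTicks);

        static void Main()
        {
            DateTime original = new DateTime(635338107839470268);
            long stored = ToUnixTicks(original);
            DateTime back = FromUnixTicks(stored);

            Console.WriteLine(back.Ticks == original.Ticks); // True
        }
    }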

