It is difficult to understand CAN logs without milliseconds. Is there any function that prints milliseconds in the Write window? I have already tried the getLocalTimeString() function, but it prints the time only down to seconds.
Try the function timeNowNS(), which returns the simulation time in nanoseconds as a float. Alternatively, use timeNowInt64(). Multiply the returned value by the appropriate factor to get seconds or milliseconds as you see fit.
Note that timeNowNS() returns simulation time in nanoseconds, whereas getLocalTime()/getLocalTimeString() return system time. Appending one to the other makes no sense, as the result would not be accurate.
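If what you need is the simulation time in milliseconds in front of your Write window output, a minimal CAPL sketch could look like this (the message ID 0x100 is just an example):

on message 0x100
{
  float tMs;
  tMs = timeNowNS() / 1000000.0;  // nanoseconds -> milliseconds
  write("[%.3f ms] 0x%x received", tMs, this.id);
}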
I'm running a Lua script that executes every minute; this is controlled by the software, and I can't control the timing of the execution. I would like to check the time every day and trigger a function at a specific time, but I would also like to execute another script 5 minutes before that happens.
Right now I'm doing a string comparison by parsing os.date() into a string. It works, but the code isn't pretty, and if the time changes, I have to manually change it in two different variables. I'm pretty new to Lua, so I've been having difficulty figuring out the best way to do this.
So the question is: how do I set a time variable and compare it to os.date (or os.time)?
There's no fancy way to do timers in pure Lua. What you can do to avoid os.date() string comparisons is to generate timestamps from preset dates; os.time accepts a key-value table describing a date:
timestamp = os.time({year=2019, month=1, day=n})
By iteratively increasing the n variable, you get a timestamp for the n-th day counted from January 1st, 2019 (os.time normalizes overflowing day values). E.g.
os.date("%Y-%m-%d %H-%M-%S",os.time({year=2019,month=1,day=900})
--> 2021-06-18 12-00-00
If you can't save the day variable to keep track of (between application restarts), get the current "today" day and iterate from there:
os.date("%Y-%m-%d %H-%M-%S",
os.time({year=os.date("%Y"),month=os.date("%m"),day=os.date("%d")+n}
)
Using os.date with a custom format together with os.time makes your code independent of the currently set date locale.
After you have determined the timestamp of the first task, offset the second (actual) task by five minutes: secondTaskTimestamp = firstTaskTimestamp + 5*60 (or use os.time again). Your timer check then only needs to compare timestamps against os.time().
Now when you have to change the pre-configured time, you only have to change the time of the first task, and the second task will be offset automatically.
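A minimal sketch of the whole pattern, assuming the host software calls checkTimers() once a minute; the 13:55 trigger time and the two task functions are placeholders:

local function todayAt(hour, min)
  local t = os.date("*t")  -- current date/time as a table
  return os.time({year=t.year, month=t.month, day=t.day, hour=hour, min=min, sec=0})
end

local function runPreTask()  print("pre task")  end  -- placeholder
local function runMainTask() print("main task") end  -- placeholder

local preTime  = todayAt(13, 55)   -- first task: the script to run 5 minutes early
local mainTime = preTime + 5 * 60  -- the actual task, offset automatically

function checkTimers()
  local now = os.time()
  if now >= preTime  and now < preTime  + 60 then runPreTask()  end
  if now >= mainTime and now < mainTime + 60 then runMainTask() end
end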
Related: How do I execute a global function on a specific day at a specific time in Lua?
On iOS there is the CACurrentMediaTime function, which provides a timestamp with nanosecond precision.
This function is used by React Native's global.nativePerformanceNow function.
What I don't understand is what the returned value represents, i.e. from what point in time the timestamp is measured.
global.nativePerformanceNow() seems to call the native functions CACurrentMediaTime() on iOS, and clock_gettime() on Android.
It seems like global.nativePerformanceNow() returns a floating-point number representing the number of milliseconds since device boot on both platforms, but this isn't documented as far as I can find.
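To illustrate the relationship (a sketch based on the above, not on documentation): CACurrentMediaTime() itself returns seconds since boot as a double, which a performance.now-style wrapper would scale to milliseconds:

#include <QuartzCore/QuartzCore.h>  // CACurrentMediaTime
#include <stdio.h>

int main(void) {
    double seconds = CACurrentMediaTime();  // seconds since boot, mach_absolute_time based
    double ms = seconds * 1000.0;           // presumably what nativePerformanceNow reports
    printf("%.6f ms since boot\n", ms);
    return 0;
}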
I am learning Maxima, but having a hard time finding out how to obtain the CPU time used by a call to integrate when it is inside a loop construct.
The problem is that the function time(%o1) gives the CPU time used to compute output line %o1, but inside a loop the whole loop counts as one operation, so I can't use time() to time a single call.
Here is an example
lst : [sin(x), cos(x)];
for i thru length(lst) do
(
    result : integrate(lst[i], x)
);
I want to find the CPU time used by each call to integrate, not the CPU time used by the whole loop. Adding showtime: true$ does not really help; I need to obtain the CPU time for each call and save the value to a variable.
Is there a way in Maxima to find the CPU time used by each call to integrate in the above loop?
Using wxMaxima 15.04.0, windows 7.
Maxima version: 5.36.1
Lisp: SBCL 1.2.7
I was looking for something like Mathematica's AbsoluteTiming function.
Instead of elapsed real time, which on my GCL Maxima seems to return absolute real time in seconds, try the Lisp function GET-INTERNAL-RUN-TIME, which you can call from the Maxima command line by
?get-internal-run-time();
This should return run time on any Common Lisp system. In GCL it is reported in units of 1/100 of a second (100 ticks per second).
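A sketch of timing a single call this way (assuming GCL's 100 internal time units per second; note that in some Maxima versions the hyphens in a Lisp name must be escaped, as in ?get\-internal\-run\-time):

t0 : ?get-internal-run-time();
result : integrate(sin(x), x);
t1 : ?get-internal-run-time();
cpu_seconds : (t1 - t0) / 100.0;  /* CPU time of the single integrate call */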
Perhaps the function you need is elapsed_real_time.
EDIT: you would use it like this:
for i ...
    do block([t0, t1],
        t0 : elapsed_real_time(),
        integrate(...),
        t1 : elapsed_real_time(),
        time[i] : t1 - t0);
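Applied to the loop from the question, that might look like this (a sketch; the result array is named runtimes here to avoid clashing with Maxima's built-in time function):

lst : [sin(x), cos(x)];
for i thru length(lst) do
    block([t0, t1],
        t0 : elapsed_real_time(),
        result : integrate(lst[i], x),
        t1 : elapsed_real_time(),
        runtimes[i] : t1 - t0  /* seconds of real time for this call */
    );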
I'm trying to get the system time in microseconds to measure the time between two events; I need to measure at minimum to 1/10th of a millisecond.
I know of NSDate and CFAbsoluteTimeGetCurrent, but both only do milliseconds. Does anyone know of a way I can do this?
CFAbsoluteTimeGetCurrent calls gettimeofday, which has microsecond resolution.
NSDate is simply a wrapper around a CFAbsoluteTime, so it also has microsecond resolution.
Why do you think they only have millisecond resolution?
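For example, the microseconds are directly visible in the timeval that gettimeofday fills in (a quick C check):

#include <sys/time.h>
#include <stdio.h>

int main(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);  // wall-clock time with microsecond resolution
    printf("%ld s + %d us\n", (long)tv.tv_sec, (int)tv.tv_usec);
    return 0;
}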
See mach_absolute_time. It has granularity down to nanoseconds.
http://shiftedbits.org/2008/10/01/mach_absolute_time-on-the-iphone/
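A minimal C sketch of timing an interval with it; the raw ticks must be scaled by the timebase to obtain nanoseconds:

#include <mach/mach_time.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);  // ratio converting ticks to nanoseconds

    uint64_t start = mach_absolute_time();
    /* ... the interval being measured ... */
    uint64_t end = mach_absolute_time();

    uint64_t ns = (end - start) * tb.numer / tb.denom;
    printf("elapsed: %.1f us\n", ns / 1000.0);
    return 0;
}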
The IMediaSample SetTime() function expects two REFERENCE_TIME parameters. REFERENCE_TIME is defined as type LongLong in Delphi 6, the language I am using for my DirectShow application. However, the first parameter of the callback method that the DirectShow Sample Grabber filter uses to pass the sample time of a new media sample is typed as double. How do I convert between these two values so I can compare the sample times of media samples I receive from the Sample Grabber filter with the REFERENCE_TIME values that I generate in my push source filter's FillBuffer() method?
Also, would the sample time provided by the Sample Grabber filter in the callback method be considered the start time of a media sample, or the end time?
Simple part: the double is in seconds, and REFERENCE_TIME is in 100 ns units. Hence the conversion is simple: multiply or divide by 1E+7.
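In Delphi that could look like the following sketch (the function names are mine; Round returns an Int64, which matches REFERENCE_TIME):

function RefTimeFromSeconds(Seconds: Double): Int64;
begin
  Result := Round(Seconds * 1.0E7);  // seconds -> 100 ns units
end;

function SecondsFromRefTime(RefTime: Int64): Double;
begin
  Result := RefTime / 1.0E7;         // 100 ns units -> seconds
end;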
Not so simple part: you capture a time in the grabber in one filter graph, and you time-stamp data in your filter in another graph. Both graphs use time stamps to indicate streaming/presentation time, which is relative to each graph's "run time". That is, when a media sample is passed between graphs, there may also be a time stamp offset involved.
As for the end time: with video media samples, the sample stop time may be omitted or set equal to the start time; with audio, the stop time is normally something you can compute by adding the duration of the payload data the buffer holds to the start time.
Bonus reading on MSDN: Time and Clocks in DirectShow
To me it has also been a bit difficult to think in 100-nanosecond units, so I often convert between milliseconds and 100 ns units. It is pretty trivial to write your own functions, but if you use the DirectShow base classes, there is also a macro exported in the file RefTime.h.
This would also do the conversion:
double time = 1000;  // milliseconds
REFERENCE_TIME direct_show_time = MILLISECONDS_TO_100NS_UNITS(time);  // 10,000,000 units of 100 ns