Delphi: creating thread-safe writing functions

I'm using Delphi XE to create a multi-threaded application.
I want one thread to handle all the writing into a Firebird/SQLite database log.
And I want several other threads that do some jobs and use the log-writing thread whenever they need to.
Thread1 = the log-writing thread
Thread2 = does some math and from time to time uses Thread1 to write a log entry.
Thread3 = does some other stuff and from time to time uses Thread1 to write a log entry.
and so on
For simplicity I have created a method in Thread1 named WriteCollectionLog that all the other threads call to write log entries into Thread1's memory (a collection); Thread1's Execute method then handles the actual write into the database. This method is intended to be used "fire and forget".
Now, how do I make this thread-safe? Or can I even make it thread-safe (by using TCriticalSection)?
thread2.WriteCollectionLog ...
thread3.WriteCollectionLog ...
procedure Thread1.WriteCollectionLog(aIDWORKFLOW: Integer);
var
  workItem: TLogFIREBIRD_Item;
begin
  readWriteCriticalSection.Acquire; // <-- will this suspend the calling thread (Thread2, Thread3) rather than Thread1?
  try
    // do stuff
  finally
    readWriteCriticalSection.Release;
  end;
end;
Regards
Razvan

Just implement what you wrote: only write to the DB from Thread1.
Don't expect to "call" one thread from another in the way you might use Synchronize to run some code in the main thread. That would be blocking, which is not what you want, I guess.
The idea is to have a small in-memory structure, shared between the threads, protected by a critical section. One goal is to keep any critical section as small as possible.
For instance, you could use a TObjectQueue: Thread2 and Thread3 push data onto the queue, and Thread1 waits for pending data, dequeues it and writes it - and if you wrap the writes in a transaction, it will be faster than a naive blocking process. If you just want to write some log text, a TArray<string> with an associated lock is enough.
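Here is a minimal sketch of that idea, assuming hypothetical names (TLogWriterThread, WriteLogToDatabase) and a simple string payload; it uses a TQueue<string> protected by a TCriticalSection plus a TEvent to wake the writer. Adapt the payload type and the actual database call to your own code:

uses
  Classes, SysUtils, SyncObjs, Generics.Collections;

type
  TLogWriterThread = class(TThread)
  private
    FLock: TCriticalSection;
    FQueue: TQueue<string>;
    FDataReady: TEvent;
  protected
    procedure Execute; override;
  public
    constructor Create;
    destructor Destroy; override;
    // "fire and forget": safe to call from any worker thread
    procedure WriteCollectionLog(const AMessage: string);
  end;

constructor TLogWriterThread.Create;
begin
  inherited Create(False);
  FLock := TCriticalSection.Create;
  FQueue := TQueue<string>.Create;
  FDataReady := TEvent.Create(nil, False, False, '');
end;

destructor TLogWriterThread.Destroy;
begin
  Terminate;           // ask Execute to finish
  FDataReady.SetEvent; // wake it up so it can exit promptly
  inherited;           // TThread.Destroy waits for the thread
  FDataReady.Free;
  FQueue.Free;
  FLock.Free;
end;

procedure TLogWriterThread.WriteCollectionLog(const AMessage: string);
begin
  // Only the queue access is protected, so the critical section is held very briefly.
  FLock.Acquire;
  try
    FQueue.Enqueue(AMessage);
  finally
    FLock.Release;
  end;
  FDataReady.SetEvent; // wake the writer thread
end;

procedure TLogWriterThread.Execute;
var
  Pending: TList<string>;
  Msg: string;
begin
  Pending := TList<string>.Create;
  try
    while not Terminated do
    begin
      FDataReady.WaitFor(1000); // wake on new data, or once per second
      // Drain the queue under the lock, then write to the database outside it.
      FLock.Acquire;
      try
        while FQueue.Count > 0 do
          Pending.Add(FQueue.Dequeue);
      finally
        FLock.Release;
      end;
      for Msg in Pending do
      begin
        // WriteLogToDatabase(Msg); // hypothetical: your Firebird/SQLite INSERT,
        // ideally wrapped in one transaction per batch
      end;
      Pending.Clear;
    end;
  finally
    Pending.Free;
  end;
end;

Worker threads then simply call LogWriter.WriteCollectionLog('...') and return immediately; the writer batches whatever has accumulated since its last pass.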

Related

Does async operation in iOS create a new thread internally, and allocate task to it?

Does an async operation in iOS internally create a new thread and allocate a task to it?
An async operation is capable of creating a new thread internally and allocating a task to it, but only if the particular operation you run actually does so. In other words: there is no direct correlation.
I assume that by async you mean something like DispatchQueue.main.async { <#code here#> }. This does not create a new thread, since the main thread should already be present. How and why this works can be explained (in an oversimplified way) with an array of operations and an endless loop, which is basically what a RunLoop is there for. Imagine the following:
Array<Operations> allOperations;

int main() {
    bool continueRunning = true;
    for (; continueRunning ;) {
        allOperations.forEach { $0.run(); }
        allOperations.clear();
    }
    return 0;
}
And when you call something like DispatchQueue.main.async, it basically creates a new operation and inserts it into allOperations. The same thread will eventually come around in a new iteration of the for-loop and call your operation, asynchronously from the caller's point of view. Again, keep in mind that this is all over-simplified, just to illustrate the idea behind it. From this you can also imagine how, for instance, timers work: the operation checks whether the current time is past the next scheduled execution and, if so, triggers the timer's operation. That is also why timers cannot be very precise: they depend on the rest of the execution, and the thread may be busy.
A new thread, on the other hand, may be spawned when you create a new queue: DispatchQueue(label: "Will most likely run on a new thread"). Exactly when (or if) a thread is created is not fixed; it may vary between implementations and the systems they run on. The tool only guarantees what it will do, not how it will do it.
And then there is also the Thread class, which can generate a new thread. But the deal is the same as above: it might create a new thread immediately, or it might do it later, lazily. All it guarantees is the behaviour of its public interface.
I am not saying that these things do change over time, implementation or system; only that they potentially could, and might already have.

What does it mean for something to be thread safe in iOS?

I often come across the term "thread safe" and wonder what it means. For example, in Firebase or Realm, some objects are considered "thread safe". What exactly does it mean for something to be thread safe?
Thread Unsafe
- An object is thread unsafe if it can be modified by more than one thread at the same time.
Thread Safe
- An object is thread safe if it cannot be modified by more than one thread at the same time.
Generally, immutable objects are thread-safe.
An object is said to be thread safe if more than one thread can call its methods or access its member data without any issues; an "issue" being broadly defined as a departure from the behaviour seen when the object is accessed from only one thread.
For example, an object that contains the code i = i + 1 for a regular integer i would not be thread safe: two threads might reach that statement at the same time, and each might read the original value of i, increment it, and write back the same single incremented value. In that case i is incremented only once, where it ought to have been incremented twice.
After searching for the answer, I got the following from this website:
Thread safe code can be safely called from multiple threads or concurrent tasks without causing any problems (data corruption, crashing, etc). Code that is not thread safe must only be run in one context at a time. An example of thread safe code is let a = ["thread-safe"]. This array is read-only and you can use it from multiple threads at the same time without issue. On the other hand, an array declared with var a = ["thread-unsafe"] is mutable and can be modified. That means it’s not thread-safe since several threads can access and modify the array at the same time with unpredictable results. Variables and data structures that are mutable and not inherently thread-safe should only be accessed from one thread at a time.
iOS Thread safe
[Atomicity, Visibility, Ordering]
[General lock, mutex, semaphore]
Thread safe means that your program works as expected. It is about a multithreading environment, where shared resources bring problems such as data races and race conditions [About].
Apple provides us with synchronization tools:
Atomicity
Atomic operations - a lock-free mechanism based on hardware instructions, for example Compare-And-Swap (CAS) [More]...
Objective-C OSAtomic, atomic property attribute[About]
[Swift Atomic Operations]
Visibility
Volatile variable - reads the value from memory (no caching)
Objective-C volatile
Ordering
Memory barriers - guarantee up-to-date data [About]
Objective-C OSMemoryBarrier
Finding problems in your code
Thread Sanitizer - internally uses self.recordAndCheckWrite(var) to figure out when (timestamp) and which thread accesses a variable
Synchronisation
Locks - a thread can take a lock, and nobody else can access the resource until it is released. NSLock.
Semaphore - consists of a queue of threads and a counter value, with a wait() and signal() API. A semaphore allows a number of threads (the counter value) to work with the resource at a given moment. DispatchSemaphore, POSIX semaphore - semaphore_t. An App Group allows POSIX semaphores to be shared.
Mutex (mutual exclusion) - a kind of semaphore where only one thread at a time can acquire it and work with the protected block; all other threads are blocked until it is released. The main difference from a lock is that a mutex can also work between processes (not only threads). It also includes a memory barrier.
var lock = os_unfair_lock_s()
os_unfair_lock_lock(&lock)
//critical section
os_unfair_lock_unlock(&lock)
NSLock - POSIX mutex lock - pthread_mutex_t, Objective-C @synchronized.
let lock = NSLock()
lock.lock()
//critical section
lock.unlock()
Recursive lock (lock reentrance) - a thread can acquire the same lock several times. NSRecursiveLock.
Spin lock - the waiting thread repeatedly checks (polls) whether it can take the lock. It is useful for short operations: the thread is not blocked, so expensive operations like a context switch are not needed.
[GCD]
A common approach is to use a custom serial queue with async calls, so that all access to the shared memory happens one at a time:
Serial read and write access:
private let queue = DispatchQueue(label: "com.company")
self.queue.async {
//read and write access to shared resource
}
Concurrent read and serial write access (reader/writer): when a write occurs, all previously submitted reads finish first, then the write runs, then all other reads proceed:
private let queue = DispatchQueue(label: "com.company", attributes: .concurrent)
//read
self.queue.sync {
//read
}
//write
self.queue.sync(flags: .barrier) {
//write
}
Operations
[Actors]
actor MyData {
    var sharedVariable = "Hello"

    // Actor-isolated stored properties cannot be assigned from outside
    // the actor, so expose a method for the mutation.
    func update(_ value: String) {
        sharedVariable = value
    }
}

// usage
Task {
    await self.myData.update("World")
}
Multi threading:
[Concurrency vs Parallelism]
[Sync vs Async]
[Mutable vs Immutable] [let vs var]
[Swift thread safe Singleton]
[Swift Mutable/Immutable collection]
pthread - POSIX thread [About]
NSThread
To give a simple example: if something is shared across multiple threads without any issues such as crashes, it is thread safe. For example, if you have a constant (let value = ["Facebook"]) and it is shared across multiple threads, it is thread safe because it is read-only and cannot be modified. Whereas if you have a variable (var value = ["Facebook"]), it may cause a potential crash or data loss when shared across multiple threads, because its data can be modified.

What happens when ShowMessage is called?

I am working on a project where I communicate with another device over a COM port.
For incoming data I am using the VaComm1RXchar event, where I store the message into an array and increment MsgIndex, which represents the number of messages.
Then I call the function where I work with this message.
Inside this function is this timeout loop where I wait for the message:
while MsgIndex < 1 do
begin
  stop := GetTickCount;
  if (stop - start) > timeout then
  begin
    MessageBox(0, 'Timeout komunikace !', 'Komunikace', MB_OK);
    exit(false);
  end;
  sleep(10);
end;
The strange thing for me is that, written like this, it always ends with a timeout. But when I put a ShowMessage('Waiting') before the while loop, it works correctly.
Does anyone know what can cause this and how I can solve it? Thanks in advance!
We can deduce that the VaComm1RXchar event is a synchronous event and that by blocking your program in a loop you are preventing the normal message processing that would allow that event to execute.
Showing a modal dialog box, on the other hand, passes message handling to that dialog, so the message queue is properly serviced and your Rx events are handled normally.
You can be certain this is the case if this also works (please never write code like this - it's just to prove the point):
while MsgIndex < 1 do
begin
  stop := GetTickCount;
  if (stop - start) > timeout then
  begin
    MessageBox(0, 'Timeout komunikace !', 'Komunikace', MB_OK);
    exit(false);
  end;
  Application.ProcessMessages; // service the message queue so that
  sleep(10);                   // your Rx event can be handled
end;
If there is a lesson here, it is that RS-232 communication really needs to be done on a background thread. Almost all implementations of "some chars have been received" events lead to dreadful code for the very reason you are discovering. Your main thread needs to be free to process the message that characters have been received, but at the same time you must have some parallel process that is waiting for those received characters to complete a cogent instruction. There is no sensible way, in an event-driven program, to manage both the user interface and the communication port at the same time on one thread.
A set of components like AsyncPro**, for example, wraps this functionality into data packets: synchronous events are still used, but the components handle start and end string (or byte) detection for you on a worker thread. This removes one level of polling from the main thread (i.e. you always get an event when a complete data packet has arrived, not a partial one). Alternatively, you can move the communication work to a custom thread and manage this yourself.
In either case, this is only partly a solution, of course, since you still cannot stick to writing long procedural methods in synchronous event handlers that require waiting for com traffic. The second level of polling, which is managing a sequence of complete instructions, will still have you needing to pump the message queue if your single procedure needs to react to a sequence of more than one comport instruction. What you also need to think about is breaking up your long methods into shorter pieces, each in response to specific device messages.
Alternatively, for heavily procedural process automation, it is also often a good idea to move that work to a background thread as well. This way your worker threads can block on synchronization objects (or poll in busy loops for status updates) while they wait for events from hardware. One thread can manage low level comport traffic, parsing and relaying those commands or data packets, while a second thread can manage the higher level process which is handling the sequence of complete comport instructions that make up your larger process. The main thread should primarily only be responsible for marshalling these messages between the workers, not doing any of the waiting itself.
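For illustration only, here is a minimal sketch of what that higher-level worker might look like, assuming hypothetical names (TDeviceWorker, FInstructionReady) and that your comport layer - the Rx event handler or a low-level thread - calls FInstructionReady.SetEvent once it has assembled a complete instruction:

uses
  Classes, SyncObjs;

type
  TDeviceWorker = class(TThread) // hypothetical high-level protocol worker
  private
    FInstructionReady: TEvent; // set by the comport layer when a full instruction has arrived
  protected
    procedure Execute; override;
  public
    constructor Create;
    destructor Destroy; override;
  end;

constructor TDeviceWorker.Create;
begin
  inherited Create(False);
  FInstructionReady := TEvent.Create(nil, False, False, '');
end;

destructor TDeviceWorker.Destroy;
begin
  Terminate;
  FInstructionReady.SetEvent; // let Execute exit promptly
  inherited;
  FInstructionReady.Free;
end;

procedure TDeviceWorker.Execute;
begin
  while not Terminated do
  begin
    // Block here instead of polling in the main thread.
    if FInstructionReady.WaitFor(5000) = wrSignaled then
    begin
      // A complete instruction has arrived; fetch it from a shared, locked
      // buffer and react to it - send the reply, advance to the next step, etc.
    end
    else
    begin
      // Timeout: retry or report a communication error without freezing the UI.
    end;
  end;
end;

The main thread never waits; it only updates the UI when the worker notifies it (for example via Synchronize or a posted message).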
See also : I do not understand what Application.ProcessMessages in Delphi is doing
** VAComm may also support something like this, I don't know. The API and documentation are not publicly available from TMS for ASync32 so you'll need to consult your local documentation.
What happens is that a modal message loop executes when the dialog is showing. That this message loop changes behaviour for you indicates that your communications with the device require the presence of a message loop.
So the solution for you is to service the message queue. A call to Application.ProcessMessages in your loop will do that, but it also creates other problems, like making your UI re-entrant.
Without knowing more about your program, I cannot offer more detailed advice as to how you should solve this problem.

Abort a long-running process over a database table: ideas?

I must process a very long database table and am thinking about the most common way to abort this loop. The principal code sequence goes like this:
procedure TForm.ProcessmyTable(Sender: TObject);
begin
  .....
  myTable.First;
  repeat
    ReadSingleRecordfromTable(MyTable, aRecord);
    ProcessMyRecord(aRecord);
    MyTable.Next;
  until MYTable.EOF;
end;
unit .... ;

procedure ProcessMyRecord(aRecord: TMyDataRecord);
begin
  // do not have user interface stuff here
  // Application.ProcessMessages will not work here !!!
  .... (long running code sequence)
end;
I could use a timer and break the loop with a flag variable set by the timer... but is this really the cleverest way of solving this issue?
If this code runs in the main thread, then you will need to service the message queue (i.e. call Application.ProcessMessages) if you want the user to interact with your program and abort. In which case I think you already know the solution. Call Application.ProcessMessages. If the user chooses to abort, set a boolean flag and check that flag regularly in the inner-most loop.
Of course, this is all rather messy. The fundamental problem is that you are performing long-running actions on the GUI thread. That's something you should not do. Move the database processing code onto a different thread. If the user chooses to abort, signal to the thread that it is to abort. For example a call to the Terminate method of the thread would be one way to do this.
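A minimal sketch of that approach, reusing the names from the question (myTable, ReadSingleRecordfromTable, ProcessMyRecord) inside a hypothetical TTableProcessor thread. Note that most database access components require the connection and dataset to be created and used in the same thread, so the worker should own its own connection rather than touching the form's:

type
  TTableProcessor = class(TThread) // hypothetical worker that owns the table loop
  protected
    procedure Execute; override;
  end;

procedure TTableProcessor.Execute;
var
  aRecord: TMyDataRecord;
begin
  myTable.First;
  while not (myTable.EOF or Terminated) do
  begin
    ReadSingleRecordfromTable(MyTable, aRecord);
    ProcessMyRecord(aRecord);
    MyTable.Next;
  end;
end;

// To abort from the UI (e.g. a Cancel button's OnClick):
//   TableProcessor.Terminate; // the loop exits after the current record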
When do you want to abort? If it takes too long, or if the user says 'stop'?
In both cases change your
until MYTable.EOF;
to
until MYTable.EOF or Aborted;
Then set your Aborted boolean either when the timer triggers or when the user presses a key (note that you then have to call Application.ProcessMessages in the loop so the program can process the keypress). This will not abort inside your processing routine, but after each record. If that is not fast enough you will have to show your record-processing routine.
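A compact sketch of that flag approach, assuming a form-level field FAborted: Boolean and a hypothetical Cancel button:

procedure TForm.btnCancelClick(Sender: TObject); // hypothetical Cancel button
begin
  FAborted := True;
end;

procedure TForm.ProcessmyTable(Sender: TObject);
var
  aRecord: TMyDataRecord;
begin
  FAborted := False;
  myTable.First;
  repeat
    ReadSingleRecordfromTable(MyTable, aRecord);
    ProcessMyRecord(aRecord);
    MyTable.Next;
    Application.ProcessMessages; // let the button click (or keypress) be handled
  until MyTable.EOF or FAborted;
end;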
If it's such a long process that the user might want to abort, then surely in a Windows app there should be some interaction with the GUI: a count of records done or a progress bar (if not on every record, then on a periodic basis). Any call to update a label or progress bar provides the opportunity to set an abort flag (so ProcessMessages is not required). Plus, if the user can see some progress they may be less likely to abort when bored but 95% complete.
IMHO :)

How to start a thread in a service in Delphi 7 on Windows XP?

We need to start a thread in a service application we developed.
We did it in the OnExecute event and it failed, and later we did it in the OnStart event, and it failed again. Maybe we have to do something else to start the thread.
The only line of code we have is MonitorThread.Start;
Where and how can we start the thread?
Thanks.
On the face of it, starting a thread in a service is no different from starting a thread in any other kind of application. Simply instantiate the thread object and let it run. If you created the object in a suspended state, then call Start on it (or, in versions earlier than 2010, Resume).
MonitorThread := TMonitorThread.Create;
MonitorThread.Start; // or MonitorThread.Resume
If that doesn't work, then you need to take a closer look at exactly what doesn't work. Examine exception messages and return codes. Use the debugger to narrow things down.
If it's possible, I advise you to not create the thread suspended. Instead, just provide the object all the parameters it needs in its constructor. Let it initialize itself, and it will start running just before the constructor returns to the caller. No need for additional thread management outside the thread object.
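For reference, a minimal sketch of the whole pattern, using hypothetical TMonitorThread and TMyService names; the service creates the thread in its OnStart handler and shuts it down in OnStop:

type
  TMonitorThread = class(TThread)
  protected
    procedure Execute; override;
  end;

var
  MonitorThread: TMonitorThread;

procedure TMonitorThread.Execute;
begin
  while not Terminated do
  begin
    // ... do the actual monitoring work here ...
    Sleep(250); // or, better, wait on a synchronization object
  end;
end;

// In the service (event handlers wired up in the designer):
procedure TMyService.ServiceStart(Sender: TService; var Started: Boolean);
begin
  MonitorThread := TMonitorThread.Create(False); // False = start running immediately
  Started := True;
end;

procedure TMyService.ServiceStop(Sender: TService; var Stopped: Boolean);
begin
  MonitorThread.Terminate;
  MonitorThread.WaitFor;
  MonitorThread.Free;
  Stopped := True;
end;

In Delphi 7 there is no Start method, so either create the thread non-suspended as above or call Resume after creating it suspended.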
