Threadsafe
1 + 1 = 1 (the classic lost update: two threads each add one to a shared value, but one of the increments is lost)
If a sprintf-style function uses a global buffer to build its string, it will not work well when two threads call it at the same time: both write to the same data area, and the string each gets back depends on what the other thread is doing. Handling access to shared resources is the basic problem of multi-threaded programming, and code that is designed to handle it correctly is called thread safe.
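As a minimal sketch of the problem (the function name and buffer size here are made up for illustration), a formatter that returns a pointer into one static buffer works fine single-threaded but not when two threads call it at once:

```cpp
#include <cstddef>
#include <cstdio>

// NOT thread safe: every call shares the same static buffer, so two
// threads calling format_int at the same time get pointers to the same
// memory, and its contents depend on whichever thread wrote last.
const char* format_int(int value) {
    static char buffer[32];
    std::snprintf(buffer, sizeof(buffer), "%d", value);
    return buffer;
}

// Thread safe: no shared writable state; each caller supplies its own buffer.
void format_int_safe(int value, char* out, std::size_t size) {
    std::snprintf(out, size, "%d", value);
}
```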
Code is thread safe if:
All shared data between the threads is read-only.
Shared data that can be written is synchronized so that only one thread can interact with it at a time.
Access to shared data is synchronized using synchronization primitives: to use some shared data, lock its synchronization primitive, do the work on the shared data, and then unlock the primitive. If another thread tries to acquire the lock while some thread holds it, the acquiring thread will stall and wait inside the lock function until it gets the lock (see the sketch below).
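A minimal sketch of that pattern, using std::mutex and std::lock_guard from the C++ standard library (the shared log string is just a placeholder for any writable shared data):

```cpp
#include <iostream>
#include <mutex>
#include <string>
#include <thread>

std::string log_text;   // shared, writable data
std::mutex log_mutex;   // synchronization primitive guarding it

void append_line(const std::string& line) {
    std::lock_guard<std::mutex> lock(log_mutex);  // acquire; waits if another thread holds it
    log_text += line;                             // work on the shared data
    log_text += '\n';
}                                                 // lock released when the guard goes out of scope

int main() {
    std::thread t1([] { for (int i = 0; i < 1000; ++i) append_line("from t1"); });
    std::thread t2([] { for (int i = 0; i < 1000; ++i) append_line("from t2"); });
    t1.join();
    t2.join();
    std::cout << log_text.size() << "\n";  // lines interleave but never corrupt each other
}
```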
Lock-Free Programming
Lock-free programming is not necessarily free of locks; its goal is to avoid 'holding the lock', so that a stalled or suspended thread can never block the others from making progress.
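A minimal sketch of the difference, assuming a simple shared counter: a compare-and-swap retry loop updates shared data without any thread ever holding a lock; a thread that loses the race just retries. (A real counter would simply use fetch_add; the retry loop is the general pattern behind lock-free stacks and queues.)

```cpp
#include <atomic>
#include <iostream>
#include <thread>

std::atomic<long> counter{0};

void add_many(int n) {
    for (int i = 0; i < n; ++i) {
        long expected = counter.load(std::memory_order_relaxed);
        // If another thread changed the value between the load and the
        // exchange, the exchange fails, 'expected' is refreshed, and we retry.
        while (!counter.compare_exchange_weak(expected, expected + 1,
                                              std::memory_order_relaxed)) {
        }
    }
}

int main() {
    std::thread t1(add_many, 100000);
    std::thread t2(add_many, 100000);
    t1.join();
    t2.join();
    std::cout << counter.load() << "\n";  // always 200000, and no thread ever blocked
}
```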
Lock-free Data Structures - 2014
1 - Introduction
2 - Lock-free Data Structures. Basics: Atomicity and Atomic Primitives
3 - Lock-free Data Structures. Memory Model.
4 - Lock-free Data Structures. The Inside. Memory Management Schemes
5 - Lock-free Data Structures. The Inside. RCU
6 - Lock-Free Data Structures. The Evolution of a Stack
7 - Lock-Free Data Structures. Yet Another Treatise
8 - Lock-Free Data Structures. Exploring Queues
A Fast Lock-Free Queue for C++ - 2014
A Fast General Purpose Lock-Free Queue for C++ - 2014
Detailed Design of a Lock-Free Queue - 2014
Solving the ABA Problem for Lock-Free Free Lists - 2014
A Lock-Free... Linear Search? - 2013
The World's Simplest Lock-Free Hash Table - 2013
An Introduction to Lock-Free Programming - 2012
Memory Ordering at Compile Time - 2012
Memory Barriers Are Like Source Control Operations - 2012
Weak vs. Strong Memory Models - 2012
Implementing Dekker's algorithm with Fences - 2010
Definitions of Non-blocking, Lock-free and Wait-free - 2010
Locks Aren’t Slow; Lock Contention Is - 2011
Lockless Programming Considerations for Xbox 360 and Microsoft Windows - 2008
Common problems
Locks Aren't Slow; Lock Contention Is - 2011
Data Races at the Processor Level - 2011
Race condition
There is a race condition when the outcome of a thread's work can change depending on the timing of other threads.
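This is the '1 + 1 = 1' case from the top of the page. A minimal sketch, assuming a plain shared counter: both threads can read the same old value, each adds 1, and one increment is lost (such unsynchronized access is formally a data race, i.e. undefined behaviour):

```cpp
#include <iostream>
#include <thread>

int counter = 0;  // shared, writable, not synchronized

void add_many(int n) {
    for (int i = 0; i < n; ++i)
        ++counter;  // read-modify-write: both threads can read the same old value
}

int main() {
    std::thread t1(add_many, 100000);
    std::thread t2(add_many, 100000);
    t1.join();
    t2.join();
    // The result depends on how the threads interleave; it is usually well below 200000.
    std::cout << counter << "\n";
}
```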
Deadlocks
A deadlock is when two or more threads block each other from progressing because each holds a lock on a resource the other needs. Say there are two threads, T1 and T2: T1 tries to move an item from A to B, and T2 from B to A. T1 gets a lock on A and picks out its item, and at the same time T2 gets a lock on B and picks up its item. When T1 tries to get a lock on B it has to wait, and at the same time T2 tries to lock A and also starts to wait. They will now both wait until the power of the computer runs out.
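A minimal sketch of the A/B scenario with two std::mutex objects (the functions are hypothetical placeholders for the actual moves): locking in opposite orders can deadlock, while acquiring both locks together with std::scoped_lock (C++17), or always locking in the same order, avoids it.

```cpp
#include <mutex>
#include <thread>

std::mutex a;  // guards container A
std::mutex b;  // guards container B

void move_a_to_b() {
    std::lock_guard<std::mutex> lock_a(a);  // T1 holds A...
    std::lock_guard<std::mutex> lock_b(b);  // ...then waits for B
    // move an item from A to B
}

void move_b_to_a() {
    std::lock_guard<std::mutex> lock_b(b);  // T2 holds B...
    std::lock_guard<std::mutex> lock_a(a);  // ...then waits for A: possible deadlock with move_a_to_b
    // move an item from B to A
}

// Deadlock-free variant: std::scoped_lock acquires both mutexes with a
// deadlock-avoidance algorithm; always locking in the same order works too.
void move_b_to_a_safe() {
    std::scoped_lock lock(a, b);
    // move an item from B to A
}

int main() {
    // Running move_a_to_b together with move_b_to_a can hang forever;
    // the safe variant cannot.
    std::thread t1(move_a_to_b);
    std::thread t2(move_b_to_a_safe);
    t1.join();
    t2.join();
}
```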
Resource starvation
If a thread never gets the resources it needs, it will stall and fail to progress. This can happen if a thread T1 wants access to a resource but other threads with higher priority always take it first.
False Sharing
False sharing is a cache problem that occurs when two or more threads use data located close together, on the same cache line. When one thread changes its data, the whole cache line has to be synchronized between the cores, which slows things down even though the threads never touch the same data (see the sketch below).
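A minimal sketch, assuming a 64-byte cache line (a typical size, not a guarantee): two threads each increment their own counter, but when the counters sit in the same cache line every write invalidates the other core's copy; padding each counter to its own line removes the interference.

```cpp
#include <atomic>
#include <thread>

// Both counters fit in one cache line: each thread's writes keep
// invalidating the other core's copy of that line (false sharing).
struct Shared {
    std::atomic<long> a{0};
    std::atomic<long> b{0};
};

// Each counter gets its own 64-byte line, so the threads stop interfering.
struct Padded {
    alignas(64) std::atomic<long> a{0};
    alignas(64) std::atomic<long> b{0};
};

int main() {
    Padded counters;  // swap in Shared to measure the slowdown
    std::thread t1([&] { for (int i = 0; i < 10000000; ++i) counters.a.fetch_add(1, std::memory_order_relaxed); });
    std::thread t2([&] { for (int i = 0; i < 10000000; ++i) counters.b.fetch_add(1, std::memory_order_relaxed); });
    t1.join();
    t2.join();
}
```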
CPU Caches and Why You Care - 2010. Video.
Eliminate False Sharing - 2009
False Sharing hits again! - 2008
Links