Aniket Botre

Day 29: Navigating Shared State Concurrency in Rust - Sync, Send, and Atomic Wonders

Welcome back to the Rust realm, fellow code adventurers! On Day 29 of our #100DaysOfCode challenge, we're taking a plunge into the intricacies of shared state concurrency. Get ready to witness the magic of Mutex, Arc, the Sync and Send traits, and a dash of atomic operations. It's like a digital ballet, and we've got front-row seats!


Intro to Shared State Concurrency

In the world of concurrent programming, sharing is not always caring. It can often lead to race conditions, which are about as fun as they sound. So, how do we share data amongst threads without stepping on each other's toes? Enter Shared State Concurrency.

Shared State Concurrency is like the United Nations of your program: a safe space where threads can share data without trampling each other's work. We achieve this harmony using two key players: Mutex and Arc, the Batman and Robin of concurrency control.

Mutex Magic: Ensuring Exclusive Access

A Mutex (mutual exclusion) is like the bouncer at a club: only one thread can access the data at a time.

Here's an example of Mutex in action:

use std::sync::Mutex;

fn main() {
    let m = Mutex::new(5);

    {
        let mut num = m.lock().unwrap();
        *num = 6;
    }

    println!("Value of m is {:?}", m);
    // Output: Value of m is Mutex { data: 6, poisoned: false, .. }
}

In this example, we create a mutex and store it in a variable named m, then open a new scope with curly braces. Inside the block, we call the lock method on the mutex; lock returns a Result wrapping a MutexGuard smart pointer, which we unwrap and bind to num. MutexGuard implements the Deref trait, so *num gives us access to the data inside the mutex, and the Drop trait, so the lock is released automatically when the guard goes out of scope at the end of the block. That's why the final println! sees the updated value of 6.
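To make that Drop behaviour visible, here is a minimal sketch (a variation on the example above, not from the original post) that releases the guard explicitly with drop() and then takes the lock a second time:

use std::sync::Mutex;

fn main() {
    let m = Mutex::new(5);

    // Take the lock; `guard` is a MutexGuard<i32>.
    let mut guard = m.lock().unwrap();
    *guard = 6;

    // Dropping the guard releases the lock immediately,
    // instead of waiting for the end of the scope.
    drop(guard);

    // The mutex is free again, so this second lock() succeeds right away.
    let guard = m.lock().unwrap();
    println!("Value is {}", *guard);
    // Output: Value is 6
}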

Arc - The Protector of Shared Data

But wait, there's more! Arc (Atomic Reference Counting) swoops in when Mutex needs a sidekick. It's like having a superhero duo, ensuring shared ownership without compromising safety. Watch them in action:

use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let data = Arc::new(Mutex::new(5));
    let mut handles = Vec::new();

    for _ in 0..10 {
        // Clone the Arc so each thread gets its own handle to the same Mutex.
        let data = Arc::clone(&data);
        handles.push(thread::spawn(move || {
            let mut data = data.lock().unwrap();
            *data += 1;
        }));
    }

    // Join every thread so all increments finish before we read the result.
    for handle in handles {
        handle.join().unwrap();
    }

    println!("Result is {}", *data.lock().unwrap());
    // Output: Result is 15
}

In this example, we wrap a counter starting at 5 in an Arc and a Mutex, then spawn 10 threads that each increment it. Arc::clone(&data) gives every thread its own handle to the same allocation, so the data lives as long as any thread needs it, while lock().unwrap() ensures only one thread touches the value at a time. Because we join every handle before reading, the printed result is always 15, proving our data is safe from race conditions.


Extensible Concurrency with Sync and Send Traits

We've seen Batman and Robin, but what about the Justice League? The Sync and Send traits in Rust allow for extensible concurrency. They're the superheroes that ensure our code is safe for multithreading.

The Send trait indicates that ownership of the value can be transferred between threads. The Sync trait indicates that a value can be safely shared between threads by reference.

Here's the catch, though: not all superheroes wear capes. Some are invisible. In Rust, most types are Send and Sync by default, so you don't always see these traits, but they're there, protecting your code from the shadows.

The Send Trait

The Send trait indicates that ownership of the type implementing this trait can be transferred between threads. Most types in Rust are Send, but there are exceptions, such as Rc<T>, Rust's reference-counted pointer type, which is not thread-safe.
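As a small illustration (my own sketch, not from the original post, and the value 42 is arbitrary), the following program compiles because Arc<i32> is Send, while the commented-out Rc line would be rejected by the compiler:

use std::rc::Rc;
use std::sync::Arc;
use std::thread;

fn main() {
    let shared = Arc::new(42); // Arc<i32> is Send, so moving it into a thread is fine
    let handle = thread::spawn(move || println!("value from thread: {}", shared));
    handle.join().unwrap();

    let not_send = Rc::new(42); // Rc<i32> is NOT Send
    // Uncommenting the next line fails to compile:
    // error: `Rc<i32>` cannot be sent between threads safely
    // thread::spawn(move || println!("value from thread: {}", not_send));
    println!("Rc is still fine on a single thread: {}", not_send);
}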

The Sync Trait

The Sync trait indicates that it is safe for the type implementing this trait to be referenced from multiple threads. In other words, any type T is Sync if &T (a reference to T) is Send, meaning it's safe to send a reference to another thread.

Here's an example of using Arc<T>, a thread-safe reference-counted pointer type, to share state between threads:

use std::sync::Arc;
use std::thread;

#[derive(Debug)]
struct SharedData {
    value: Arc<String>,
}

fn main() {
    // Create a shared value wrapped in an Arc
    let shared_value = Arc::new("Rust is amazing!".to_string());

    // Create an instance of SharedData containing the shared value
    let data = SharedData { value: shared_value };

    // Spawn a new thread and move the data into the closure
    let handle = thread::spawn(move || {
        // Print the data, which includes the shared value
        println!("{:?}", data);
    });

    // Wait for the thread to finish
    handle.join().unwrap();
}

// Output: SharedData { value: "Rust is amazing!" }

The Send trait is used implicitly here: Arc<String> is Send, so the SharedData instance can be moved into the spawned thread. Sync matters too, because Arc<T> is only Send and Sync when the data it wraps is, and that is what makes sharing the string across threads safe. In short, the code demonstrates safe sharing of immutable data between threads using Arc, with Send and Sync quietly guaranteeing that the sharing is memory-safe.
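Sync can also be demonstrated without Arc at all. In this hedged sketch (my own example, using std::thread::scope from the standard library, stable since Rust 1.63), several threads borrow the same String by shared reference, which the compiler accepts precisely because String is Sync and therefore &String is Send:

use std::thread;

fn main() {
    let message = String::from("Rust is amazing!");

    // std::thread::scope lets threads borrow data from the enclosing stack frame.
    thread::scope(|s| {
        for i in 0..3 {
            // Re-borrow so each closure captures a &String (a copy of the reference).
            // This works because String is Sync, which makes &String Send.
            let message = &message;
            s.spawn(move || println!("thread {i} sees: {message}"));
        }
    }); // every scoped thread is joined here, before `message` is dropped
}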


Atomic Operations: The Unsung Heroes

If threads were a rock band, atomic operations would be the drummer: in the background, keeping the beat and ensuring everything runs smoothly. An atomic operation is indivisible; it completes as a single step, so no other thread can ever observe it half-done. They're like the honey badger: they don't care what other operations are doing.

Rust provides atomic types in the std::sync::atomic module. These types include AtomicBool, AtomicIsize, AtomicUsize, and more, which correspond to the primitive data types but ensure atomic access.

Here's a brief rundown of some atomic operations:

  • store - This operation is like the mailman. It delivers a new value to an atomic variable.
use std::sync::atomic::{AtomicUsize, Ordering};

fn main() {
    // Initialize an AtomicUsize with an initial value of 0
    let atomic_value = AtomicUsize::new(0);

    // Store the value 42 into atomic_value with Relaxed ordering
    atomic_value.store(42, Ordering::Relaxed);

    // Print the stored value using Relaxed ordering
    println!("Stored value is {}", atomic_value.load(Ordering::Relaxed));
    // Output: Stored value is 42
}

The store method atomically replaces the value in the AtomicUsize variable with 42. Ordering::Relaxed guarantees that the write itself is atomic, so no thread can ever observe a torn or half-written value, but it makes no promises about how this write is ordered relative to other memory operations. When a store is meant to publish other data to another thread, a stronger ordering such as Release, paired with an Acquire load, is the usual choice.
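Here is a small sketch of that Release/Acquire pairing. It is my own illustrative example (the DATA and READY statics are not from the original post): one thread writes a payload and then raises a flag with Release, and the main thread waits for the flag with Acquire before reading the payload.

use std::sync::atomic::{AtomicBool, AtomicUsize, Ordering};
use std::thread;

// Statics so both threads can reach the same atomics without an Arc.
static DATA: AtomicUsize = AtomicUsize::new(0);
static READY: AtomicBool = AtomicBool::new(false);

fn main() {
    let producer = thread::spawn(|| {
        DATA.store(42, Ordering::Relaxed);    // write the payload
        READY.store(true, Ordering::Release); // publish it: Release pairs with Acquire
    });

    // Spin until the flag is set; the Acquire load synchronizes with the Release store,
    // so the earlier write to DATA is guaranteed to be visible afterwards.
    while !READY.load(Ordering::Acquire) {
        std::hint::spin_loop();
    }
    println!("Data is {}", DATA.load(Ordering::Relaxed));
    // Output: Data is 42

    producer.join().unwrap();
}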

  • load - This operation is the nosy neighbor. It reads the current value of an atomic variable.
use std::sync::atomic::{AtomicUsize, Ordering};

fn main() {
    // Initialize an AtomicUsize with an initial value of 42
    let atomic_value = AtomicUsize::new(42);

    // Load the value from atomic_value using Relaxed ordering
    let loaded_value = atomic_value.load(Ordering::Relaxed);

    // Print the loaded value
    println!("Loaded value is {}", loaded_value);
    // Output: Loaded value is 42
}

The load method atomically retrieves the value stored in an AtomicUsize variable, ensuring the read happens in one indivisible step and can never observe a half-written value.

  • swap - This operation is the switcheroo. It swaps the current value of an atomic variable with a new one.
use std::sync::atomic::{AtomicUsize, Ordering};

fn main() {
    // Initialize an AtomicUsize with an initial value of 42
    let atomic_value = AtomicUsize::new(42);

    // Atomically swap the value in atomic_value with 23 using Relaxed ordering, and retrieve the previous value
    let previous_value = atomic_value.swap(23, Ordering::Relaxed);

    // Print the swapped value and the previous value
    println!("Swapped value is {} (Previous value was {})", atomic_value.load(Ordering::Relaxed), previous_value);
    // Output: Swapped value is 23 (Previous value was 42)
}

The swap method with Ordering::Relaxed atomically replaces the stored 42 with 23 and hands back the previous value, so the program prints "Swapped value is 23 (Previous value was 42)".
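As a hedged usage sketch (my own example, not from the original post), swap is handy for a one-shot "claim the flag" pattern: every thread swaps true into an AtomicBool, and only the thread that gets false back knows it arrived first.

use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;

// A one-shot flag: only the first thread to swap in `true` claims the work.
static CLAIMED: AtomicBool = AtomicBool::new(false);

fn main() {
    let handles: Vec<_> = (0..4)
        .map(|i| {
            thread::spawn(move || {
                // swap returns the *previous* value, so `false` means we won the race.
                if !CLAIMED.swap(true, Ordering::AcqRel) {
                    println!("thread {i} claimed the flag");
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }
}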

  • fetch_add - This operation is the overachiever. It adds to the current value and returns the previous value.
use std::sync::atomic::{AtomicUsize, Ordering};

fn main() {
    // Initialize an AtomicUsize with an initial value of 42
    let atomic_value = AtomicUsize::new(42);

    // Atomically add 10 to the value in atomic_value using Relaxed ordering, and retrieve the previous value
    let previous_value = atomic_value.fetch_add(10, Ordering::Relaxed);

    // Print the previous value and the updated value
    println!("Previous value {}, new value {}", previous_value, atomic_value.load(Ordering::Relaxed));
    // Output: Previous value 42, new value 52
}

In this code snippet, an AtomicUsize named atomic_value is initialized to 42. The fetch_add method with Ordering::Relaxed atomically adds 10 and returns the value that was there before the addition, so previous_value is 42 while the counter now holds 52. Because fetch_add is a single read-modify-write operation, the addition happens atomically, preventing data races and ensuring the integrity of the shared data.
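To tie atomics back to the shared-state theme, here is a small sketch (assuming the same ten-thread setup as the earlier Arc<Mutex<...>> counter) that swaps the Mutex for an AtomicUsize, so the increments need no lock at all:

use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread;

fn main() {
    let counter = Arc::new(AtomicUsize::new(5));

    let handles: Vec<_> = (0..10)
        .map(|_| {
            let counter = Arc::clone(&counter);
            // fetch_add is an atomic read-modify-write, so no Mutex is required.
            thread::spawn(move || {
                counter.fetch_add(1, Ordering::Relaxed);
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("Result is {}", counter.load(Ordering::Relaxed));
    // Output: Result is 15
}

Relaxed ordering is enough here because the counter is only read after every thread has been joined.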


Conclusion: The Grand Finale

And there we have it! We've dived deep into the ocean of Shared State Concurrency, swam with Mutex and Arc, sailed with Sync and Send traits, and witnessed the magic of Atomic Operations.

Remember, concurrency is a tricky beast to tame. It's like playing a game of chess where all the pieces move at once. But with the right tools, and a good dose of humor, you can master it! So, keep diving, keep exploring, and keep coding!

Tomorrow, brace yourselves as we step into the grand finale: Futures and Async/Await in Rust. Get ready for the showstopper!

#RustLang #ConcurrencyMagic #SharedStateConcurrency #100DaysOfRust
