Concurrency in Go is primarily built around "goroutines" and channels, which make handling complex systems both easy (well, easier) and more efficient. What follows is an overview of key concurrency patterns in Go, with some simple examples to illustrate the concepts.
Contents
- Introduction
- What are Goroutines?
- What Makes Goroutines Special?
- Communication Patterns using Channels
- Conclusion
What are Goroutines?
Goroutines are lightweight threads managed by the Go runtime. They are used to run functions concurrently. You start a goroutine by prefixing a function call with the go keyword, which allows the function to run independently in a new goroutine.
package main

import (
    "fmt"
    "time"
)

func myAsyncFunction() {
    for a := 0; a < 10; a++ {
        // do something interesting
        fmt.Println(a)
    }
}

func main() {
    go myAsyncFunction()    // <-- spawns the goroutine
    time.Sleep(time.Second) // Wait for the goroutine to finish
}
In this example, we create a new goroutine by calling go myAsyncFunction(). The myAsyncFunction function will run concurrently with the main goroutine.
We're using a simple time.Sleep to wait for the goroutine to finish before the program exits; otherwise it wouldn't output anything, because the program would finish before the myAsyncFunction goroutine had a chance to execute. In a real-world application, you would use a more sophisticated method to ensure that the goroutine has finished before the main goroutine exits, such as a sync.WaitGroup or a channel, as we'll see later.
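For example, a minimal sketch of the same program using a sync.WaitGroup instead of time.Sleep might look like this (this variation is mine, not part of the original example):

package main

import (
    "fmt"
    "sync"
)

func myAsyncFunction(wg *sync.WaitGroup) {
    defer wg.Done() // signal completion when this function returns
    for a := 0; a < 10; a++ {
        fmt.Println(a)
    }
}

func main() {
    var wg sync.WaitGroup
    wg.Add(1)               // one goroutine to wait for
    go myAsyncFunction(&wg) // spawn the goroutine
    wg.Wait()               // block until Done has been called
}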
What Makes Goroutines Special?
Goroutines are more cost-effective than traditional threads, allowing you to spawn thousands or even millions of concurrent operations.
Low Footprint
Traditional operating system (OS) threads can require a significant amount of memory per thread, often in the range of megabytes. Goroutines, on the other hand, start with a much smaller stack, normally in the region of a few kilobytes. This smaller size means that the Go runtime can create many more goroutines in the same amount of memory compared to traditional threads.
Dynamic Stack Allocation
This is a really cool feature of goroutines. Basically, the Go runtime will automatically grow and shrink the stack size of goroutines as needed. This means that goroutines use only as much memory as required for their execution, without preallocating large amounts of memory. You can imagine that this really makes a difference when you're running thousands of concurrent operations.
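To get a feel for just how cheap goroutines are, here's a small experiment of my own (not from the original text) that spawns 100,000 goroutines, far more than you would ever want to create as OS threads, and waits for them all to finish:

package main

import (
    "fmt"
    "sync"
)

func main() {
    const n = 100000 // try doing this with OS threads!
    var wg sync.WaitGroup
    wg.Add(n)
    for i := 0; i < n; i++ {
        go func() {
            defer wg.Done()
            // each goroutine starts with only a few kilobytes of stack
        }()
    }
    wg.Wait()
    fmt.Println("all", n, "goroutines finished")
}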
Communication Patterns using Channels
So you have all your processes neatly packaged away in their own goroutines, but how do you get them to talk to each other? Channels are the pipes that connect concurrent goroutines, allowing them to communicate with each other and bring order to the chaos of concurrent programming.
Next, we'll look at some different patterns for using channels to synchronise and communicate between goroutines.
One-to-One (Producer-Consumer Pattern)
Imagine a channel as a queue of data. Goroutines can send values into one end of the queue, and other goroutines can receive them from the other end, in the order they were sent. Let's see what that looks like in code:
package main

import "fmt"

func fortyTwoProducer(ch chan int) {
    ch <- 42 // Send a value into the channel
}

func main() {
    ch := make(chan int)    // Create a new channel
    go fortyTwoProducer(ch) // Start the producer goroutine
    value := <-ch           // Receive a value from the channel
    fmt.Println(value)
}
In this example, we create a new channel with make(chan int). We then start a goroutine that sends the value 42 into the channel. The main goroutine then receives the value from the channel and prints it.
This is pretty cool, but what if we want to do asynchronous work in the goroutine? We need to set up a way to "listen" to that channel. Let's add a receiver that reads values from the channel, and change our fortyTwoProducer into a producer that sends the values from 0 to 9.
package main

import "fmt"

func producer(ch chan int) {
    for a := 0; a < 10; a++ {
        ch <- a
    }
    close(ch)
}

func receiver(ch chan int) {
    for v := range ch {
        fmt.Println(v)
    }
}

func main() {
    ch := make(chan int) // Create a new channel
    go producer(ch)      // Start the producer goroutine
    receiver(ch)
}
If I've done that correctly, this will give the following output:
0
1
2
3
4
5
6
7
8
9
What's going on here is quite interesting. This is, by default, an unbuffered channel, which means there is no buffer to hold values inside the channel. When the producer sends a value, the send blocks until a receiver is ready to take it, and the producer cannot send another value until the previous one has been received.
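As a quick illustration of that blocking behaviour (my own sketch, not from the example above), sending on an unbuffered channel when nothing will ever receive from it blocks forever, and the Go runtime detects this and aborts with a deadlock error:

package main

func main() {
    ch := make(chan int)
    ch <- 1 // blocks forever: no other goroutine will ever receive from ch
    // never reached; the runtime aborts with
    // "fatal error: all goroutines are asleep - deadlock!"
}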
We can also create a buffered channel, which allows the producer to send multiple values into the channel without blocking. The producer can only get ahead of the receiver by up to the buffer size, though, so a fast producer still can't overwhelm a slow receiver indefinitely, which makes buffered channels handy for smoothing out bursts and applying a simple form of rate limiting. Let's see what that looks like:
package main

import "fmt"

func producer(ch chan int) {
    for a := 0; a < 10; a++ {
        fmt.Println("Sending: ", a)
        ch <- a
    }
    close(ch)
}

func receiver(ch chan int) {
    for v := range ch {
        fmt.Println("Receiving: ", v)
    }
}

func main() {
    ch := make(chan int, 5) // Create a new buffered channel
    go producer(ch)         // Start the producer goroutine
    receiver(ch)
}
Which gives us this output:
Sending: 0
Sending: 1
Sending: 2
Sending: 3
Sending: 4
Sending: 5
Sending: 6
Receiving: 0
Receiving: 1
Receiving: 2
Receiving: 3
Receiving: 4
Receiving: 5
Receiving: 6
Sending: 7
Sending: 8
Sending: 9
Receiving: 7
Receiving: 8
Receiving: 9
I'm sure you're wondering why the output isn't in the order you might expect: sends 0 to 4 filling the buffer, then a receive before each further send. What's happened is that the receiver had already taken values off the channel before its "Receiving" lines were printed, freeing up buffer slots and allowing the producer to send 5 and 6. The exact interleaving is decided by the Go scheduler, and this is a great example of how concurrency can lead to non-deterministic behaviour.
Channels aren't just for sending and receiving values between a pair of goroutines. They can be used to connect multiple goroutines in various configurations, not limited to one-to-one communication. Here are some common patterns:
One-to-Many (Fan-out)
A single goroutine sends messages on a channel that multiple goroutines are reading from. Each message sent is received by only one of the listening goroutines, effectively distributing the messages among multiple workers. This pattern is useful for parallelising work.
To demonstrate this, we'll run four receiver goroutines, all listening to the same channel.
package main

import (
    "fmt"
    "time"
)

func producer(ch chan int) {
    for a := 0; a < 10; a++ {
        fmt.Println("Sending: ", a)
        ch <- a
    }
    close(ch)
}

func receiverOne(ch chan int) {
    for v := range ch {
        fmt.Println("One receiving: ", v)
    }
}

func receiverTwo(ch chan int) {
    for v := range ch {
        fmt.Println("Two receiving: ", v)
    }
}

func receiverThree(ch chan int) {
    for v := range ch {
        fmt.Println("Three receiving: ", v)
    }
}

func receiverFour(ch chan int) {
    for v := range ch {
        fmt.Println("Four receiving: ", v)
    }
}

func main() {
    ch := make(chan int)
    go producer(ch)
    go receiverOne(ch)
    go receiverTwo(ch)
    go receiverThree(ch)
    go receiverFour(ch)
    time.Sleep(time.Second)
}
Note how we're running the receivers as separate goroutines this time. If we didn't, receiverOne would block until it had received all the values.
This will give something like the following output (the exact interleaving will vary from run to run):
Sending: 0
Sending: 1
Sending: 2
Sending: 3
Sending: 4
One receiving: 0
Four receiving: 2
Four receiving: 4
Two receiving: 1
Sending: 5
Sending: 6
Sending: 7
Three receiving: 3
Three receiving: 7
Four receiving: 6
One receiving: 5
Sending: 8
Sending: 9
Two receiving: 8
Three receiving: 9
Many-to-One (Fan-in)
Multiple goroutines send messages to a single channel, which is read by a single goroutine. This pattern is often used to aggregate results from multiple concurrent operations.
package main

import (
    "fmt"
    "time"
)

func producerOne(ch chan int) {
    ch <- 1
}

func producerTwo(ch chan int) {
    ch <- 2
}

func producerThree(ch chan int) {
    ch <- 3
}

func producerFour(ch chan int) {
    ch <- 4
}

func receiver(ch chan int) {
    for v := range ch {
        fmt.Println("Receiving: ", v)
    }
}

func main() {
    ch := make(chan int)
    go producerOne(ch)
    go producerTwo(ch)
    go producerThree(ch)
    go producerFour(ch)
    go receiver(ch)
    time.Sleep(time.Second)
    close(ch)
}
Giving the following output:
Receiving: 1
Receiving: 3
Receiving: 2
Receiving: 4
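One detail worth calling out: with several producers, no single producer knows when it's safe to close the channel, and closing it from main after a time.Sleep is a bit of a guess. A common approach (sketched here by me, not taken from the example above) is to track the producers with a sync.WaitGroup and close the channel once they have all finished:

package main

import (
    "fmt"
    "sync"
)

func produce(ch chan<- int, value int, wg *sync.WaitGroup) {
    defer wg.Done()
    ch <- value
}

func main() {
    ch := make(chan int)
    var wg sync.WaitGroup

    for i := 1; i <= 4; i++ {
        wg.Add(1)
        go produce(ch, i, &wg)
    }

    // Close the channel once every producer has finished sending.
    go func() {
        wg.Wait()
        close(ch)
    }()

    // The range loop exits cleanly when the channel is closed.
    for v := range ch {
        fmt.Println("Receiving: ", v)
    }
}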
Pipeline
Channels can be used to create a pipeline of goroutines, where each goroutine in the pipeline processes data and passes it to the next stage via a channel.
package main

import (
    "fmt"
    "time"
)

func producer(ch chan int) {
    for a := 0; a < 10; a++ {
        fmt.Println("Sending: ", a)
        ch <- a
    }
    close(ch)
}

func receiver(ch chan int) {
    for v := range ch {
        fmt.Println("Receiving: ", v)
    }
}

func doubler(in <-chan int, out chan<- int) {
    for v := range in {
        out <- v * 2
    }
    close(out)
}

func main() {
    ch := make(chan int)
    go producer(ch)

    ch2 := make(chan int)
    go doubler(ch, ch2)

    go receiver(ch2)
    time.Sleep(time.Second)
}
This is similar to the one-to-one example, but here we've created a second channel, ch2, to connect the middle stage to the receiver. We've also added a doubler function that reads from ch and writes to ch2. Note its parameter types: in <-chan int is a receive-only channel and out chan<- int is a send-only channel, which makes the direction of data flow explicit. So now we've got a pipeline where the producer sends values to the doubler, which then sends the doubled values to the receiver, outputting:
Sending: 0
Sending: 1
Sending: 2
Receiving: 0
Receiving: 2
Receiving: 4
Sending: 3
Sending: 4
Sending: 5
Receiving: 6
Receiving: 8
Receiving: 10
Sending: 6
Sending: 7
Sending: 8
Receiving: 12
Receiving: 14
Receiving: 16
Sending: 9
Receiving: 18
Wow, I'm blown away by how easy it is to create an asynchronous pipeline in Go. Although this example looks simple, it's hiding a huge amount of complexity. Go is ensuring that we're able to handle asynchronous input safely and efficiently. Go Go!
Conclusion
This was just a brief introduction to how goroutines can communicate with each other via channels. There are many more patterns and techniques that can be used to build concurrent programs in Go, and concurrency is a world of ~~pain~~ fun and learning.
I hope you've enjoyed reading this. If you have any questions or comments, please feel free to reach out to me on Twitter/X. I'd love to hear from you.