This content originally appeared on DEV Community and was authored by Lucas Godoy
Hey mates!
Just a quick one I'd like to talk about: rate limiting goroutines.
This is about controlling how many tasks actually execute concurrently.
Sometimes we have to process a stream of long-running tasks without knowing, at runtime, how many of them will come out of the task channel. The main concern here is not to fire a goroutine for every task the moment it is ingested: launching an unbounded number of them concurrently can lead to unpredictable behavior or memory exhaustion.
Therefore, a `limiter` (AKA semaphore) is added into the mix: a buffered channel of empty structs, capped at `count`, the number of tasks allowed to run concurrently.
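In code, the limiter is nothing more than a buffered channel of `struct{}`. Here is a minimal, self-contained sketch of just that piece; the cap of 3, the fixed 10 iterations and the sleep are assumptions for illustration, not taken from the original snippet:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// The buffered channel's capacity (3 here) is the concurrency cap.
	limiter := make(chan struct{}, 3)

	for i := 0; i < 10; i++ {
		limiter <- struct{}{} // blocks while 3 tasks are already running
		go func(i int) {
			defer func() { <-limiter }() // free a slot when done
			fmt.Println("running task", i)
			time.Sleep(time.Second) // simulate a long-running task
		}(i)
	}
	// Note: a real program would wait for the in-flight goroutines
	// (e.g. with a sync.WaitGroup); omitted to keep the sketch short.
}
```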
With an incoming task stream in place, the flow looks like this:
- For the first `count` iterations, an empty struct is pushed onto the `limiter` channel and a goroutine is fired up to run the incoming task.
- At iteration `count + 1`, the `limiter` channel is full, so the `main` goroutine blocks.
- Once any of the currently running tasks finishes its execution, it reads one value out of the `limiter` channel to make room for another task to run. This unblocks the `main` goroutine.
- After the `main` goroutine takes control back, it pushes an empty struct onto the `limiter` channel and starts the cycle over by running a new goroutine for the incoming task.

And so on, until the timeout is reached, the for loop breaks, and no more tasks are started.
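Putting the whole flow together: the sketch below feeds tasks from a channel, rate-limits them with the `limiter`, and stops ingesting once a timeout fires. The task producer, `runTask`, the cap of 3 and the 2-second timeout are all assumed for illustration, since the original snippet isn't reproduced here:

```go
package main

import (
	"fmt"
	"time"
)

// count caps how many tasks may run at the same time (assumed value).
const count = 3

// runTask is a hypothetical long-running task used for illustration.
func runTask(id int) {
	fmt.Printf("task %d started\n", id)
	time.Sleep(500 * time.Millisecond)
	fmt.Printf("task %d finished\n", id)
}

func main() {
	// tasks simulates an unbounded stream of incoming work.
	tasks := make(chan int)
	go func() {
		for i := 0; ; i++ {
			tasks <- i
		}
	}()

	// limiter is the semaphore: a buffered channel of empty structs.
	limiter := make(chan struct{}, count)

	timeout := time.After(2 * time.Second)

	for {
		select {
		case <-timeout:
			// Stop ingesting: the loop breaks and no more tasks start.
			fmt.Println("timeout reached")
			return
		case id := <-tasks:
			// Blocks once `count` tasks are already running.
			limiter <- struct{}{}
			go func(id int) {
				defer func() { <-limiter }() // make room for the next task
				runTask(id)
			}(id)
		}
	}
}
```

`struct{}` is used because it carries no data and takes up no memory; the channel's capacity, not its contents, is what enforces the limit. This sketch also doesn't wait for in-flight tasks after the timeout; filling the limiter back up to capacity, or a `sync.WaitGroup`, would cover that.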
To sum up, this is how we can keep goroutines from all coming up at once: by controlling how many of them are up and running concurrently with a `limiter` buffered channel, we avoid memory overflows and unpredictable behavior.