A guide to setting up Laravel Queue processing on shared servers
There are already quite a few guides on how to set up Laravel queues on shared hosting, so you have probably seen, or already have, a way of working with queues on shared hosting. If you have, this guide follows that same train of thought, but with a more personal take on it.
For those who squirmed at the sight of the words “Laravel” and “Shared Hosting” being used in the same sentence, this probably isn’t for you… until it is.
TL;WR
This guide is somewhat of a tale. If you're too lazy to follow the train of thought that led to this piece, you can skip everything and go straight to the solution at the bottom of the page.
Train of Thought
So I found myself hosting a Laravel app on shared hosting, and as with a lot of things I've noticed while hosting Laravel apps on shared servers, setting up queue processing effectively can be a pain in the butt. My app was all done and ready for production, but I had one issue: processing my jobs on the queue. I couldn't possibly use the sync driver, or use the send method instead of the queue method on my mailables, because then my jobs would be processed synchronously, and I don't have to tell you why that's bad for business.
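To make that trade-off concrete, here is a minimal sketch of the difference; the WelcomeEmail mailable and the $user variable are hypothetical stand-ins, not code from my app:

```php
<?php

use App\Mail\WelcomeEmail;           // hypothetical mailable
use Illuminate\Support\Facades\Mail;

// Synchronous: the HTTP request blocks until the mail is actually delivered.
Mail::to($user)->send(new WelcomeEmail($user));

// Queued: the request only pushes a job; a queue worker delivers it later.
Mail::to($user)->queue(new WelcomeEmail($user));
```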
I had one other option: set up queue processing. I quickly googled how to set up Laravel queues on shared hosting, because all the options I had tried so far had not been successful. There were a few hits, and this particular guide (Using laravel queues on shared hosting) looked promising, so I decided to implement it. But there was a problem: the guide instructs you to add the Laravel scheduler as a cron entry that runs every minute. That was a dead end for me; my shared hosting provider restricted cron jobs to run at most once every 15 minutes. Ouch! If you run the Laravel scheduler at any interval other than once per minute, you end up with the "No scheduled commands are ready to run" message.
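For context, that approach boils down to pointing a once-per-minute cron entry at Laravel's scheduler and letting the scheduler spawn the worker. The sketch below is my rough reading of it, not the referenced guide verbatim:

```php
<?php
// app/Console/Kernel.php -- rough sketch of the scheduler-based approach.
// It relies on a crontab entry that fires every minute, e.g.:
//   * * * * * /path/to/php /path/to/artisan schedule:run >> /dev/null 2>&1

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Spawn a queue worker every minute. Each worker is a long-lived
        // process, which is exactly how they pile up in memory over time.
        $schedule->command('queue:work --tries=3')->everyMinute();
    }
}
```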
Another issue with that solution is that it keeps adding new queue workers to memory. Queue workers are long-lived processes, so over time they start piling up and clogging your memory. I noticed this when I decided to apply the above solution with a slight modification: I just added a cron entry that ran every 15 minutes 😏 and processed my queue:
*/15 * * * * /path/to/php /path/to/artisan queue:work --tries=3
Well, that was disappointing, just when I thought I'd had a breakthrough. I wasn't going to quit though; anything is better than synchronous queue processing. My next endeavour was to get Supervisor. Setting up Supervisor on shared hosting wouldn't be easy: I needed root access that I did not have. I looked around and found a great guide to Setup Supervisor on shared hosting. This process essentially lets you install Supervisor to a directory where you have full access, your neck of the woods if you will. You can install Supervisor in your project directory and run it from there.
Finally, I was home free and could just configure Supervisor and let it handle restarting my queue workers. Oh, wait: my shared server didn't have Python's easy_install, and running the python setup.py install command didn't work either because of the same permission issue I thought I had escaped. I take nothing away from that guide though; it's a great way to get Supervisor installed on your shared server, and it also gave me insight into installing packages to directories where one has full access (another adventure). Give it a try, it just might work for you.
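In case your server does let you get Supervisor running, the program block you would point at your queue workers looks roughly like this; paths, process count and log location are placeholders, along the lines of the example in the Laravel documentation:

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=/path/to/php /path/to/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=2
redirect_stderr=true
stdout_logfile=/path/to/worker.log
```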
OK, so I was running out of options before I stumbled upon the --once option provided by Laravel's queue:work command (see the documentation). I quickly jumped on it and, yeah, it worked. I just added the following to my crontab and my jobs were being processed without leaving workers behind to clog my precious memory:
*/15 * * * * /path/to/php /path/to/artisan queue:work --tries=3 --once
Now my jobs were being processed and all was fine and dandy, except the --once option only processes one job per run. Bummer! Normally, you'd want the queue worker to process all available jobs on the queue so you don't have jobs lying around for long periods of time. Let's say by some miracle you get 1,000 jobs on your queue. Yeah, I know: if you're getting a thousand jobs on your queue over a period of 15 minutes, what are you doing hosting on shared servers? Humour me for a second and say you get a thousand jobs on the queue. The --once option would only process one job every 15 minutes; that's 4 jobs per hour, so those 1,000 jobs would take about 10 days to clear. Imagine registering for a service and getting a welcome email 10 days later.
This led me to look for a solution that would run all my jobs every time the queue worker runs, wouldn't leave worker processes running to pile up in memory, and would let me have a good night's sleep. That last part is pretty important to me. Not leaving the worker process running meant I had to find a way not to run the queue worker in daemon mode. As far as I could tell, Laravel didn't provide that option: either you run it with the --once option or it runs in daemon mode, like a binary dictator.
Thanks to the extensibility of Laravel, I quickly wrote a console command, which I've now packaged and made available via Composer (queueworker/sansdaemon), that extends the original queue:work command. This means the original Laravel command still works, but with the extra options I so craved. With this command, I could just add the queue worker to cron and let it run. It runs all my jobs and then exits the worker process, so nothing is left running in daemon mode. The command also provides an optional --jobs option that specifies the number of jobs to process each time the queue worker runs.
*/15 * * * * /path/to/php /path/to/artisan queue:work --sansdaemon --tries=3
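If you want to cap how much work each run does, the --jobs option mentioned above can be added to the same entry; the value of 50 below is just an arbitrary example:

*/15 * * * * /path/to/php /path/to/artisan queue:work --sansdaemon --jobs=50 --tries=3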
The package is available on GitHub, and instructions on how to use it are also provided there. Leave a star if you can. 😉
PS: Try as much as possible to avoid hosting Laravel apps on shared servers; trust me, you'll sleep better at night.
TL;DR
The following are working solutions for processing Laravel queues on shared servers.
1. Run the Laravel scheduler every minute and have it process your queue. (Using laravel queues on shared hosting)
2. Set up Supervisor on shared hosting and configure it to manage Laravel queue workers.
3. Install the queueworker/sansdaemon package, which extends the Laravel queue worker and provides a way to run your queue without putting the workers in daemon mode. Just add the following cron job to your crontab after installing the package.
[cron schedule] /path/to/php /path/to/artisan queue:work --sansdaemon --tries=3