Today, in Seofy's analytics, I noticed a drop in conversion from visiting the site's main page to adding a site, so I started investigating. I tried adding a site myself and got stuck waiting for the server to respond and redirect me to the next page of the user funnel. I waited for about 30 seconds.
The culprit was Guzzle, which was sending a request to an external resource and waiting for a response, while the external resource was so overloaded that it couldn't respond in time.
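(Worth noting for context: Guzzle has no request timeout by default, which is why the page could hang for 30 seconds. Even before making the call asynchronous, a timeout on the client would have bounded the wait. A minimal sketch, with a placeholder URL:)

```php
use GuzzleHttp\Client;

// 'connect_timeout' caps connection setup, 'timeout' caps the whole request (seconds).
// Without these, Guzzle will wait on a slow upstream indefinitely.
$client = new Client([
    'connect_timeout' => 5,
    'timeout'         => 10,
]);

$response = $client->post('https://external-service.example/endpoint'); // placeholder endpoint
```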
As a result, I moved this Guzzle request and the Telegram alerts into a Laravel Model Observer, rewriting the code as follows:
```php
// escapeshellarg() quotes the message safely; raw interpolation breaks on quotes
// and special characters, and risks shell injection
exec('curl -s -X POST https://api.telegram.org/bot'.env('TELEGRAM_BOT_TOKEN').'/sendMessage -d chat_id='.env('TELEGRAM_GROUP_ID').' -d text='.escapeshellarg($text_to_send).' --connect-timeout 5 > /dev/null 2>&1 &');
exec('curl -X POST http://'.env('BOT_ADDRESS', 'localhost').':8051/domains/'.$domain->id.'/start --connect-timeout 5 > /dev/null 2>&1 &');
```
Now this logic runs as a detached background command, and the web request no longer waits for the result.
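For reference, here is roughly how that wiring might look. The `DomainObserver` class, the `created` hook, and the message text are my own illustrative guesses, since the post doesn't show the observer itself:

```php
<?php

namespace App\Observers;

use App\Models\Domain;

// Hypothetical observer: runs the fire-and-forget curl call when a Domain is created.
class DomainObserver
{
    public function created(Domain $domain): void
    {
        $text_to_send = 'New domain added: '.$domain->id; // illustrative message

        exec('curl -s -X POST https://api.telegram.org/bot'.env('TELEGRAM_BOT_TOKEN')
            .'/sendMessage -d chat_id='.env('TELEGRAM_GROUP_ID')
            .' -d text='.escapeshellarg($text_to_send).' --connect-timeout 5 > /dev/null 2>&1 &');
    }
}

// Registered once, e.g. in AppServiceProvider::boot():
// Domain::observe(\App\Observers\DomainObserver::class);
```

One caveat: `env()` returns `null` at runtime once config is cached (`php artisan config:cache`), so reading these values through `config()` is the safer pattern in Laravel.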
I'd be curious what you think about this and how you would have handled the situation.
Top comments (4)
Right, so the principle of "make things asynchronous if you can": move them out of the request/response pipeline. The usage of exec/curl is a bit unexpected, but then again, why not? Since it's PHP, this is probably the simplest way to make it async.
I would actually use the Laravel Queue, which would let me track whether the request completed and how long it took.
Absolutely, that's a much more complete and solid solution ... the usage of exec/curl is more of a quick & dirty hack, but maybe it's good enough for something non-critical.
In the end, I moved this code to Laravel Queues, and it's the first time I've used them in six years of working with Laravel :) Thanks!
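Following up on the queue suggestion above, a minimal sketch of what such a job could look like. The class name, the `services.telegram.*` config keys, and the message wiring are my assumptions, not from the post; it also presumes a configured queue connection and a running worker (`php artisan queue:work`):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Http;

// Hypothetical job: sends the Telegram alert outside the request/response cycle.
// Unlike the exec/curl approach, failed attempts are recorded (failed_jobs table)
// and can be retried, so completion is actually trackable.
class SendTelegramAlert implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(public string $text) {}

    public function handle(): void
    {
        // timeout() bounds the wait; the config keys here are assumed placeholders
        Http::timeout(5)->post(
            'https://api.telegram.org/bot'.config('services.telegram.bot_token').'/sendMessage',
            ['chat_id' => config('services.telegram.group_id'), 'text' => $this->text]
        );
    }
}
```

Dispatching from the observer is then a single line: `SendTelegramAlert::dispatch($text_to_send);`.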