Imagine you are fetching data from a third-party API inside your Laravel backend. In modern web applications, integrating with third-party APIs is commonplace. These external services often provide critical functionality, from fetching user data to processing transactions. However, these APIs can fail for various reasons, such as network issues, service outages, or rate limits. Such failures can disrupt your application's operations and degrade the user experience.
To mitigate these risks, it’s essential to implement robust retry mechanisms that can handle these transient failures effectively. This is where Laravel Jobs come into play. Laravel’s job system allows you to perform background tasks asynchronously, including handling API requests. By using jobs, you can ensure that these tasks are retried automatically when they fail, without blocking the main application flow.
Jobs in Laravel provide a structured way to manage and retry failed tasks, making your application more resilient and reliable. They help offload time-consuming processes from the user’s immediate experience, improving performance and scalability. In this article, we’ll explore how to set up retry logic in Laravel Jobs to handle third-party API failures gracefully, ensuring that your application remains robust and responsive even in the face of external service issues.
Now let's see how to do this.
1. Creating a Job with Retry Logic
First, create the job with this artisan command:
php artisan make:job FetchApiDataJob
These two public properties define the retry logic:
public $tries = 5; // Number of retry attempts
public $backoff = 10; // Seconds to wait before retrying
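Instead of a fixed delay, Laravel also allows `$backoff` to be an array, so each retry waits longer than the last. This is a common pattern when calling rate-limited APIs. A minimal sketch:

```php
// Inside the job class: wait 10s after the 1st failure,
// 30s after the 2nd, and 60s after each subsequent one.
public $tries = 5;
public $backoff = [10, 30, 60];
```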
This is the full code for the job:
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Http;
use App\Models\YourModel; // Replace with your actual model
use Exception;
use Throwable;

class FetchApiDataJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $tries = 5;    // Number of retry attempts
    public $backoff = 10; // Seconds to wait before retrying

    public function handle()
    {
        $response = Http::get('https://jsonplaceholder.typicode.com/posts');

        if (!$response->successful()) {
            // Letting the exception escape handle() is what triggers a
            // retry: the queue worker re-attempts the job according to
            // $tries and $backoff. (Calling $this->fail($e) instead would
            // mark the job as permanently failed with no retries.)
            throw new Exception('API request failed with status: ' . $response->status());
        }

        $data = $response->json();

        foreach ($data as $item) {
            YourModel::updateOrCreate(
                ['id' => $item['id']], // Adjust based on your model and data
                [
                    'title' => $item['title'],
                    'body' => $item['body'],
                    // Additional columns
                ]
            );
        }
    }

    public function failed(Throwable $exception)
    {
        // Called once all retry attempts have been exhausted.
        \Log::error('FetchApiDataJob failed: ' . $exception->getMessage());
    }
}
If the API call fails or returns an invalid response, the condition below triggers and an exception is thrown. Because the exception propagates out of handle(), the queue worker marks the attempt as failed and retries the job according to the $tries and $backoff settings shown earlier, up to five attempts, ten seconds apart.
if (!$response->successful()) {
    throw new Exception('API request failed with status: ' . $response->status());
}
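As a side note, Laravel's HTTP client can also retry individual requests in-process, independently of the job-level retries. A minimal sketch, using the same placeholder endpoint as above:

```php
// Retry the HTTP request itself up to 3 times, waiting 100 ms between
// attempts. If every attempt fails, a RequestException is thrown, which
// then feeds into the job's own retry mechanism.
$response = Http::retry(3, 100)->get('https://jsonplaceholder.typicode.com/posts');
```

Combining both gives you fast in-process retries for brief network blips, with the slower job-level backoff as a fallback for longer outages.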
2. Dispatch the Job from a Controller
To dispatch the job from a controller, you would typically do it in response to a user action, such as a form submission or a button click.
Here’s an example controller method that dispatches the job:
<?php

namespace App\Http\Controllers;

use App\Jobs\FetchApiDataJob;
use Illuminate\Http\Request;

class ApiController extends Controller
{
    public function fetchData()
    {
        // Dispatch the job to the queue
        FetchApiDataJob::dispatch();

        // Return a response to the user immediately
        return response()->json(['message' => 'Job dispatched to fetch data.']);
    }
}
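To reach this controller method you also need a route. A minimal sketch, assuming a /fetch-data path (the path name here is illustrative, not from the original article):

```php
// routes/web.php (or routes/api.php)
use App\Http\Controllers\ApiController;

Route::get('/fetch-data', [ApiController::class, 'fetchData']);
```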
3. Running Queue Workers
Make sure to run this command so queued jobs are actually processed. On a production server, you can use Supervisor to keep the worker running:
php artisan queue:work
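A minimal Supervisor program definition for the worker might look like the sketch below; the paths, program name, and process count are assumptions you should adapt to your server:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf (illustrative path)
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/your-app/artisan queue:work --tries=5 --backoff=10
autostart=true
autorestart=true
numprocs=2
stdout_logfile=/var/www/your-app/storage/logs/worker.log
```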
4. Notes
Imagine this API call takes a long time and slows down the controller. Moving the call into a job solves that problem too, since the HTTP request returns immediately while the work happens in the background. To increase performance further, you can use Redis queues instead of database queues.
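Switching to Redis queues is mostly configuration. Assuming Redis is installed and a PHP Redis client (phpredis or predis) is available, the change is a single .env value:

```ini
# .env — push jobs onto Redis instead of the database
QUEUE_CONNECTION=redis
```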