TL;DR: use a delayed dispatch plus the ShouldBeUnique contract on your job, and let the framework deduplicate it for you!
Picture this scenario: you receive webhooks from a third-party service that you need to handle in your application. Because multiple changes can happen at the third party in quick succession, you may receive several webhooks about the same resource within a short timeframe (2 minutes, for example), but you only want to process the resource a single time.
You could add a processed_at timestamp, track when you last processed the resource, and use that to bypass additional processing.
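A minimal sketch of that timestamp approach might look like the following. The Resource model, its external_id column, and the $payload variable are placeholders here, and processed_at is assumed to be cast to a datetime.

<?php

// Hypothetical controller-side check; assumes the resource already
// exists locally and processed_at is cast to a datetime.
$resource = Resource::firstWhere('external_id', $payload['id']);

if ($resource->processed_at?->gt(now()->subMinutes(2))) {
    return; // already handled recently, skip this webhook
}

// ... process the webhook ...

$resource->update(['processed_at' => now()]);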
Or you could combine several tools the Laravel framework already provides:
- Determine the expected timeframe for multiple updates
  - e.g., if somebody is manually updating multiple fields and you get a webhook for each change, estimate how long a user might be working before you want to process the changes
- Add a delay when you dispatch the job to cover that expected timeframe, plus a little bit extra:
SomeJob::dispatch($data)->delay(now()->addMinutes(3))
- Add the ShouldBeUnique interface to your job
- Add the uniqueId() method to your job, returning a unique ID or some other key that will be used to find a match
- Voila!
Now when you receive an incoming webhook, your app will delay the processing for 3 minutes, and if there is already a job on the queue for that unique ID, the framework will not dispatch a second job.
One additional thing to consider: within that job, you may wish to perform an API call to retrieve the current state of the resource, since the dispatched job will contain the state as of the first webhook, not the most recent.
<?php

// somewhere in your app
SomeJob::dispatch($data)->delay(
    now()->addMinutes(3)
);
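To make the deduplication concrete, here is a sketch of two webhooks arriving for the same resource within the window; $firstPayload and $secondPayload are placeholder payloads carrying the same resource ID.

<?php

// First webhook: the job is queued with a 3-minute delay and a
// unique lock keyed on uniqueId() is acquired.
SomeJob::dispatch($firstPayload)->delay(now()->addMinutes(3));

// Second webhook for the same resource a minute later: the lock is
// still held, so this dispatch is skipped entirely.
SomeJob::dispatch($secondPayload)->delay(now()->addMinutes(3));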
<?php

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class SomeJob implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, Queueable;

    // The webhook payload; the unique ID below is derived from it.
    public function __construct(public $data)
    {
    }

    /**
     * Get the unique ID for the job.
     */
    public function uniqueId(): string
    {
        return $this->data->id;
    }

    // ...
}
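And as a sketch of the earlier point about fetching the current state, the job's handle() method could re-query the third party rather than trust the dispatched payload. ThirdPartyClient and getResource() are hypothetical stand-ins for whatever API client you actually use.

<?php

// Inside SomeJob; Laravel resolves the type-hinted client from the
// container when the job runs.
public function handle(ThirdPartyClient $client): void
{
    // Re-fetch the resource so we act on its current state, not the
    // snapshot captured when the first webhook arrived.
    $current = $client->getResource($this->data->id);

    // ... apply your processing to $current ...
}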