Using Eloquent models in dispatched jobs and closures

At Laracon EU 2024, Laravel core team member Tim MacDonald gave a talk called "Thoughtful Performance." You can watch the entire talk on YouTube: https://www.youtube.com/watch?v=NcucthcnGY0. It's an incredible talk overall, and I encourage everyone to watch it. While watching, something caught my eye, and I thought it might be fun to write about.

One of the points Tim talked about is "doing things later." He discussed how you can use Laravel's dispatch()->afterResponse() to offload some processing until after the HTTP response is sent to the user, so that the user doesn't wait for something that doesn't concern them. Read more about queueing closures in the Laravel documentation.

I love this feature and use it a lot. However, there's a little caveat in the example Tim presented. Before I continue: I know Tim's snippet was just an illustration of what you can do with dispatch()->afterResponse(), but I still wanted to explore what happens in this exact case and maybe teach someone who doesn't know about it. Consider the code snippet from the presentation:

$post->delete();

dispatch(function () use ($post) {
    Activity::log(new PostDeleted($post));
})->afterResponse();

return Redirect::route('dashboard');

In this example, the user doesn't really care about the activity-logging logic. They should receive the response immediately, and we should do our miscellaneous developer work afterwards. Let's assume Activity::log() simply stores a new record in the database. If you try this in your browser, you'll actually see that... the activity hasn't been stored in the database. So what exactly happened there?

Let's simplify this by just writing the Eloquent model to the log file. Consider the following code:

public function destroy(Post $post): RedirectResponse
{
    $post->delete();

    dispatch(function () use ($post) {
        info($post);
    })->afterResponse();
    
    return redirect('/');
}

If you open your log file, you'll see that it's empty. Now remove the line where we delete the post and try again.

public function destroy(Post $post): RedirectResponse
{
-   $post->delete();

    dispatch(function () use ($post) {
        info($post);
    })->afterResponse();
    
    return redirect('/');
}

Yup, there's the model in our log file.

[2024-02-27 09:13:19] local.INFO: {"id":1,"title":"Hello World","created_at":"2024-02-27T09:12:28.000000Z","updated_at":"2024-02-27T09:12:28.000000Z"}  

Even if you move the $post->delete() line so it executes after the dispatch()->afterResponse() call, you'll get the same result: the log file stays empty. We can clearly see it's related to the fact that we're deleting the post from the database, so let's inspect what's really happening. This can be somewhat confusing at first, since we use the model inside a closure that was created within that same HTTP request.
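
For reference, here's the reordered version that still leaves the log file empty:

public function destroy(Post $post): RedirectResponse
{
    dispatch(function () use ($post) {
        info($post); // still never logged
    })->afterResponse();

    // The closure only runs once the response has been sent,
    // and by then this delete has already happened.
    $post->delete();

    return redirect('/');
}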

Queues in Laravel

Before I dive into specifics, I think it's important to know that Laravel has an incredible queue system. I often talk about it; it's my favorite thing in Laravel, and it's enough to have me use Laravel until other web frameworks come up with a solution that's even remotely close to Laravel's.

Now, let's see what is happening when we push jobs onto the queue.

Serializing jobs

Consider the following code you'll see all the time:

Bus::dispatch(new ProcessPodcast($podcast));

When you create a job class and dispatch it to the queue, Laravel needs to do a few things to make sure the queue worker can properly process it. Roughly, the following happens:

  1. Laravel stores the details of the job in the service our app uses for queues (very commonly Redis)
  2. a queue worker process continuously loops and picks jobs out of that service (Redis) when it's time for them to execute
  3. Laravel then executes the handle method of the job

Of course, this is just a high-level overview and an oversimplification of the entire process; a rough sketch of that loop is shown below.
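
This is not the framework's actual code (the real worker, started with php artisan queue:work, also handles timeouts, retries, failures, events, and much more), and $redis stands in for whatever Redis client your app uses, but conceptually the worker loop looks something like this:

while (true) {
    // 1. pull the next raw job payload from the queue (Laravel's Redis queue
    //    keeps pending jobs in a list such as "queues:default")
    $payload = $redis->lpop('queues:default');

    if ($payload === null) {
        sleep(3); // nothing to process right now
        continue;
    }

    // 2. rebuild the job object from the stored payload (more on this below)
    $job = unserialize(json_decode($payload, true)['data']['command']);

    // 3. run it
    $job->handle();
}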

However, Redis is a completely different service; it's not PHP, and it's not our application. So how does Laravel "store" the job details in Redis? We can create PHP objects with any arguments we want (models, strings, booleans, whatnot). How will Laravel write a PHP object to Redis? Well, when storing the job details in our queue service, Laravel performs a process called serialization. Serialization essentially converts the job object into a string, and that string is then stored in Redis. When it's time for the job to run, Laravel takes the string out of Redis and deserializes it, converting it back into a PHP object. Then, Laravel can execute the handle method of that object.
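
At its core, this is just PHP's native serialization. A minimal sketch, ignoring everything Laravel wraps around it:

$job = new ProcessPodcast($podcast);

$string = serialize($job);       // object -> string, something Redis can store
// ...later, in the queue worker process...
$rebuilt = unserialize($string); // string -> ProcessPodcast object again

$rebuilt->handle();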

As part of that serialized string, Laravel includes all the details needed to rebuild the PHP object from it. Consider the following job class; below it is the serialized payload (formatted as JSON) after the job has been dispatched to the queue.

namespace App\Jobs;

use App\Models\Podcast;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessPodcast implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct(
        public Podcast $podcast
    ) {}

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        info('Processing...');
    }
}

{
   "uuid": "f0c17f71-4f29-4b7c-a8aa-6a9a478d48a8",
   "displayName": "App\\Jobs\\ProcessPodcast",
   "job": "Illuminate\\Queue\\CallQueuedHandler@call",
   "maxTries": null,
   "maxExceptions": null,
   "failOnTimeout": false,
   "backoff": null,
   "timeout": null,
   "retryUntil": null,
   "data": {
      "commandName": "App\\Jobs\\ProcessPodcast",
      "command": "O:23:\"App\\Jobs\\ProcessPodcast\":1:{s:6:\"podcast\";O:45:\"Illuminate\\Contracts\\Database\\ModelIdentifier\":5:{s:5:\"class\";s:17:\"App\\Models\\Podcast\";s:2:\"id\";i:6;s:9:\"relations\";a:0:{}s:10:\"connection\";s:5:\"mysql\";s:15:\"collectionClass\";N;}}"
   }
}

Laravel will use this JSON to rebuild the job object when deserializing (when picking the job from Redis to be executed by the queue worker). It knows which class to create the object from (data.commandName), and it can "populate" the class properties and arguments from the data within data.command. How it actually performs that deserialization from JSON into an object isn't important right now. After the deserialization process, we have our ProcessPodcast object with all the arguments and properties it had before, and Laravel can execute the handle method.

However, we know that our job class accepts the Podcast model in the constructor. The queue system was built with Eloquent models in mind, so Laravel knows how to serialize them, and it does so in a very light form: it serializes only the model class, the primary key, the database connection, and the names of the currently loaded relations (you can see this in the ModelIdentifier object inside the payload above). Why does it serialize only those? Let's tackle that in the next section.
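
Conceptually, the SerializesModels trait swaps the model for a small identifier when the job is serialized. A simplified sketch of that idea (not the framework's actual implementation):

// What ends up in the payload instead of the full model (simplified):
$identifier = [
    'class'      => get_class($podcast),                  // "App\Models\Podcast"
    'id'         => $podcast->getKey(),                   // primary key, e.g. 6
    'relations'  => array_keys($podcast->getRelations()), // names of loaded relations
    'connection' => $podcast->getConnectionName(),        // e.g. "mysql"
];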

Loading models from the database

Now that the serialized job can be rebuilt into a PHP object, we have all the job's properties and arguments back (you might call it state). But what about Eloquent models? How will Laravel bring them back? We use them in the handle method, so we need them. As you can probably tell by now, when deserializing, Laravel executes a database query to fetch a "fresh" model record using the details it serialized: the model class, connection, primary key, and loaded relations.
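
The restore side is roughly the reverse: run a fresh query using that identifier. Again, a simplified sketch rather than the actual framework code:

// Rebuild the model when the job is deserialized:
$class = $identifier['class'];

$model = $class::on($identifier['connection']) // query the right connection
    ->findOrFail($identifier['id'])            // fails if the row is gone
    ->load($identifier['relations']);          // re-load previously loaded relations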

I'm hoping you can already see why, in our example, deleting the post is a problem. We deleted the post and then dispatched something onto the queue that uses that post. When the job is deserialized, that model no longer exists in the database. Laravel has good documentation on this and allows you to control how jobs are handled when their models are missing from the database: https://laravel.com/docs/10.x/queues#ignoring-missing-models.

By default, a job whose model is missing will fail with a ModelNotFoundException; if you set the job's $deleteWhenMissingModels property to true, Laravel will quietly discard the job instead of raising an exception.
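
For a job class, that override is a single public property on the job, as described in the Laravel docs:

class ProcessPodcast implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Delete the job if its models no longer exist.
     */
    public $deleteWhenMissingModels = true;

    // ...
}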

Serializing closures

So far, we've talked about job classes. But in Tim's example, we're queueing a closure. The good thing is, everything above still stands. Just like with job objects, Laravel will serialize the closure, store it in the queue, and deserialize it when it's time to run. It will also serialize all the captured "use" variables (e.g. function () use ($post)), just like it serializes class properties and constructor arguments on job objects.
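
You can try this out in isolation with the laravel/serializable-closure package that Laravel uses under the hood. A minimal example, based on the package's README:

use App\Models\Post;
use Laravel\SerializableClosure\SerializableClosure;

$post = Post::find(1);

$closure = function () use ($post) {
    info($post);
};

// Recommended by the package: sign serialized closures with a secret key.
SerializableClosure::setSecretKey('secret');

// Turn the closure (and its captured "use" variables) into a string...
$serialized = serialize(new SerializableClosure($closure));

// ...and later turn that string back into something callable.
$restored = unserialize($serialized)->getClosure();
$restored();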

Serializing closures is a lot trickier than serializing plain objects, though, and you can see the full implementation of the package on GitHub: https://github.com/laravel/serializable-closure. Here's a (simplified) serialized representation of our queued closure:

array (
  'use' => array (
    'post' => \App\Models\Post::__set_state(array(
       'connection' => 'mysql',
       'table' => 'posts',
       'primaryKey' => 'id',
       'keyType' => 'int',
       'incrementing' => true,
       'preventsLazyLoading' => false,
       'perPage' => 15,
       'exists' => true,
       'wasRecentlyCreated' => false,
       'escapeWhenCastingToString' => false,
       'attributes' => array (
            'id' => 8,
            'name' => 'Test',
            'type' => 'test',
            'created_at' => '2024-02-27 09:21:54',
            'updated_at' => '2024-02-27 09:21:54',
        ),
    )),
  ),
  'function' => 'function () use ($post) {
        \\info($post);
    }',
  'scope' => 'Illuminate\\Routing\\RouteFileRegistrar',
  'this' => NULL,
  'self' => '00000000000002150000000000000000',
  'objects' => array (),
)  

You can clearly see the entire closure code stored under the function key. With this, you can see why the post is not available when the closure gets executed. But there's another thing here: when you mark a closure or job to run after the response, it isn't processed by the queue worker at all.

It's not executed by the queue worker

If you set your queue connection to database with QUEUE_CONNECTION=database in your .env file and run the code from above without running your queue worker, you'll see that the jobs table in the database stays empty. If you remove the $post->delete() line, you'll see that even though you're not running a worker, the closure still executes. Why is that?

It turns out that adding ->afterResponse() tells Laravel to run the code on the sync connection (in the same request lifecycle), but only when the app is terminating, which happens after the HTTP response has been sent to the user. Here's the relevant snippet from the framework, in which you can see the call to the dispatchSync method:

/**
 * Dispatch a command to its appropriate handler after the current process.
 *
 * @param  mixed  $command
 * @param  mixed  $handler
 * @return void
 */
public function dispatchAfterResponse($command, $handler = null)
{
    $this->container->terminating(function () use ($command, $handler) {
        $this->dispatchSync($command, $handler);
    });
}

In the end, it does appear as if this closure is sent to the queue after the response, but it's actually executed within the current HTTP request. Laravel mentions this in its documentation:

Since they are processed within the current HTTP request, jobs dispatched in this fashion do not require a queue worker to be running in order for them to be processed.

One thing I did notice is that, even though the closure is called within the same HTTP request, Laravel will still serialize and deserialize it, which reintroduces our "missing models" issue.

I have also noticed that if the models are missing from the database, the closure will not execute and no exception will be thrown, even if you try to catch exceptions with ->catch(). In fact, there's no way to override $deleteWhenMissingModels on queued closures.

public function destroy(Post $post): RedirectResponse
{
    $post->delete();

    dispatch(function () use ($post) {
        info($post); // Will never be called...
    })->catch(function ($exception) {
        info('Failed...'); // Will never be called...
    })->afterResponse();
    
    return redirect('/');
}

Bug or a feature?

You might be asking yourself: if the closure is never sent to the queue when dispatching after the response, why is it even serialized and deserialized when it's going to run in the same HTTP request? Honestly, I have no idea. Is this the intended behavior? My guess is that it's either an oversight, or it's meant to avoid discrepancies between queued closures executed by a queue worker and those executed within the same request.

Is it just an oversight that you can't override $deleteWhenMissingModels when queueing closures? I have no idea. And I'm not smart enough to create a PR to fix these issues.

Conclusion

The conclusion of this article is simply: be careful when using Eloquent models in queued jobs and queued closures if those models might be deleted elsewhere. The talk Tim presented is still very much worth watching.

👋 I'm available for software development contract work and consulting. You can find me at [email protected]