3
Wombat
2y

I have two Laravel apps sharing one Redis database. One has App\Post, the other has App\Models\Blog\Post. When I unserialize models from the Redis cache saved by the other app, I get issues because it cannot find the right model to hydrate.

How would you build a custom map to get the right model?

Comments
  • 2
    I believe this one is for you, @bittersweet. ❤️
  • 3
    Oof, that sounds very anti-pattern haha.

    I think the most correct architecture would be to have separate "persistence databases" per app/service where you store your models (MySQL) or cache your models (Redis key/value), and one shared "event pipeline" which pushes simple, DTO-like event objects between apps for communication (RabbitMQ, Kafka, Redis lists/pub-sub/streams, etc.).

    Laravel is very much designed to assume that it "owns" the database(s) it is connected to.

    But life is rarely ideal.

    So...

    If you're accessing Redis through the Cache facade, what you can do is write your own cache driver.

    https://laravel.com/docs/9.x/...

    That way, you can override the "remember", "put", "get", etc functions.

    It's quite hacky to do that just for mapping models, though.
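    If you did go the custom-driver route, the core of it would be rewriting the class name embedded in the serialized payload before unserializing it. A minimal plain-PHP sketch of that remapping idea (the helper name and the map are made up, and the string replacement is naive -- real Eloquent payloads can also embed relation class names):

```php
<?php

// Hypothetical helper: rewrite the class names embedded in a
// PHP-serialized payload so the reading app can unserialize it under
// its own namespace. A custom cache store would call this inside its
// get()/put() overrides.
function remapSerializedClasses(string $payload, array $map): string
{
    foreach ($map as $from => $to) {
        // Serialized objects embed the class name as O:<len>:"<name>".
        $payload = str_replace(
            sprintf('O:%d:"%s"', strlen($from), $from),
            sprintf('O:%d:"%s"', strlen($to), $to),
            $payload
        );
    }

    return $payload;
}
```

    Usage would be something like `remapSerializedClasses($raw, ['App\Post' => 'App\Models\Blog\Post'])` on the read side.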
  • 2
    @bittersweet uff, that sounds a little over-engineered for the problem. What I was doing was replacing a cached post via an admin dashboard located on a subdomain. That app runs a newer Laravel version, since I have too little time to update the main app to version 8 or 9... it still runs 5.6 (omg 🙈)

    I believed there would be a way to map those models, like the workaround I was able to create for my polymorphic relationships with a custom morph map. Unfortunately it doesn't seem to be that easy in this case.

    I'll probably just add a "clear post key from cache" button and let the post get re-cached by the blog later on view.

    Seems to be the easiest solution for what I need.

    Thank you. I appreciate your help. 😌
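    For reference, the polymorphic workaround mentioned above is Laravel's morph map, registered in a service provider's boot(). It only remaps morph type strings for polymorphic relations, not cached objects, which is why it doesn't help here. Roughly (class names from the question):

```php
use Illuminate\Database\Eloquent\Relations\Relation;

// Map a short morph type string to this app's own class; the other
// app would map the same 'post' key to its \App\Post::class, so both
// resolve the shared morph type to their local model.
Relation::morphMap([
    'post' => \App\Models\Blog\Post::class,
]);
```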
  • 2
    @Wombat You could also just use the Redis facade directly.

    // assuming the values were cached as JSON attribute arrays:
    $post = Redis::get('blogpostcache:1');
    $blogpostcollection = Blogpost::hydrate([json_decode($post, true)]);

    $posts = Redis::mget($arrayOfRedisKeys);
    $blogpostcollection = Blogpost::hydrate(array_map(fn ($p) => json_decode($p, true), $posts));

    But yeah if there are complicated relationships involved which are eager loaded and then serialized & cached on the "write side", it's gonna be a headache to deal with that on "read side".
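    For the hydrate() approach to work, the write side has to cache plain attribute arrays (e.g. as JSON) rather than serialized model objects. A plain-PHP sketch of that idea, with made-up keys and values:

```php
<?php

// Write side of the attribute-array approach: cache the model's raw
// attributes as JSON so the reading app never needs the writer's
// class. In the Laravel app this would be something like
// Redis::set('blogpostcache:1', json_encode($post->getAttributes())).
$attributes = ['id' => 1, 'title' => 'Hello'];
$payload = json_encode($attributes);

// The reading app decodes without knowing any class name, then
// passes the array to its own model's hydrate([...]).
$decoded = json_decode($payload, true);
```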
  • 2
    @Wombat

    We also have multiple Laravel apps btw.

    The way we solved this is to have a "core platform Laravel app", which provides an API (we use Laravel Passport, but Sanctum also works).

    A small part of that API are "admin" POST routes, only available for admins.

    Then our "admin panel Laravel app" has READ access to our database and can show that data to our support department: things like where financial transactions might be stuck, login & event logs for users, etc.

    The admin app doesn't even use Eloquent models: It just does raw queries from a read-only replica of the DB.

    When the admin app needs to WRITE to the database (to refund/correct some financial mistake for example), it just does a POST API call using an admin token, to the main platform, which then handles the validation and stuff.
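    A rough sketch of that write path using Laravel's HTTP client; the URL, route, and fields here are made-up examples:

```php
use Illuminate\Support\Facades\Http;

// The admin app never writes to the DB directly; it posts to the main
// platform's API with an admin token, and the platform handles
// validation and the actual write.
$response = Http::withToken($adminToken)
    ->post('https://platform.example.test/api/admin/refunds', [
        'transaction_id' => $transactionId,
        'reason' => 'support correction',
    ]);

$response->throw(); // surface 4xx/5xx responses as an exception
```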
  • 2
    Do you have the same cache prefix on both apps, or do you have any at all?
  • 1
    @bittersweet thanks for the insight. I probably will have a few more thoughts on this issue.
  • 1
    @mowgli yes, sure. I have the same redis prefix on both apps. I can fetch the keys. That's not the point.
  • 2
    Maybe that's the problem: they need different Redis prefixes so each application gets the information that belongs to it. If they both use the same prefix, it will cause a data collision.
  • 1
    @mowgli

    I think the issue is mostly in how the objects are stored.

    If you cache or queue an Eloquent model, it will look something like O:4:"Test":1:{...}, just a few orders of magnitude more complicated, as there will be MANY more fields on that class. Eloquent models are quite fat.

    Laravel assumes you will deserialize this model object in the same app again.

    So if your cached "user" has a relation to an "article" model, no problem.

    If you deserialize somewhere else, the namespace and content of your class might be different.
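    You can see this with plain serialize()/unserialize(): the writer's fully qualified class name is baked into the payload, and a reading app that lacks that exact class gets back an unusable stub instead of a model (toy class name here, not the real Eloquent payload):

```php
<?php

// The writer's class name is embedded in the serialized payload.
class Post { public $title = 'Hello'; }

$payload = serialize(new Post());
// e.g. O:4:"Post":1:{s:5:"title";s:5:"Hello";}

// Simulate the other app, where that exact class name does not exist:
$foreign = str_replace('O:4:"Post"', 'O:7:"Missing"', $payload);
$object = unserialize($foreign);

// PHP does not throw; it hands back a __PHP_Incomplete_Class stub,
// and calling any model method on it (as Laravel would) blows up.
```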
  • 1
    @mowgli

    I think he wants a data collision, as he wants access to the same object from both apps.

    Which is why I said "uh... anti pattern" 😅
  • 2
    @bittersweet woah, I wasn't expecting this one, that really changes the subject of the issue.
  • 2
    the only thing I can think of is using class aliases.
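    That could actually fit this case: register the other app's class name as an alias of the local class before anything gets unserialized (in Laravel, early in a service provider's boot()). A self-contained toy sketch, where BlogPost stands in for App\Models\Blog\Post:

```php
<?php

// Toy stand-in for the blog app's real App\Models\Blog\Post model.
class BlogPost {}

// Tell PHP that the writer's class name means our local class; in the
// real app this would be class_alias(\App\Models\Blog\Post::class, 'App\Post').
class_alias(BlogPost::class, 'App\Post');

// A payload as written by the app that uses App\Post:
$payload = 'O:8:"App\Post":0:{}';
$post = unserialize($payload);
// $post now unserializes into the local BlogPost class via the alias.
```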
  • 1
    I didn't even expect that much feedback. Thanks.

    As @bittersweet rightly pointed out, the issue I had was that when deserializing my post models, the framework could not map the data to an Eloquent model and then called functions on a string.

    Nevertheless, I just circumvented this problem after realising that the solution would complicate my application more than I want, just for this little bit of a feature.
  • 0
    It’s not an anti-pattern; it’s reasonable to have multiple apps use the same DB.

    A shot in the dark: maybe you’re using different serialization methods.

    But I’m kind of scratching my head, because Eloquent only supports relational databases, and Redis is a key-value store.