1
Anakata
4y

I’m trying to add caching functionality to my scalable Spark cluster on Docker. I am able to add Redis to my Docker container, but that doesn’t include redis-cli or any of the other tools I need. There are some redis-cache projects on PyPI, but they are very old and not compatible with Python 3.

Comments
  • 1
    Just add another RUN step to your multi-stage Docker build that installs redis-cli and configures its path, and a COPY operation that imports the configuration. Rinse and repeat for whatever else you need, then use COPY --from to compose your final image (rough sketch below).

    https://docs.docker.com/develop/...
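    A minimal sketch, assuming Debian-based images on both sides (the Spark image tag and config path are just examples):
    ```
    # Stage 1: the official redis image already ships redis-cli
    FROM redis:7 AS redis-tools

    # Stage 2: example Spark base image
    FROM bitnami/spark:3

    # Pull the binary out of the first stage; this assumes the two images
    # share compatible shared libraries (both are Debian-based here)
    COPY --from=redis-tools /usr/local/bin/redis-cli /usr/local/bin/redis-cli

    # Import your Redis configuration (path is an example)
    COPY redis.conf /etc/redis/redis.conf
    ```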
  • 1
    But the redis image (https://hub.docker.com/_/redis) totally comes with redis-cli; all you need to do is exec into the container and type `redis-cli`, as in the example below. What exactly are you trying to achieve?
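    For example, assuming the Redis container is named `redis` (the name is just an example):
    ```
    docker exec -it redis redis-cli
    127.0.0.1:6379> PING
    PONG
    ```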
  • 1
    You don't simply add Redis to your existing Docker image. You use a separate Redis container, as sketched below.
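    A minimal sketch with the plain Docker CLI (network and container names are examples):
    ```
    # Put Spark and Redis on the same user-defined network
    docker network create spark-net
    docker run -d --name redis --network spark-net redis:7
    # Containers on spark-net can now reach Redis at redis:6379
    ```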
  • 0
    @ebrithil I thought storing data in a container was bad practice, because you lose the data when the container shuts down.
  • 1
    @srpatil Who said you store data inside the container? Usually you use either volumes or bind mounts for persistent data, for example:
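    A sketch with a named volume (names are examples); the official image keeps its data in /data:
    ```
    docker volume create redis-data
    docker run -d --name redis \
      -v redis-data:/data redis:7 redis-server --appendonly yes
    # The volume outlives the container, so the cache survives restarts
    ```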