10
REXTON
21d

At work everybody uses Windows 10. We recently switched from Vagrant to Docker. It's bad enough that I have to use Windows; it's even worse to use Docker for Windows. If, God forbid, you're ever in this situation and have to choose, pick Vagrant. It's way better than whatever Docker is doing...

So upon installing version 2.2.0.0 of Docker for Windows, I found myself in a situation where my volumes would randomly unmount themselves, and I was going crazy trying to figure out why my assets were not loading. I tried 'docker-compose restart', or 'down' and 'up -d', I went into Portainer to check and manually start containers, and at some point it works again, but it doesn't last long before it breaks. I checked my yml config and asked my colleagues to take a look. They also experience problems, but different ones, not like mine. There is nothing wrong with the configuration. I went to check their GitHub page and saw there were a lot of issues opened on the same subject, so I opened one too. It's been over a week and I've found no solution to this problem. I tried installing an older version, but it still didn't work.

Also, I think it might've bricked my computer, as today when I turned on my PC I got greeted by a BSOD right at system startup... I tried startup repair, booting into safe mode, system restore, resetting the PC; nothing works anymore, it just doesn't boot into Windows... I had to use a live USB with Linux Mint to grab my work files. I was thinking that my SSD might have reached its EoL, as it is kinda old, but I didn't find any corrupt files; everything is still there. I can't help but point my finger at Docker, since I did nothing with this machine except tinker with Docker and try to make it work as it should... When we used Vagrant it also had its problems, but none were of this magnitude... And I can't really go back to Vagrant unless my team does too...
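For context, the compose config involved is nothing exotic; a minimal sketch of the shape of it (service, image, and path names here are made up, not the actual project):

```yaml
version: "3.7"
services:
  web:
    image: our-app:dev            # hypothetical image name
    ports:
      - "4200:4200"
    volumes:
      # bind mounts of project dirs -- these are what kept randomly unmounting
      - ./src:/app/src
      - ./assets:/app/assets
```

The usual recovery dance was `docker-compose down` followed by `docker-compose up -d`, which only helped for a little while.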

Comments
  • 2
    Docker volumes don't work properly anywhere other than Linux. Even though I hate Windows, at my company I tried to push people to start using Docker, and the Windows people were the most problematic, as it's quite hard to make Docker work on it. Anyway, I achieved my goal: basically follow the tutorial https://nickjanetakis.com/blog/... but skip the steps related to volumes. Then install docker-sync; it will require unison, which you will need to build manually. Docker-sync will require some configuration changes, but if everyone uses Windows it will be for the better. You will need to work inside containers though. Quite some work, but it's manageable.
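    A rough sketch of the docker-sync side of that setup (volume and path names are placeholders; check the docker-sync docs for the exact options your strategy needs):

    ```yaml
    # docker-sync.yml -- hypothetical project layout
    version: "2"
    syncs:
      app-sync:                  # volume name referenced from docker-compose.yml
        src: './src'             # host directory to keep in sync with the container
        sync_strategy: 'unison'  # needs the unison binary, built manually as noted above
    ```

    Then `docker-sync start` runs the sync daemon, and docker-compose.yml mounts `app-sync` as an external volume instead of a bind mount.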
  • 2
    What I don't understand is why you don't have prod/test/dev server infrastructure to host your docker images on. It seems beyond crazy to me that you would be running something important on your individual desktop, regardless of what OS it's running.
  • 1
    @SevenDeadlyBugs thanks for sharing! Will give it a try after I reinstall windows.
  • 0
    @REXTON Omg I would cry.
  • 0
    > BAD_SYSTEM_CONFIG_INFO
    fix your fucking registry, external registry editing tools exist.
    EDIT: https://bleepingcomputer.com/downlo...
  • 0
    @bahua When we build an Angular front-end, we build a Docker image and use our local hard drive as the place the project lives. That means all developers are running the code on the same version. Nothing gets into the Docker container unless someone adds it to a config file and rebuilds the Docker image. We have a .env file for proxy and environment variables. If it builds and runs on a development machine, it works in the cloud. The devs can tweak their systems for maximum productivity and it never affects the way the code runs unless the change is deliberately added to the container. So developers pick their tools and the code lives in a bubble which acts exactly like the prod container.
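    A minimal compose sketch of that pattern (image, paths, and command are illustrative, not the actual config):

    ```yaml
    version: "3.7"
    services:
      angular:
        build: .            # everyone builds and runs the same image
        env_file: .env      # proxy and environment variables, kept out of the image
        volumes:
          - .:/app          # project stays on the local drive, mounted into the container
        ports:
          - "4200:4200"
        command: npm start
    ```

    Anything not in the Dockerfile or compose file never reaches the container, which is what keeps dev behavior identical to prod.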
  • 0
    @irene

    But I just cannot see the advantage of running it locally, over having a dedicated environment that doesn't depend on an end-user desktop. All I can see is disadvantage and hassle.
  • 1
    @bahua it trades one inconvenience (local variations) for another (foreign environment, tools only work remotely unless they're designed to work through ssh, far fewer gui tools, far more latency and bandwidth sensitive, more points of failure, more hardware cost, ...).

    There are reasons to do both, a remote dev env is far from a panacea.
  • 0
    @groxx

    Ah, I hadn't considered graphical desktop code applications. I do everything with vim over ssh, so it hadn't occurred to me that others might not.
  • 0
    @bahua It's especially useful in front-end web development because you need to see what a page looks like in the browser. The container watches for file changes on a shared Docker volume (the project folder) on the development machine. After the code is merged, it's normal devops from there on out.

    Developers can use whatever tools they like then see the changes in the container in 2-3 seconds.
  • 2
    Mickeysoft prevails in stupidity, again!

    I implore you to read the documentation regarding Windows hosts. There are serious caveats. Don't ask me which ones, because I have no idea. I don't work with inferior operating systems.