
Which one should I go with for an optimal workflow? Pictures are in the comments.

Comments
  • 2
    The first one; you don’t want to be pushing from the local VM.
  • 0
    @LiterallyJesus What about 4? I would like to have a clone of the VPS as a VM, used only for testing before deploying directly from the dev environment, but I have no idea how to manage the workflow.
  • 2
    @blindXfish I work kind of like 4, but without XAMPP.

    So, all dev is done on a local VM, utilising feature branches.

    Push to GitHub and merge into the development/staging branch.

    Test this in a production-clone environment and, once happy and defects are triaged, merge the staging branch into the master branch.

    The master branch is used to deploy the production servers.
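    A minimal command-line sketch of that flow, with feature/my-change, staging and master as example branch names:

      # on the local VM: work on a feature branch
      git checkout -b feature/my-change
      git commit -am "describe the change"
      git push origin feature/my-change

      # merge the feature branch into the development/staging branch (often via a PR on GitHub)
      git checkout staging
      git merge --no-ff feature/my-change
      git push origin staging

      # after testing on the production clone and triaging defects, promote staging to master
      git checkout master
      git merge --no-ff staging
      git push origin master    # production servers deploy from master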
  • 4
    @blindXfish Much better. You can also hook the VM up to pull via a webhook, so you don’t get a delay.
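    As a rough sketch, the webhook endpoint on the VM could simply run a small script like this on every push; the path and branch are only examples:

      #!/usr/bin/env bash
      # deploy.sh: example script triggered by the webhook when GitHub reports a push
      set -euo pipefail
      cd /var/www/app                     # hypothetical path to the checkout on the VM
      git fetch origin
      git reset --hard origin/master      # update the working copy without a manual pull
      # optional post-deploy steps go here, e.g. dependency installs or cache clears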
  • 0
    My setup:

    1. Docker Compose running locally. It includes PHP-FPM, MySQL, Redis, Elasticsearch, and a few other services.

    2. Docker mounts a directory from the host as /var/www. An env file tells Docker which directory to mount and which ports to forward.

    3. I edit this mounted directory with an IDE and use git to push the contents to a codebase repository on GitHub (roughly the loop sketched after this list).

    4. GitHub Actions unit-tests the code. The pipeline pulls in both the Docker repo and the codebase repo.

    5. If the tests are green, the pipeline uses the Docker config and the codebase repo to set up a Kubernetes cluster, which stays up for as long as the PR remains open on GitHub. A bot posts the hosting URL as a comment on the GitHub PR.

    6. The PR requires a technical review from a senior developer and a functional review from a PM or QA employee.

    7. When the PR is merged to master, a pipeline uses the exact same Docker/Kubernetes config and codebase to deploy to production.
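    A rough command-line sketch of the local half of this setup (steps 1 to 3), assuming a docker-compose.yml and .env file in the repo; paths and the branch name are only examples:

      # start the local stack; .env tells Compose what to mount as /var/www and which ports to forward
      docker compose --env-file .env up -d    # PHP-FPM, MySQL, Redis, Elasticsearch, ...

      # edit the mounted source directory in an IDE, then push to the codebase repo on GitHub
      cd /path/to/mounted/src
      git add -A
      git commit -m "feature work"
      git push origin my-feature-branch       # opening a PR kicks off the Actions pipeline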