
Okay, so I finally moved to a new PC. Since I've never worked at a company, I have absolutely no idea what the proper, standard workflow for developing a website looks like. My workflow has been the same for the past 12 years, the way I learned it in school: use XAMPP, develop everything, use git only locally, and when things were ready, fire up FileZilla, upload everything, and SSH in to make the final adjustments. When I made changes later, I just uploaded the files I had touched the same way, optimized where necessary, done. I wonder if someone can clarify for me what a proper workflow looks like for PHP/Laravel and MySQL, nothing fancy. Is using XAMPP still okay? Or what is the industry-standard procedure?

Comments
  • 5
    I'd say it's changed a little (or not so little). I can't speak for PHP specifically, but in a lot of places things generally look like this:

    Everyone on the project works from a shared repo. There's a CI/CD pipeline for your staging and prod environments; it watches a certain branch of the repo (either the CI/CD service is responsible for that itself, or a git hook is triggered on push/tag). This way you don't have to touch any files manually with an FTP client - all building and uploading is done by the CI/CD pipeline, often only after the tests pass. It's assumed that you pull the latest master/staging/whatever before you push.
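
    The git-hook variant can be as simple as a `post-receive` script in a bare repo on the server; hosted CI services like GitHub Actions or GitLab CI do the same job from a workflow file. Here's a minimal sketch of the hook, assuming a bare repo, a deploy directory of `/var/www/app`, and a `main` branch (all of those names are made up for illustration):

    ```bash
    #!/usr/bin/env bash
    # hooks/post-receive in the bare repo on the server.
    # git feeds "<oldrev> <newrev> <refname>" lines on stdin for each pushed ref.
    TARGET=/var/www/app   # assumed web root
    BRANCH=main           # assumed deploy branch
    REPO="$PWD"           # the bare repo itself

    while read -r oldrev newrev ref; do
        if [ "$ref" = "refs/heads/$BRANCH" ]; then
            # Check the pushed branch out into the web root.
            git --work-tree="$TARGET" --git-dir="$REPO" checkout -f "$BRANCH"
            # Typical Laravel deploy steps, run in the web root.
            (cd "$TARGET" &&
             composer install --no-dev --optimize-autoloader &&
             php artisan migrate --force &&
             php artisan config:cache)
        fi
    done
    ```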
  • 2
    I've seen everything, and all of these flows are still in use to this day: XAMPP/WAMP, the built-in PHP dev server, a Docker devbox, and even SSH-tunneled remote development via VS Code for local work. Deploys via (S)FTP, rsync/scp over SSH, or CI/CD through Git (see the rsync sketch at the end of this comment).

    What @kamen said is industry best practice. If your app requires a lot of dependencies and multiple devs on potentially different OSes, a Docker devbox is a must; otherwise the PHP dev server or a local Apache server with vhosts may be enough.
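
    A minimal sketch of such a devbox with Docker Compose, assuming a stock PHP image and a throwaway MySQL password (service names, ports, and credentials here are all invented for illustration):

    ```yaml
    # docker-compose.yml - PHP + MySQL devbox sketch
    services:
      app:
        image: php:8.3-apache          # assumed PHP version
        ports:
          - "8080:80"                  # host:container
        volumes:
          - ./:/var/www/html           # mount the project into the web root
        depends_on:
          - db
      db:
        image: mysql:8.0
        environment:
          MYSQL_DATABASE: app
          MYSQL_ROOT_PASSWORD: secret  # dev-only credential
        ports:
          - "3306:3306"
        volumes:
          - dbdata:/var/lib/mysql      # persist data between restarts
    volumes:
      dbdata:
    ```

    `docker compose up -d` brings both containers up; everyone on the team gets the same PHP and MySQL versions regardless of their host OS.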
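
    And for the rsync route mentioned above: unlike plain FTP it only transfers changed files and can delete removed ones. A hedged one-liner, with the host, user, and paths made up:

    ```bash
    # Push the local project to the server over SSH, skipping junk;
    # --delete removes files on the server that no longer exist locally.
    rsync -avz --delete \
        --exclude '.git' --exclude 'node_modules' --exclude '.env' \
        ./ deploy@example.com:/var/www/app/
    ```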