Yes, maybe.
In Node.js projects, I use git flow hotfix based on the current master branch (because our CI deploys only from master after all tests pass). I fix my stuff, close the hotfix via git flow, and our CI does the rest. It issues a remote command that does a git pull into a new folder and then runs npm install. Another script periodically checks for new folders, starts the node project inside the new folder, watches for startup errors for a few seconds, updates our Nginx config, forces Nginx to reload its config, and kills the old Node.js app. Almost zero downtime (waiting only for Nginx to reload, which takes a few milliseconds).
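Roughly, the watcher script looks like this. This is only a sketch of the idea, not our actual script; the release folder layout, ports, paths like /etc/nginx/conf.d/app.conf, and the health-check URL are all assumptions you'd adapt:

```shell
#!/bin/sh
# Hypothetical deploy watcher: pick the newest release folder, start it,
# wait briefly for crashes, then swap Nginx over and kill the old app.
set -eu

RELEASES_DIR="${RELEASES_DIR:-/srv/app/releases}"

# Most recently modified folder inside the releases dir.
latest_release() {
  ls -1t "$1" | head -n 1
}

deploy_latest() {
  dir="$RELEASES_DIR/$(latest_release "$RELEASES_DIR")"

  # Install deps and start the new instance in the background.
  (cd "$dir" && npm install --production && node server.js > "$dir/app.log" 2>&1 &)

  sleep 5  # give the new process a few seconds to fail on startup

  # Assumed health endpoint on the new instance's port.
  if ! curl -fs "http://127.0.0.1:3001/health" > /dev/null; then
    echo "new release failed to start, keeping old one" >&2
    return 1
  fi

  # Point the Nginx upstream at the new port and reload (near-zero downtime),
  # then kill the old instance.
  sed -i 's/127\.0\.0\.1:3000/127.0.0.1:3001/' /etc/nginx/conf.d/app.conf
  nginx -s reload
  pkill -f 'node server.js --port 3000' || true
}
```

You'd run deploy_latest from cron or a file-watcher whenever a new folder appears.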
For your PHP projects, I can imagine comparing the dist folder of the currently deployed version with the ready-to-deploy version to pick out and copy only the changed files. Shouldn't be a big issue. If needed, you can create checksums of your files before pushing them to the remote server. That way you can compare the hashes locally without keeping two full copies of your project on disk, in case space or network bandwidth matters.
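The checksum comparison can be done with standard tools. A minimal sketch, assuming a GNU/Linux box with sha256sum, and a manifest file you keep around from the last deploy (the function names and paths here are made up for illustration):

```shell
#!/bin/sh
# Build a "hash  path" manifest of a dist folder, and diff two manifests
# to list files that are new or changed since the last deploy.
set -eu

# Write one sorted "hash  path" line per file under the given dist dir.
make_manifest() {  # $1 = dist dir, $2 = manifest output file
  (cd "$1" && find . -type f -exec sha256sum {} + | sort) > "$2"
}

# Print paths whose line exists only in the new manifest,
# i.e. files that are new or whose content hash changed.
changed_files() {  # $1 = old manifest, $2 = new manifest
  comm -13 "$1" "$2" | awk '{print $2}'
}
```

You could then pipe the output of changed_files into tar or rsync to push only those files; only the manifest, not the old dist folder, has to stay on disk.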