What is the best way to work with continuous integration without using Docker?

I have a PHP e-commerce application with more than 30k files.

I am implementing a continuous integration process with Jenkins + SVN + automated testing (build and deploy).

But the application does not use Docker. So, on each commit, I made a script that takes the updates and sends them to the staging server.

Would rsync be too costly given the large number of files in the application, or would it still be worth it?

I tried another approach: creating a package with the latest version, already synchronized during the build. Then, on deploy, this "container" would be reinstalled into the application. However, done this way, the deployment process ends up slower than simply sending the updates.

The main problem is when folders are deleted or created, since rsync is not being used.

  • Docker and rsync are just two of many ways to deploy an application. You need to ask yourself: what does your current solution leave you wanting? Is it worth changing your current workflow?

  • In the current situation it would not be worth reconfiguring the entire application to use it, given the time already spent optimizing and configuring the server to run with PHP 7, APC cache, etc., unless it saves time in the future. It's just that, seeing what Docker offers as a tool, not using it feels like a step backwards.

  • I have a simple setup; in my case I just swap SVN for Git. Here we separate the source code from other files (images, docs, audio, etc.). The web server is a Git repo. On every Jenkins build, only the source files are updated via git pull; the other files go through rsync.
