The Situation
The benefits of automating software deployment processes are well known: efficiency, accuracy, security and increased productivity. Yet many organizations have stopped short of fully automating their deployments, especially in complex environments.
What’s holding them back? They either lack the automation to take advantage of their virtual infrastructure, aren’t using a virtual infrastructure at all, or have stopped at static virtualization. Automation and virtualization are like peanut butter and jelly: they’re better together. If you try to automate without virtualization, you’ll miss out on flexible resource management. Conversely, virtualization without automation leads developers to hoard their virtual machines just as they once hoarded physical ones, which simply replaces your server sprawl with VM sprawl.
Automating software deployment is really a twofold effort: you need to find an automation platform that works for your organization, and the payoff grows dramatically when that platform is combined with virtualization that provides dynamic resource allocation.
Going virtual
When it comes to deployment for either test or production, you’re almost never dealing with just one machine or server. Instead, you’re trying to juggle multiple servers, databases and testing processes. As testing becomes more complex and teams grow, trying to manage these resources becomes increasingly cumbersome. And at a certain point, sticking a Post-It note to a physical machine to claim it as yours while you deploy a piece of code just doesn’t cut it anymore.
Today, most organizations have some level of virtualization, but many have only gotten as far as static virtualization. They have replaced their physical machines with virtual ones, but there is little difference in the way the machines are managed. Developers still need to claim machines for the tasks at hand; it’s just that their Post-It notes are now virtual.
Dynamic virtualization, on the other hand, gives developers immediate and flexible self-service. Machines are created as developers need them, and torn down and redistributed when the work is done.
Dynamic virtualization solves the resource coordination problem: it gives teams the resources they need, when they need them, and lets them work independently of one another without worrying about which machines the rest of the team is using. It also eliminates the need for teams to guess ahead of time how many machines a task will require. Before dynamic virtualization, hours, days or even weeks of valuable development time could be lost if you guessed wrong. Now, resources can be made available the moment they are needed.
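To make the pattern concrete, here is a minimal sketch of that self-service model: machines are acquired from a shared pool when a task starts and automatically returned when it finishes. The pool, VM names and methods below are hypothetical illustrations, not Electric Cloud’s product or any real virtualization API.

```python
# Sketch of dynamic, self-service resource allocation.
# VMPool and its acquire/release methods are hypothetical stand-ins for
# whatever API your virtualization platform exposes.
from contextlib import contextmanager
from dataclasses import dataclass, field
from itertools import count


@dataclass
class VMPool:
    capacity: int
    _ids: count = field(default_factory=count, repr=False)
    in_use: set = field(default_factory=set)

    def acquire(self, n):
        """Provision n machines on demand (no Post-It notes required)."""
        if len(self.in_use) + n > self.capacity:
            raise RuntimeError("pool exhausted; add capacity or queue the job")
        vms = [f"vm-{next(self._ids)}" for _ in range(n)]
        self.in_use.update(vms)
        return vms

    def release(self, vms):
        """Tear machines down so their resources return to the pool."""
        self.in_use.difference_update(vms)


@contextmanager
def machines(pool, n):
    vms = pool.acquire(n)
    try:
        yield vms
    finally:
        pool.release(vms)  # always freed, even if the test run fails


if __name__ == "__main__":
    pool = VMPool(capacity=20)
    with machines(pool, 3) as vms:
        print("running deployment test on", vms)
    print("machines back in pool; in use:", len(pool.in_use))
```

The point of the sketch is the lifecycle: no one claims a machine indefinitely, so resources flow back to whoever needs them next.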
We’ve seen the benefits of dynamic virtualization in our own office. Before, Electric Cloud needed several physical machines for each developer. With dynamic virtualization, we now measure in developers per machine rather than machines per developer. The result is more flexibility: developers no longer have to wait for a physical machine to become available.
Choosing the right automation platform
For some organizations, a traditional build/continuous integration tool such as CruiseControl or Hudson/Jenkins is a fit, but these tools are often constrained by the need to run on the same machine as the work they coordinate, which limits their ability to dynamically provision resources. In addition, such tools store their configuration and metadata in the file system rather than a database; this can be a barrier to scalability, reliability and high availability, and it makes it difficult to analyze the data with traditional reporting tools.
We’ve found tools like VMware vCloud Director essential because they let us manage virtual machine images and group them into multi-machine configurations. For example, a configuration with a server machine, a client machine and a database machine can be treated as a single unit of deployment. When you start doing this, simple automation falls way short.
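For illustration only, a configuration like that might be modeled as a single deployable unit along the lines below. The role names, templates and deploy/teardown hooks are hypothetical; this is not vCloud Director’s API.

```python
# Illustrative sketch of treating a multi-machine configuration as one
# unit of deployment. All names here are made up for the example.
from dataclasses import dataclass


@dataclass
class MachineSpec:
    role: str   # e.g. "server", "client", "database"
    image: str  # virtual machine template to clone


@dataclass
class Configuration:
    name: str
    machines: list[MachineSpec]

    def deploy(self):
        # A real platform would clone images and wire up networking;
        # here we just show the whole group handled as one operation.
        for m in self.machines:
            print(f"cloning {m.image} as {m.role} for {self.name}")

    def teardown(self):
        for m in self.machines:
            print(f"destroying {m.role} of {self.name}")


three_tier = Configuration(
    name="web-app-test",
    machines=[
        MachineSpec("server", "app-server-template"),
        MachineSpec("client", "client-template"),
        MachineSpec("database", "postgres-template"),
    ],
)
three_tier.deploy()    # one call brings up the whole configuration
three_tier.teardown()  # and one call removes it
```

Once a whole environment can be brought up and torn down in one step, coordinating those steps is exactly the job of the automation platform.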
When you are considering an automation platform for complex test and deployment, there are several things to consider:
- What will happen if our team doubles in size? Scalability is a big factor in determining whether a given automation platform is right for your organization. Some tools are great with smaller groups, but will get overloaded if your organization grows quickly. If fast future growth is a possibility, you’re better off considering the long term when choosing an automation platform.
- What if we change our processes or tools? You need to know whether your automation platform will support changes like adopting parallel testing and deployment, or upgrading to the latest versions of your tools. You’ll save time and resources implementing a new system if you don’t have to revamp your existing processes and tools. Chances are your environment is complex and constantly changing, and you should never hear “why would you do that?” from the provider of your automation platform.
- How secure is the platform? How much security matters to you will depend on the nature of your industry. Electric Cloud’s customers in the financial industry, for example, have very tight security requirements. Because of the regulations they are subject to, it’s important that Developer A and Developer B are unable to see each other’s work. For a lot of other companies that level of security is unnecessary, but knowing your security needs and whether your automation platform can support them is important.
- If we use a private cloud now, will our automation platform follow us to a public cloud? Many companies today are making their entry into cloud computing with a private cloud, but as hybrid and public clouds gain popularity, it’s worth thinking about whether your automation platform will be able to make the move with you. You should be able to write your procedures once and run them anywhere.
- How flexible is the system? Some tools are great at one type of testing (specializing in Java or Python, for example) but unable to handle a broader spectrum of languages. If your developers are coding in multiple languages, or may need to in the future, it’s best to choose an automation system that is tool-agnostic.
Getting these questions answered will provide you with a good framework for choosing a platform that fits your organization.
Once you’re up and running with automated deployment, you can start looking at the monolithic pieces of your process that you’d like to split up. Chances are they’re large because your old system lacked a way to share information between the pieces if they were separated. Your automation platform should give you the means to address this problem, as in the sketch below.
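As a rough sketch of what that can look like, the key capability is that independent steps publish and read shared properties instead of being welded into one monolithic script. The step names, paths and the in-memory “properties” store here are hypothetical, not a specific product’s API.

```python
# Sketch of splitting a monolithic deployment into steps that share
# information through the automation platform. The property names and
# values are made-up placeholders for illustration.

def build(properties):
    properties["artifact"] = "build/output/app.war"     # hypothetical path

def provision(properties):
    properties["target_vms"] = ["vm-12", "vm-13"]       # hypothetical hosts

def deploy(properties):
    # This step works only because earlier steps published what it needs.
    artifact = properties["artifact"]
    for vm in properties["target_vms"]:
        print(f"copying {artifact} to {vm} and restarting the service")


if __name__ == "__main__":
    shared = {}  # stands in for platform-managed shared properties
    for step in (build, provision, deploy):
        step(shared)
```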
With dynamic virtualization combined with the right automation platform, you’ll have what you need to bring your organization’s deployment process from brittle, cumbersome and time-intensive to reliable, secure and fast. It’s the perfect software deployment peanut butter and jelly sandwich.
About the Author
Mike Maciag is CEO of Electric Cloud, Inc.