A Brief History of Automated Builds

Almost every programming book starts with an example of a small program that can be compiled from the command line with a single call to the compiler, perhaps with a few flags. Known as “Hello World”, the few lines of code needed to output the text are almost universal.
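
As a minimal sketch, assuming the usual hello.c source file and a Unix-style C compiler (cc, gcc, or clang) on the PATH, the entire "build" is a single command:

    $ cc -Wall -o hello hello.c    # compile hello.c into an executable, warnings enabled
    $ ./hello                      # run it
    Hello, World!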

What the reader takes away from the example is how to call the compiler to produce a build. However, as they progress in learning the language, it soon becomes clear that real programs need multiple source files, and that compiling these manually is tedious and error-prone. The solution is either a script that compiles the modules or a dedicated build tool.
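
Before reaching for a build tool, many projects start with something like the following shell script, a sketch that assumes hypothetical source files main.c and util.c:

    #!/bin/sh
    # build.sh -- naive build script: recompiles everything, every time
    set -e                    # stop at the first error
    cc -Wall -c main.c
    cc -Wall -c util.c
    cc -o app main.o util.o

The script works, but it rebuilds every file on every run and knows nothing about which files actually changed. That is the gap build tools were designed to fill.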

The Unix operating system (and its derivatives and clones, such as FreeBSD and Linux) includes a tool called Make, which automates builds based on a configuration file (a Makefile) describing which source files are needed by which components of the build. Thanks to its inclusion in Unix and the Unix-like operating systems, Make has become fairly universal. There are different versions, including one from Microsoft for use on its Windows operating system. However, Make isn't the only build tool; the Apache Ant build system, for example, is very popular for Java.
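
A minimal Makefile for the same hypothetical two-file program might look like this (note that the indented recipe lines must begin with a tab character):

    CC     = cc
    CFLAGS = -Wall

    app: main.o util.o
            $(CC) -o app main.o util.o

    main.o: main.c util.h
            $(CC) $(CFLAGS) -c main.c

    util.o: util.c util.h
            $(CC) $(CFLAGS) -c util.c

Running make rebuilds only the targets whose prerequisites have changed since the last build.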

The problem with tools like Make, when used in their default configuration, is that compilation is performed sequentially: one file is compiled, then the next, and so on. When there are thousands of files with millions of lines of code, this can be very slow. With the advent of cheap multi-core processors it became practical to compile source files in parallel. Make offers a flag (-j, for jobs) which tells it to run multiple compiles at the same time. On a 32-core machine with solid-state disks (SSDs), parallel building can cut the compile time of the Linux kernel from hours to under a minute.
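
The flag takes the number of jobs to run at once; a common rule of thumb is to match the machine's core count (the nproc command, where available, reports it):

    $ make -j8              # run up to eight compile jobs in parallel
    $ make -j"$(nproc)"     # size the job count to the number of cores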

However, the difficulty with parallel building is resolving dependencies. If the Makefile doesn't precisely define the dependencies of each module, either build times remain high (heavy cross-dependencies force the build to run largely sequentially rather than in parallel) or the build breaks because modules are compiled and linked in the wrong order.
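
A classic example, sketched here with hypothetical file names, is a generated header that a parallel build can race against:

    # parser.h is produced by a code generator
    parser.h: parser.y
            yacc -d parser.y && mv y.tab.h parser.h

    # BUG: main.o really depends on parser.h, but the Makefile omits it
    main.o: main.c
            $(CC) $(CFLAGS) -c main.c

    # A serial build may succeed by luck; "make -j" can start compiling
    # main.c before parser.h exists. The fix is to declare the dependency:
    #     main.o: main.c parser.h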

Tools like ElectricAccelerator can analyze a build system and create a dependency map, for example by monitoring every file access to detect exactly when each file is used by the build process. Such a map helps ensure that the build is consistent and doesn't break due to out-of-order compiles.

ElectricAccelerator also uses caching technology to reuse the output of previous compilations and so avoid unnecessary recompiles. A separate tool, ccache, can be used together with Make to perform a similar function.
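
ccache sits in front of the compiler and returns a cached object file when it has already compiled the same source before. As a sketch (package names and paths vary by system):

    $ make CC="ccache cc"          # prefix the compiler with ccache
    $ ccache -s                    # show cache hit/miss statistics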

Conclusion

On large projects, build times significantly influence productivity. More sophisticated builds that use parallelism and caching complete more quickly, allowing downstream activities (such as testing) to proceed without hold-ups.
