Tuesday, October 28, 2008

Headless build problems with Grouped Configurations

PDE/Build has a property named groupConfigurations. If you set this property, the build will group all the configurations (e.g. win32,win32,x86 and gtk,linux,x86) into one archive instead of creating a separate archive per configuration.

Kim & I, while working to get the performance baselines going, found a couple of gotchas/bugs around the use of this property.

Unexpected archive format for the group

The first thing we noticed was that the group archive was being created using ant's zip task instead of using the native zip as expected. Our builder's build.properties specified the following:
configs = *,*,*
archivesFormat = *,*,*-zip
groupConfigurations = true
Had we read the documentation, we would have seen that archivesFormat is ignored for groups. However, this is not strictly true: we can instead set a format for the group directly:
configs=*,*,*
archivesFormat = group,group,group-zip
groupConfigurations = true
This results in the zip format as desired.

The directory group.group.group does not exist

We also ran into the following error:
/builds/src/assemble.org.eclipse.sdk.tests.group.group.group.xml:257: The directory
/builds/src/tmp/eclipse/group.group.group does not exist
This turns out to be a rather old bug. The problem is that when features gather the rootfiles they contribute, they copy them into platform-specific folders. In this case, building the *,*,* configuration, the rootfiles were copied into tmp/ANY.ANY.ANY. However, because of the way the grouped-configurations feature is implemented, the assembly scripts end up looking for the rootfiles in the group.group.group folder.

There are a few workarounds for this problem. One is to add a mkdir to the assemble.&lt;feature-id&gt;.group.group.group target in your customTargets.xml. This avoids the error, but you don't get the rootfiles in your group archive.
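As a sketch (untested; the feature id org.example.feature is made up, and the tmp/eclipse path assumes the default assembly layout), the workaround target might look like this:

```xml
<!-- customTargets.xml: mkdir workaround; org.example.feature is a
     hypothetical feature id, substitute your own -->
<target name="assemble.org.example.feature.group.group.group">
   <!-- create the folder the assembly script expects, avoiding the error -->
   <mkdir dir="${buildDirectory}/tmp/eclipse/group.group.group"/>
   <ant antfile="${assembleScriptName}" dir="${buildDirectory}"/>
</target>
```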

If you are using 3.4, then you can use the pre.archive target in customAssembly.xml to collect all the rootfiles into the correct folder:
   <target name="pre.archive">
      <!-- for each config being built -->
      <move file="${eclipse.base}/ANY.ANY.ANY/${collectingFolder}"
            todir="${rootFolder}" failonerror="false"/>
      <move file="${eclipse.base}/win32.win32.x86/${collectingFolder}"
            todir="${rootFolder}" failonerror="false"/>
      <move file="${eclipse.base}/linux.gtk.x86/${collectingFolder}"
            todir="${rootFolder}" failonerror="false"/>
   </target>

The ${rootFolder} property is defined by the caller of the pre.archive target, and in this case will be ${eclipse.base}/group.group.group/${collectingFolder}.

As usual, I have not actually tried running the above Ant; the details may differ.

Wednesday, October 15, 2008

Sorting Bundles and Parallel Compilation in PDE/Build

In PDE/Build the compile order for bundles has always been based on the feature structure. Features are visited depth first, and for each feature the included bundles are sorted according to their dependencies. Dependencies outside the given feature are not considered, and must be included in a previously visited feature.

This can lead to some less than ideal feature structures as releng teams try to ensure that everything that a bundle depends on is included in a "deeper" feature.
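As a hypothetical illustration (the org.example.* ids are made up): if a bundle in the top-level feature depends on a bundle elsewhere, the old ordering forces that dependency into a nested, "deeper" feature that gets visited first:

```xml
<!-- org.example.top/feature.xml: org.example.ui depends on
     org.example.core, so org.example.core must live in the included
     org.example.base feature, which is visited before this one -->
<feature id="org.example.top" version="1.0.0">
   <includes id="org.example.base" version="1.0.0"/>
   <plugin id="org.example.ui" version="1.0.0"/>
</feature>
```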

This has been fixed for 3.5 M3. You can now define a property "flattenDependencies=true" in your build configuration build.properties file. This will result in bundles being sorted across feature boundaries.

Previously, bundles were compiled by delegation through the build.xml scripts of their containing features. When using the new flattenDependencies option, a new compilation xml script is generated in the build directory. This only affects compilation; other build stages (e.g. gather.bin.parts) are still delegated through the feature structure.

Parallel Compilation

With the above changes in compile order, it turns out to be a small step to get parallel compilation. Set both flattenDependencies and "parallelCompilation=true" in your build configuration. The compilation xml script will then group bundles using ant's parallel task, producing something like this:

<target name="main">
   <parallel threadsPerProcessor="3">
      <ant antfile="build.xml" dir="plugins/org.eclipse.swt" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.swt.win32.win32.x86" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.osgi" target="build.jars"/>
   </parallel>

   <parallel threadsPerProcessor="3">
      <ant antfile="build.xml" dir="plugins/org.eclipse.osgi.util" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.transforms.xslt" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.supplement" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.simpleconfigurator" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.p2.jarprocessor" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.launcher" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.launcher.win32.win32.x86" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.common" target="build.jars"/>
   </parallel>

   <parallel threadsPerProcessor="3">
      <ant antfile="build.xml" dir="plugins/org.eclipse.update.configurator" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.equinox.frameworkadmin" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.cvs" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.core.runtime.compatibility.auth" target="build.jars"/>
      <ant antfile="build.xml" dir="plugins/org.eclipse.core.jobs" target="build.jars"/>
   </parallel>
   ....
</target>

Each group depends only on bundles that appeared in a previous group. You can control the Ant threading attributes by setting parallelThreadCount and parallelThreadsPerProcessor.
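Putting it together, the builder's build.properties might contain something like the following (the thread values here are just examples, not recommendations):

```
# build configuration build.properties (sketch)
flattenDependencies=true
parallelCompilation=true
# optional: tune ant's parallel task
parallelThreadCount=4
parallelThreadsPerProcessor=3
```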

We tested this by using it to compile the Eclipse SDK. Compile time dropped from 6:53 to 4:54; while this is only a two minute savings, it is a 29% improvement.