How We Manage Development - Automated Builds
Filed under: #daxmusings #bizapps
It’s been a while, so it’s about time I continue the “How We Manage Development” series. If you missed them, I wrote about our architecture of TFS and dev machines and how we organize TFS projects. Although we continuously adjust our methods and some of the details in those posts have changed, the essence still holds, and I hope you’ll find those articles a good read.
In this post I will focus on our build process and some specific issues I’m sure anyone who has tried automated builds of AX has encountered at some point. To get started, I need to explain that we have two different build workflows. One is for test environments, where we need to update an existing environment. The other is for models that need to be deployed to customers. The difference: for a test environment you can’t remove the existing code, since that would result in data loss; the release build does remove existing code first, which is the equivalent of the “clean” action every build process should perform. Ultimately, the test build produces a model store export file (we don’t actually build in the test environment; we build somewhere else and then deploy the new model store), while the release build produces a model file.
Our current build setup has changed from what we used to have. We used to have a separate (dedicated) AOS on each customer’s development VM that was used just for builds (we still use this process for AX 2009 environments). Today, for AX 2012 environments, we just have two dedicated build machines: one for 2012 RTM and one for 2012 R2. Yes, we support clients on all sorts of update levels, but consider that for compiles the only thing that matters is the code. So we always build with the latest kernel version (for example, R2 CU7), but we use the exact application patch level for each customer (for example, R2 CU5 with some custom hotfixes). To support this, the first step of our build process is to restore the correct database for the version we need to build (more on this later). Now, since we are using the latest kernel version, exporting a model store or model file directly would give us a version mismatch. So we keep a repository of AXUTIL versions for the different kernel versions of AX, and the build uses the matching axutil version when it exports the model or model store. This sounds like a hack, and I guess it sort of is, but it has worked perfectly so far. If we ever run into a CU that is somehow not compatible, we’ll have to set up a different build server for that specific CU going forward. Again, we have all levels of CUs across our customers and so far we haven’t had any compatibility problems. And the nice thing is, we can compile our <CU7 clients using the newer CU7 axbuild process ;-)
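To give you an idea, the version selection is just a matter of picking the right folder. A minimal sketch, assuming a share with one axutil folder per kernel version (the share layout and version number here are made up for illustration):

```powershell
# Illustrative only: one axutil folder per kernel version on a share.
$kernelVersion = '6.2.1000.4051'   # the customer's kernel/patch level (example)
$axUtil = "\\buildshare\axutil\$kernelVersion\axutil.exe"
# Later steps invoke $axUtil for the model / model store export.
```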
So what are the steps in our build process? Considering we are using the same physical machine and the same AOS instance for multiple clients on multiple versions with multiple sets of code, we have some precautions and failsafes in place.
1. Set up the AOS to use the right database. We used to flip the AOS configuration to point to a different database; we’ve since changed this step to simply restore the database we need. This has the advantage that the database doesn’t need to exist on the build machine’s SQL Server already (meaning we can set up new build machines and they’ll just pick up databases from a shared drive to restore), and it saves space on the build machine’s local SQL Server since we keep overwriting the same database. It also means we don’t run into issues removing code first, because the database we restore is in working order; removing models prior to starting the build can sometimes cause issues with synchronize or other things.
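A minimal sketch of what that restore could look like in PowerShell. Database, path, and logical file names are examples, and the AOS must be stopped while this runs:

```powershell
# Restore the customer database from a shared drive over the local build DB.
Import-Module SqlServer   # or the older SQLPS module
$backup = '\\buildshare\databases\Customer_R2CU5.bak'   # example path
Invoke-Sqlcmd -ServerInstance 'localhost' -Query @"
RESTORE DATABASE [AX_Build] FROM DISK = N'$backup'
WITH REPLACE,
  MOVE 'AX_Data' TO 'D:\SQLData\AX_Build.mdf',
  MOVE 'AX_Log'  TO 'D:\SQLLogs\AX_Build.ldf';
"@
```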
2. Do a “cleansing” of the solution. Since the previous build may have been for a totally different application version or code base, we don’t want to deal with any remnants. So we delete the XPPIL artifacts, VSAssemblies, and appl files such as labels. On the client side, we clean the VSAssemblies folder for the build user and delete the AUC files.
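Something along these lines, where all paths are examples and depend on the AOS instance name and build user (not our exact script):

```powershell
# Server side: remove XppIL artifacts, deployed VS assemblies and label files.
$serverBin = 'C:\Program Files\Microsoft Dynamics AX\60\Server\BuildAOS\bin'
Remove-Item "$serverBin\XppIL\*"        -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "$serverBin\VSAssemblies\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "$serverBin\Application\Appl\Standard\ax*.al?" -Force -ErrorAction SilentlyContinue

# Client side: the build user's VS assemblies and the AUC caches.
Remove-Item "$env:LOCALAPPDATA\Microsoft\Dynamics Ax\VSAssemblies*\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "$env:LOCALAPPDATA\*.auc" -Force -ErrorAction SilentlyContinue
```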
3. Combine all the XPOs from source into one big XPO. You can use the standard Microsoft tool provided on InformationSource, write your own, or use our simple open-source one. The Microsoft one probably has some benefits, but we still use our own simple version and it works great.
Now, here’s where we need some explanation of a model store build versus a model build. In the model store scenario we are not going to uninstall the existing code. But if we import the XPO with all the code, do we really want it to delete “sub-elements”? What if code from another model (in the same layer) adds a method to the same class, for example? If we imported the XPO for our model, it would delete that method. We need the import to be model-aware, and XPOs just aren’t. To work around that, we use a temporary database: create our model, import the XPO, extract the model, and then import that model into the actual model store. The model import takes care of deleting sub-elements, which is cleaner and handles the model specifics. To save time we don’t compile or synchronize the temporary database; we’re just happy if the code is there.
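In axutil terms, the detour looks roughly like this; model, layer, and database names are made up, and this assumes the matching axutil.exe is on the path:

```powershell
# Create an empty model in the temporary model store...
axutil create /model:"CustomerModel" /layer:cus /db:AX_TempBuild_model
# ...import the combined XPO into it via the client (step 8 below)...
# ...then move the result over to the real model store as a model file:
axutil export /model:"CustomerModel" /file:C:\Builds\CustomerModel.axmodel /db:AX_TempBuild_model
axutil import /file:C:\Builds\CustomerModel.axmodel /conflict:overwrite /db:AX_Build_model
```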
4. Uninstall all models from all custom layers. Now that we restore the database from scratch, we could skip this step; I guess we still have it in there :-) It doesn’t add much overhead, except that the next step (5) should be done whenever we do this one. In our temporary database, or in the actual database if we’re just building a model, we clean out all models in the custom layers (from ISV all the way up to USP). This is technically also part of cleansing the solution, as it makes sure there are no weird artifacts remaining from a previous build that would skew the compiler results. If we have dependencies on other models, we reinstall those later. In some cases there may be several ISV models that can’t be installed cleanly together without merging. For that, we have an option to exclude certain layers from the cleanup, so we can create a base database containing those ISV products, restore it, and not remove the products. These should be exceptions, since we want to start the build as close to standard AX as possible. Again, if we restore the database we can assume there’s nothing in it that needs to be removed…
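A sketch of that cleanup loop, assuming axutil.exe is on the path; the layer list, exclusions, and database name are illustrative:

```powershell
# Delete every model in the custom layers, except layers explicitly excluded
# (e.g. an ISV layer that ships pre-installed in the database backup).
$layers   = 'isv','isp','var','vap','cus','cup','usr','usp'
$excluded = @('isv')
foreach ($layer in $layers) {
    if ($excluded -notcontains $layer) {
        axutil delete /layer:$layer /db:AX_Build_model
    }
}
```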
5. Start the AOS and synchronize. Since (or rather, if) we removed code, we want to synchronize before we continue. If we don’t synchronize before re-importing the code, the re-imported elements will get new IDs while the database still holds the old artifacts and IDs, resulting in synchronization errors later on. For a temporary database import (as explained above) we have an option to skip this step, since we don’t care about synchronizing there and it saves a bit of time.
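Both the service start and the synchronize can be scripted; we do the synchronize through an autorun file so we get a log. Service name and paths are examples, and the element names should be double-checked against your client’s AxaptaAutoRun.xsd:

```powershell
Start-Service 'AOS60$01'   # the AOS service name varies per instance

# Write an AxaptaAutoRun file that only synchronizes, then run the client with it.
@"
<AxaptaAutoRun exitWhenDone="true" logFile="C:\Builds\Logs\sync.log">
  <Synchronize syncDb="true" />
</AxaptaAutoRun>
"@ | Set-Content C:\Builds\sync.xml

$ax32 = 'C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin\Ax32.exe'
& $ax32 "-startupcmd=autorun_C:\Builds\sync.xml"
```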
6. Stop the AOS, deploy references, start the AOS. For temporary databases we skip this step and perform it later, on the actual model store database. Code often depends on external assembly DLLs (references). Since we use the same machine to build all sorts of environments on different versions, we shouldn’t (and can’t) actually install the software or DLLs in the right places. But we need them to compile the code, and the compile runs on the client under a specific user, so we copy all needed DLLs into the VSAssemblies folder for the build user. We store the correct DLLs, in the correct versions, with the project’s code in the source control repository. That makes sense anyway (you should version your dependent binaries as well), and it’s how any build machine can get to them. Our code may also depend on third-party models. Since we deleted all models, we have to re-import them, unless they are pre-installed in the database backup and the build is set to skip cleaning that layer. So, same as the DLLs, we keep dependent models in the source control tree; they get pulled onto the build machine and we install them into the model store we’re about to build.
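Deploying the references then boils down to a copy from the workspace into the build user’s client folder; all paths and file names below are examples:

```powershell
# Copy the versioned reference DLLs from source control into the build user's
# VSAssemblies folder so the client-side X++ compile can resolve them.
$vsAssemblies = "$env:LOCALAPPDATA\Microsoft\Dynamics Ax\VSAssemblies"
New-Item -ItemType Directory -Path $vsAssemblies -Force | Out-Null
Copy-Item 'C:\Builds\src\References\*.dll' $vsAssemblies -Force

# Dependent third-party models come from the same source tree:
axutil import /file:C:\Builds\src\Models\SomeISVProduct.axmodel /db:AX_Build_model
```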
7. Import labels. We import label files using the client executable and -StartupCmd=aldimport_filename, though I believe there are ways to do it with autorun as well. Now, a lot of people (including ourselves) have had numerous problems getting the AOS to pick up new labels or create new ones: labels don’t show up, or old ones do but new ones don’t, and so on. Sometimes they do show up, but an export of the model doesn’t contain them. So, here’s the scoop on that: 1) make sure you have deleted the old label files from the server’s appl folders (see the cleansing step), and 2) (super-secret trick) after importing the labels, use an autorun XML file to call Label::flush, to make sure the client/AOS flushes the labels down into the model store so the export works.
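In script form it could look like this; the label file name and paths are made up, and $ax32 is the client executable as in the earlier snippet:

```powershell
$ax32 = 'C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin\Ax32.exe'

# Import each label file through the client...
& $ax32 "-startupcmd=aldimport_C:\Builds\src\Labels\axABCen-us.ald"

# ...then force the labels into the model store with the Label::flush trick.
@"
<AxaptaAutoRun exitWhenDone="true" logFile="C:\Builds\Logs\labelflush.log">
  <Run type="class" name="Label" method="flush" />
</AxaptaAutoRun>
"@ | Set-Content C:\Builds\labelflush.xml
& $ax32 "-startupcmd=autorun_C:\Builds\labelflush.xml"
```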
8. Import the combined XPO file. The combined XPO doesn’t contain the VS projects; we deal with those separately. To import the XPO we use autorun. We used to use the client’s import-XPO startup command, but autorun has some advantages (including logging) and seems more stable.
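The autorun file for the import is tiny, and the log file is what gives us error reporting. Paths are examples:

```powershell
$ax32 = 'C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin\Ax32.exe'
@"
<AxaptaAutoRun exitWhenDone="true" logFile="C:\Builds\Logs\xpoimport.log">
  <XpoImport file="C:\Builds\Combined.xpo" />
</AxaptaAutoRun>
"@ | Set-Content C:\Builds\xpoimport.xml
& $ax32 "-startupcmd=autorun_C:\Builds\xpoimport.xml"
```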
9. Import VS projects. Technically you can convert VS project files into an XPO and import that. I don’t believe the standard Combine XPOs tool does this, and we have had unreliable results importing VS project XPOs. So instead we use autorun to call SysTreeNodeVSProject::importProject. Now, Microsoft just told me about another trick: using the msbuild process to call “Add to AOT” on the project, as you would manually from VS. I still have to figure out how to do this, and it would probably solve a few remaining issues with importing projects. But for any normal VS project, the static call to importProject works great, and that’s what we currently use successfully.
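The autorun call looks something like this; the project path is an example, and you should verify the method’s exact signature and the parameters syntax on your version:

```powershell
$ax32 = 'C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin\Ax32.exe'
@"
<AxaptaAutoRun exitWhenDone="true" logFile="C:\Builds\Logs\vsimport.log">
  <Run type="class" name="SysTreeNodeVSProject" method="importProject"
       parameters="'C:\Builds\src\VSProjects\MyReports\MyReports.dynamicsproj'" />
</AxaptaAutoRun>
"@ | Set-Content C:\Builds\vsimport.xml
& $ax32 "-startupcmd=autorun_C:\Builds\vsimport.xml"
```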
If we’re using a temporary database to create a model that updates a model store, this is where we stop: we export the model as-is, without compiling, then switch to the actual model store and import the model we just exported. Note that this also correctly updates the version number of the existing model.
10. Compile. At this point both build types are back in sync. We run the X++ compile, generate CIL, and run a database sync. We just added the option to compile “traditionally” using the client, or with the multi-threaded axbuild utility from CU7. If any of the steps (compile, CIL) log errors in their log files, we fail the build. If the sync fails somehow, we generate an error in the TFS build log, which results in a “partially succeeded” build.
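The axbuild variant boils down to a single call; instance number, worker count, and paths are examples (note that axbuild only does the X++ compile, CIL generation is still a separate step):

```powershell
# CU7+ multi-threaded compile, run on the server instead of through the client.
& 'C:\Program Files\Microsoft Dynamics AX\60\Server\BuildAOS\bin\AxBuild.exe' `
  xppcompileall /s=01 /workers=8 `
  /altbin='C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin'

# The "traditional" route through the client:
# & $ax32 '-startupcmd=compileall'   # X++ compile
# & $ax32 '-startupcmd=compileil'    # generate CIL
```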
11. Extract the code. Now we can export either the model store or the model file. Note that you never want to export the model store from a build that first cleaned out all the code, since all tables will have new IDs. That’s the exact reason why we have two distinct build workflows.
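And the export itself, using the version-matched axutil from the beginning of the post; file and database names are examples:

```powershell
$axUtil = "\\buildshare\axutil\6.2.1000.4051\axutil.exe"   # matched to the customer's kernel

# Test workflow: ship the whole model store to update an existing environment.
& $axUtil exportstore /file:C:\Builds\drop\Customer.axmodelstore /db:AX_Build_model

# Release workflow: ship just the model.
& $axUtil export /model:"CustomerModel" /file:C:\Builds\drop\CustomerModel.axmodel /db:AX_Build_model
```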
Obviously we use TFS, but it should be clear that these steps can just as well be incorporated into PowerShell scripts and run manually, without TFS Build. All of the code we use for this lives in a class library, which has both a TFS workflow activities front-end and a PowerShell front-end. We are close to finalizing the CU7 axbuild pieces, and then we can do a major release of our utilities. But you can already get the code from our source repository on CodePlex.
Wow, what a wall of text. I hope it makes sense to someone :-) And for your reference: with the CU7 optimization, this whole process (database operations, import code, compile X++, generate CIL, synchronize, export model) runs in less than 40 minutes.