Advice for the porting engineer


This page lists good advice to follow, as well as traps and pitfalls encountered when adding support for new platforms. Unfortunately, it is still easy to abuse the system under deadline pressure (for example, hard coding directories of the porting engineer's machine). Following this advice may take a bit longer at first, but it saves you time later on.

Check this page for an overview of the porting effort.

Back to Porting engineer
Back to SBuild manual home

Table of contents
A story of 3 environments

While working to port software using SBuild, there are three items that are all called "environment" for short, and that creates confusion. This source of confusion is actually inherited from SCons (none of the three is an SBuild-specific concept). Here is the list of the three, how they appear in code, and what the relationship between them is.

  • The SCons Construction Environment: this is a Python entity, an instance of the class Environment() provided by the SCons code. It is not a shell environment. For all practical purposes, it is a dictionary (also called a map or a hash in other programming languages), and a rather large one. The keys of this dictionary are documented in the SCons documentation. It is usually encountered in code as env (but that is a very weak convention, since it is just the name of an internal Python variable). The Construction Environment is pretty much the same whatever your build machine.
  • The local shell environment: this is a shell environment, precisely the environment of the shell where you start a build script. It differs from build machine to build machine, from user to user on the same machine, and even from one text terminal to another for the same user on the same machine. It is seen in code as os.environ, which is how the Python interpreter lets you access the shell environment in which the Python process was started.
  • The run-time shell environment: this is also a shell environment, precisely the environment in effect when an external tool is fired by SCons through a shell in a separate process (in SCons parlance, when a process is spawned). It is encountered in code as env["ENV"], meaning it is itself one entry of the SCons Construction Environment, the entry named ENV. Several important observations:
    • The run-time shell environment is different from the local shell environment. Usually the former is much smaller than the latter (it has fewer entries).
    • The run-time shell environment is built by SCons from scratch at every build. The purpose (and the end effect) is better reproducibility of the build across users, across build machines, etc.
    • Anything in os.environ that you want available in env["ENV"] has to be explicitly propagated: you have to provide code for that to happen. Despite a common expectation, it is not automatic, and it is good that it is not automatic.
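The explicit propagation just described can be sketched like this (a minimal sketch; propagate and the variable names are illustrative, not SBuild or SCons API, and plain dictionaries stand in for os.environ and env["ENV"]):

```python
# A minimal sketch of explicit propagation (illustrative, not SBuild API):
# copy only a named subset of the local shell environment into the
# run-time shell environment, failing loudly if something expected is missing.
def propagate(source_environ, runtime_env, names):
    for name in names:
        if name not in source_environ:
            raise KeyError("expected shell variable %r is not set" % name)
        runtime_env[name] = source_environ[name]

# Plain dictionaries standing in for os.environ and env["ENV"]:
local_shell = {"PATH": "/usr/bin", "LICENSE_SERVER": "lic01:7788", "EDITOR": "vi"}
runtime_shell = {}
propagate(local_shell, runtime_shell, ["PATH", "LICENSE_SERVER"])
# runtime_shell now holds only the two variables the build really needs;
# EDITOR stays behind, which is exactly the point of explicit propagation.
```

The deliberate KeyError is part of the point: a variable that the build needs but that is missing should stop the build with a clear message, not silently produce a different binary.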

One word about this frequent expectation. It goes like this: "Hey, I changed the shell variables and my build doesn't see them. Look, my older make-based build works just fine; your build scripts are broken." In fact, that is a culture clash, a problem of unlearning the old and sloppy way and learning the new and controlled way. Their expectation is broken, not the build tool, but few are ready to admit it. Many tool chain providers ship a shell script to set the shell environment "right" (Unix style), and some manipulate the default shell environment directly from the installer (MS Windows style). SCons forces the porting engineer to understand what those changes are, which shell variables matter, and in what way, for the executables he runs to perform a build. He will then code those variables in the SCons Tools in ways that are more or less robust with respect to a change of build machine (see also the page on SCons Tool writing).

One more word about debugging: you can print out, from your SCons Tools, parts or the entirety of those environments. Although the initialization of other SCons Tools may run after your print statement and change them, you still get a simple way to check whether an intuition you had actually holds. You can of course also use a Python debugger to inspect variable contents, but many C/C++ porting engineers are not familiar with the Python debuggers available.
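As an illustration of that kind of print-based debugging, here is a small helper (a sketch only; dump_selected is not an SBuild or SCons facility, and a plain dictionary stands in for the real environments):

```python
# A sketch (not SBuild or SCons API): dump selected keys of any of the three
# environments so you can verify your intuition about what is actually set.
def dump_selected(env_dict, keys, title):
    lines = ["-- %s --" % title]
    for key in sorted(keys):
        lines.append("%s = %r" % (key, env_dict.get(key, "<not set>")))
    return "\n".join(lines)

# Typical use inside a SCons Tool would pass os.environ (the local shell
# environment) or env["ENV"] (the run-time one); here a plain dict stands in:
report = dump_selected({"PATH": "/usr/bin"}, ["PATH", "INCLUDE"], "local shell env")
# report now reads:
# -- local shell env --
# INCLUDE = '<not set>'
# PATH = '/usr/bin'
```

Printing only the keys you care about keeps the output readable; the full environments are large.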

A last word for the quick hare in the night before the deadline: it is possible to copy os.environ into env["ENV"] inside the SCons Tool. It is certainly not recommended, but it is possible. It looks like a shortcut because it "saves" the effort of understanding. Once such a shortcut is in, it usually doesn't pollute other builds immediately (it may later on, if your setup is cloned to support other platforms).

Back to top

Portability of build scripts: to avoid

Just put settings wherever

The most frequent pain of porting C/C++ software is the quick and dirty shortcuts adopted by the previous engineer. That is true for the code and it is true for the build description. It is not a problem of tooling per se (although the make tool's "global everything" approach greatly contributed to spreading the mess); it is a problem of software development process. Many porting efforts are one-off efforts: no follow-up, no contribution back, actually not a lot of care in anything, including the build description. So we shouldn't be surprised to see porting engineers "stop at the first thing that works" and then religiously defend that part against any change, even if the entire world has to bend to match their part "that works".

Let's take an example. The C/C++ preprocessor takes open defines on the command line (the symbols used in #ifdef and the like). They carry information for the product you build: some of it is build-site specific, some is build-machine specific, most of it is product specific, etc. Stopping at the very first thing that works means, in this case, finding a place where you can put an open define and adding yours there, without any concern that you may have just added a build-machine-wide setting to a list of product-specific settings. You may say, "Look, it arrives on the command line in the end, and the product build is correct".

Ironically, the high price for this spoiled build description is paid by porting engineers themselves. For example, one starts by assuming that his target platform is supported and that he only needs to build the new product, only to notice that the target platform support is polluted with settings specific to another product he doesn't care about. Or he notices that the rls build of the product is fatally wrong just because some essential settings were mistakenly placed where settings for the dbg build, and only the dbg build, were expected.

So, out of respect for your colleagues, don't throw in settings just anywhere. The C/C++ development environment is such that things look like they are "working OK" even if you did a very lousy job. Your purpose is not merely to get options onto the command line of the executables somehow, but to add value to the product that you port. That may help somebody else, and it may turn out to help you, months or years down the road. Remember: in software engineering, shortcuts always take longer in the end.

Back to top

Forget the Platform SDK separation

C/C++ software, ported or not, gets linked with some low-level precompiled code (in the vast majority of products). This code is usually provided by the tool chain provider. It is frequently called "the system libraries", which is vague and confusing. More precisely, we distinguish two parts in this code:

  • the C or the C++ standard library, as well as proprietary replacements or extensions. That includes C/C++ headers and compiled code.
  • the Operating System code (when there is an OS). That includes C/C++ headers and compiled code.

The important thing to note is that this code, although it lives somewhere on your build machine, is an integral part of the target platform. It is 100% specific to the machine you build for and only very remotely specific to the machine you build on (mainly in file locations).

SBuild collectively names this code the Target Platform Software Development Kit (Platform SDK for short). Precisely, it is the set of C/C++ headers and the set of development-time precompiled libraries that you have to use in order to build a complete executable for some target platform.

The most frequent error someone adding support for a new platform can make, ruining the portability of the build descriptions, is to tie the Platform SDK to the tool chain used to build. That is very tempting because, most of the time, the tool chain and the Platform SDK are both part of one and the same installation package. This locking together of the tool chain and the Platform SDK doesn't hurt (so it passes unnoticed) until the first cross compilation or until the first competing tool chain. By then it is usually too late (meaning that making progress from that point involves some painful reimplementation of the platform support in the build tool, whatever the build tool). SBuild provides you the framework to separate the two from day one, with its tgtplatform and toolchain variants, although the SCons layer below doesn't really help with the separation.

"How can I separate?" you may ask. "I have one installer that puts everything in one place." Let's say that place is /usr/localbin/greattoolchain or c:\Great Tool Chain (with spaces in the name). We'll just call that <InstallDir>. Let's now assume the following:

  • The executables of the tool chain are in a directory <InstallDir>/bin
  • The C/C++ headers of the Platform SDK are in a directory <InstallDir>/includes
  • The precompiled code of the Platform SDK is in a directory <InstallDir>/arm11/2.14

Given that, separation simply means building those paths in your SCons Tool from several parts. For example, the SCons Tool will somehow obtain <InstallDir>, put it in a variable root_install_dir, and then form the path to the libraries to link with as root_install_dir + os.sep + platform_lib_dir, where platform_lib_dir may be arm11/2.14 but also something else, according to the value you chose for tgtplatform (on the command line or otherwise).
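The composition can be sketched like this (the directory names and the platform_lib_subdirs table are illustrative, not a real tool chain layout or SBuild API):

```python
import os

# A sketch of keeping the tool chain and the Platform SDK separable: the
# install root is obtained once (detected or configured), while the
# per-platform library subdirectory is chosen from the tgtplatform value.
platform_lib_subdirs = {
    "arm11": os.path.join("arm11", "2.14"),
    "arm9":  os.path.join("arm9", "2.14"),
}

def platform_lib_path(root_install_dir, tgtplatform):
    try:
        subdir = platform_lib_subdirs[tgtplatform]
    except KeyError:
        raise ValueError("no Platform SDK layout known for %r" % tgtplatform)
    return os.path.join(root_install_dir, subdir)
```

On a Unix build machine, platform_lib_path("/usr/local/greattoolchain", "arm11") yields /usr/local/greattoolchain/arm11/2.14. Changing root_install_dir (a different build machine, a competing tool chain) leaves the Platform SDK choice untouched, which is the whole point of the separation.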

As a side note, for many developers the confusion started back in school, while learning programming with some version of Microsoft Visual Studio. Nowadays, Microsoft is trying to shake off that habit, forced by the multiplication of the target platforms their developers face: .Net platforms (newer and older), several desktop Windows OSes, many Windows CE/Mobile target platforms, etc. That is why they now ship the development environment and the various Platform SDKs as separate installers.

Back to top

Invent new variants redundant to tgtplatform and toolchain

For any non-trivial software product, it is practically impossible not to have some adaptation to the target platform or the build platform somewhere in the code or in the build description. In fact, many code bases already have their own set of settings (usually open defines) to model the different platforms (or groups of platforms) the software is supposed to run on. The temptation is therefore strong to create new SBuild variants to carry that information.

You will be better off refraining from that. Use the ***_per_platform target attributes and, when those are not enough, use an "if" construct at the end of the target specification, testing existing variants. For example:

lextree_modification_exe = sb.T(
   name = 'lextree_modification',
   type = 'exec',
   desc = 'Sample showing modification',
   ccpp_defines = [ 'acmod_size_800' ],
)
if sb.vars.toolchain.get()[0:5] == 'msevc':
   lextree_modification_exe.linkflags_per_platform = {
      sb.vars.tgtplatform.get(): ['/entry:mainWCRTStartup'],
   }

Back to top

Portability of build scripts: to do
  • Assert paths: ideally, your SCons Tools will not contain any hard coded paths, meaning all paths are detected or computed somehow (for example, from some OS registry). In practice, it is impossible to completely avoid hard coded paths (sometimes it is even desirable to fix a path, see also the paragraph on automatic detection). In any case, the golden rule is to check the path: assert that it exists on disk and give a meaningful message about what was expected and why.
  • Print out what you detect. The more complex the code that sets up the tool chain, the more important it becomes to print out the information: which executables are used and from what location, which run-time shell environment entries are prepared and what their purpose is. This common sense rule is restated here because it is the opposite of what you'll typically see in the SCons Tools shipped with the SCons distribution.
  • Keep a list of tgtplatforms supported by your product. You may check, for example in, what tgtplatform is currently used and abort the build if it is not in a list of known-to-work platforms. If and only if a developer is in the process of porting the product to a new platform, he/she is allowed to extend that check.
  • Use direct args for experimental builds. SBuild provides a way to pass arguments directly from the command line of the build script to the command line of several executables used in the build: the direct_args_*** set of keyword arguments to SBuild scripts. Use them to try things out without touching any file. Do not introduce new values of bldopt or tgtplatform just because you wanted to try out the effect of one compiler option.
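The first and third rules above can be sketched as follows (function names and the platform list are illustrative, not SBuild API):

```python
import os

# A sketch of two of the rules above: check a detected or configured path
# before using it, and abort early when tgtplatform is not in the product's
# known-to-work list. The platform names are made up for the example.
KNOWN_TGTPLATFORMS = ["arm11", "sparc_solaris", "win32_x86"]

def checked_dir(path, purpose):
    # assert with a meaningful message: what was expected and why
    assert os.path.isdir(path), (
        "expected directory %r (%s) does not exist; "
        "check your tool chain installation" % (path, purpose))
    return path

def check_tgtplatform(tgtplatform):
    assert tgtplatform in KNOWN_TGTPLATFORMS, (
        "tgtplatform %r is not known to work for this product; extend "
        "KNOWN_TGTPLATFORMS only if you are actually porting" % tgtplatform)
```

A failed assertion with a message like these costs the next engineer minutes; a silent wrong path or unsupported platform can cost days.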

Back to top

Advanced tricks

Changing the C/C++ variants in the SBuild project

In many situations you may have settings that are not target specific, yet not generic enough to be placed in the C/C++ toolkit. The scope of these settings may be vaguely described as "all the C/C++ builds that I care about, but certainly not all the C/C++ builds in the world". Examples:

  • Your own convention for what is a debug versus a release build
  • Site specific settings, like a network shared location for C/C++ compilers

SBuild's answer to these situations is the modification of the variants provided by the C/C++ SBuild toolkit in the SBuild project of one product (a.k.a. in the SBuild package). As a reminder, these variants are tgtplatform, toolchain and bldopt.

So, where do you code such settings? Easy: the Python files that are shared by all build scripts in a project are and (also , but that one is "reserved", not supposed to be changed by you).

How do you code those settings? By accessing and changing attributes of the variants. The variants are Python objects that you can get in your code as sb.vars.tgtplatform, sb.vars.toolchain and sb.vars.bldopt respectively.

Here is how to add some C/C++ preprocessor defines of your own to all builds (in a given SBuild project).

#default defines per build option (rls/dbg)
default_ccpp_defines_for_dbg = ["MEM_DEBUG",]

bldopt_current_value = sb.vars.bldopt.get()
toolchain_current = sb.vars.toolchain.get()
if bldopt_current_value.startswith("dbg"): # note we use startswith("dbg") instead of == "dbg"
  sb.vars.bldopt.ccpp_defines_per_toolchain[bldopt_current_value][toolchain_current] += default_ccpp_defines_for_dbg
elif bldopt_current_value.startswith("rls"):
  assert 0, "Don't know what product specific defines for this value of bldopt: "+bldopt_current_value

The code above, taken from one file, lets you maintain one list, default_ccpp_defines_for_dbg, with the defines you want in all dbg builds. The code changes sb.vars.bldopt.ccpp_defines_per_toolchain (you may need some more code; just make sure you can use +=). You may easily add a default_ccpp_defines_for_rls, if you want.

Here is how you change the compilers used for Solaris builds:

# our Solaris GCC setup
sb.vars.tgtplatform.scons_constr_var_to_set['sparc_solaris'].extend( [
   # ... entries that set CC and CXX to absolute compiler paths ...
] )
The code above, taken from one file, shows how to change the SCons Construction variables CC and CXX. Notice that absolute paths are used, in order to make sure that one and the same compiler is used on all build machines. Small detail: the change is not conditional (it happens even if you are not building for the sparc_solaris target platform).

If you wonder, you may indeed make such changes in or in , at your choice. This is because the C/C++ SBuild toolkit (tk_ccpp) is loaded in memory before these files of the SBuild project. If you want a reminder of the different purposes of and of , you may check this page.

Back to top

Providing a "main" to your SCons tool

In case your SCons Tool performs complex detection work, it makes sense to let it print out all the possibilities available on the build machine: for example, the content of the Symbian SDK configuration file, or all the MS compiler versions and Platform SDKs found in the MS Windows registry of this machine.

One way to do that is to provide a "main" section to your SCons Tool, so that running the SCons Tool by itself prints out that information. Here is an example for the file (the SCons Tool for the Microsoft C/C++ compiler for desktop Windows platforms).

At the top of the file:

if __name__ == '__main__':
    import os
    import sys
    sys.path.append( os.path.dirname(os.path.abspath(sys.argv[0]))+r"\..\.." )
    import SCons

At the bottom of the file:

if __name__ == '__main__':
    env = SCons.Environment.Environment()
    version = SCons.Tool.msvs.get_default_visualstudio_version(env)
    print 'Found version as default:',version
    include_path, lib_path, exe_path = _get_msvc7_default_paths(env, version, 1)
    print 'Default include path:',include_path
    print 'Default lib path:',lib_path
    print 'Default exe path:',exe_path

So now you can just run the file by itself on the command line and see for yourself what it finds on your build machine.

Back to top

Back to Porting engineer
Back to SBuild manual home