This page shows the basics of building an autopackage for your software. It's not a replacement for the Packagers Guide, which is the reference manual for autopackage; if you still have questions after reading this page, the Guide is the place to look.
The end result of this process is a .package file that you can place on your webserver and that users can download to install your software. If the package is something that can be a dependency of other software (for instance, a library), additional files such as .package.payload and .package.meta will also be generated.
If your software uses Qt/kdelibs, or just relies on large C++ libraries, then you must be careful. This is because of C++ Application Binary Interface (ABI) issues: GCC 3.4 broke the C++ ABI (again), so software compiled with GCC 3.4 can mysteriously crash on GCC 3.2/3.3 systems, and vice versa. Because of this, we cannot guarantee that your software will run on all systems. At the time of writing, most distributions still use GCC 3.2, but GCC 3.4 distributions are coming and GCC 3.2 distributions are not going to disappear any time soon.
It is possible to make autopackages of KDE/Qt based software, however it requires more work and some of the tools we have developed may need adjusting. If you want to do this, talk to us first as this is still rather unexplored territory.
If you are starting a new project and wish to use C++ for your GUI, GTKmm is an alternative that is easily amenable to static linking as it's a C++ wrapper around a C library. This wrapper/binding approach is the one taken on Windows and most apps ship the bindings alongside themselves. It's also the approach taken by the Inkscape packages, and it works fairly well.
If a dependency is highly unstable (i.e. breaks backwards compatibility often), isn't widely packaged, or is difficult to get hold of, consider importing it into your source tree and merging with upstream at regular intervals. Alternatively, the dependency can be statically linked into your software when the autopackage is built. Remember that dependencies are a powerful tool, but should not be used to excess: if they cause more pain for your users than they save, it's best to avoid them.
A major factor in your choice of dependencies should be whether they are widely packaged or not. If they are not, then statically linking is one option, or you can try to convince the maintainers to build autopackages themselves. That way your autopackage can depend on their autopackage, and the infrastructure will take care of installing the dependency (package) if it is missing.
It may be that your software can operate in the absence of some libraries, but works better if they are present. If so, make sure these features can be enabled/disabled not only at compile time (in configure scripts, for instance) but also at runtime. If the software is written in C/C++, use the dlopen() functionality to ensure your program can run in the absence of the dependency. Because dlopen/dlsym is a rather inconvenient system to use, we produced relaytool, which allows a "soft link" against a library with no code changes beyond a simple if (libfoo_is_present) { ... } conditional. There is no need to define lots of function pointers, it works for C++, and it supports exported variables. If dlopen/dlsym doesn't appeal to you because of the ugliness of the resulting code, relaytool is exactly what you need.
You can download these from the downloads page. If you're on an RPM system, feel free to grab the developer tools RPM; otherwise use the tarball: the content is the same.
First, create a directory in the root of your source tree called "autopackage". This is where the specfile will reside. Then, from the root of the source tree, run makeinstaller --mkspec > autopackage/default.apspec.in. This creates a new template specfile that you can easily tweak for your software. Because it's convenient to keep your software version number in your configure script, the template specfile uses a @VERSION@ macro that autoconf will expand, so remember to add "autopackage/default.apspec" to the output files at the bottom of your configure script!
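Putting those steps together, the setup might look like this (the project path is hypothetical, and AC_CONFIG_FILES is the usual autoconf way to register an output file):

```shell
cd ~/src/superpig                 # your source tree root (hypothetical path)
mkdir autopackage
makeinstaller --mkspec > autopackage/default.apspec.in

# In configure.ac, register the specfile so @VERSION@ gets expanded:
#   AC_CONFIG_FILES([Makefile ... autopackage/default.apspec])
```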
The [Meta] section contains information (metadata) about your software. This should hopefully be self-explanatory; if it isn't, we may need to put more comments in the template specfile, so let us know!
See the Packagers Guide for more information. These scripts are the meat of your package. The [BuildPrepare] and [BuildUnprepare] sections are run only by the "makeinstaller" program and are responsible for building (compiling) your software and for cleaning up the built files afterwards.
If your project uses autoconf/automake, then simply calling prepareBuild and unprepareBuild is usually enough: those functions handle all the details automatically for you. If you're using a different build system, those sections should contain whatever commands are necessary to build your software (e.g. "scons -Q" for SCons-based projects, or "make" for plain Makefile-based projects).
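For an autoconf/automake project, the two sections can therefore be as minimal as this sketch:

```shell
[BuildPrepare]
# Configures and builds the software, then runs 'make install'
# into the temporary build root ($build_root).
prepareBuild

[BuildUnprepare]
# Cleans up the built files again.
unprepareBuild
```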
Your software is now compiled, which is good, but autopackage needs to know which files you want to put into the package. That's what the [Import] section is for. You use the import command to bring files into your package. The import command is invoked like this: echo (filenames...) | import. import reads filenames from STDIN, one per line. The command also accepts wildcards, so you can also type:
echo '*' | import
The current working directory is the build root directory ($build_root). If you used prepareBuild, then all built files are automatically installed into this build root (via 'make install'), so the only thing you have to do is call echo '*' | import.
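So in the common prepareBuild case, the whole section is just this sketch (using the section name as given on this page):

```shell
[Import]
# Everything that 'make install' placed into $build_root:
echo '*' | import
```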
If you're not using prepareBuild, then there are two things you can do:
1. Install the built files into $build_root yourself. After that, you can simply call echo '*' | import.
2. Import the files directly from your source directory, $source_dir.
For example, suppose your project uses a plain make-based build system. Make compiles superpig.c to the binary superpig. Your source directory also contains wai.png, a data file used by superpig. You would write this in your specfile:
import <<EOF
$source_dir/superpig
$source_dir/wai.png
EOF
The prep script is run to prepare your package for installation. Any given software installation occurs in two stages: preparation and then installation. All the prepare scripts for the main package and any resolved dependencies are run together, then the install scripts are all run sequentially after that.
The prep script should perform any checks that affect whether your software can be installed or not. Typically this means checking for dependencies, but it doesn't have to be limited to that. To check for dependencies, use the require and recommend functions. These functions take a root name and try to satisfy the dependency. Require will abort the install if it can't find/install the requested software; recommend will not. See some of the example specs to see how this works.
The require and recommend functions use skeleton files to do the dep checking. We provide a library of them in the autopackage developer tools. If you write new ones, send them to us!
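As a sketch, a prep section might look like the following. The root names here are hypothetical illustrations; use the names defined by the skeleton files shipped with the developer tools:

```shell
[Prepare]
# Hard dependency: abort the install if it can't be found or installed.
# '@gtk.org/gtk' is a hypothetical root name, check the skeleton library.
require @gtk.org/gtk

# Soft dependency: the install continues even if this is missing.
recommend @gnome.org/gconf
```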
The install script is where the files in the payload are copied to the system. This is done by using the autopackage API, which abstracts you from the base system. Don't use the regular shell tools like "cp": instead use the autopackage replacements. These add extra functionality like logging (so uninstall is automatic), and may silently "fix" things as the install proceeds to help with distro compatibility. Here is a list of the install APIs. Different file types have different APIs. Make sure you use the right API for the file type. For instance, don't use "copyFiles" for a .desktop file, instead use "installDesktop". Here is an example install tree layout, and the equivalent install script for it:
File in payload                            Install command
bin/foobar                                 installExe bin/*
lib/libfoobar.so.0                         installLib lib/* (because it's a public library)
lib/baz/plugins/foobar-plugin.so           copyFile lib/baz/plugins/foobar-plugin.so $PREFIX/lib/baz/plugins/foobar-plugin.so
share/foobar/some-random-datafile.dat      installData share/foobar
share/applications/foobar.desktop          installDesktop "Accessories" share/applications/foobar.desktop
share/locale/en_GB/LC_MESSAGES/foobar.mo   installLocale share/locale
man/man3/foobar.3                          installMan 3 man/man3/foobar.3
info/foobar                                installInfo info/foobar
Common mistakes to avoid are: using a general API when a more specific one exists, copying the same files twice by accident, using installLib for shared libraries that are not actually meant for use by other programs (in which case it doesn't matter if the linker cache isn't updated), and forgetting to specify a manual category for installDesktop (which is needed for compatibility with pre-XDG versions of KDE).
Uninstall can almost always be left at the default.
Run the makeinstaller command in the source tree root. Some warnings might be displayed, but it should compile your program, run make install and finally generate a .package file. You may need to adjust the specfile at this stage. If the generated packages are (or could be) dependencies of other packages, use the -b option to generate payload/meta files as well. A repository XML file is also generated, which represents your local repository of the generated package. The XML file should be placed on a webserver, and its location should match the skeleton file for the library. Writing skeleton files needs some more documentation, but details about the repository XML file can be found in the Packagers Guide.
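On the command line, the final step might look like this sketch (the project name and output file name are illustrative):

```shell
cd ~/src/superpig    # source tree root (hypothetical project)
makeinstaller        # compiles, runs 'make install', emits a .package file

# If other packages may depend on this one (e.g. it's a library),
# use -b to also generate the .package.payload and .package.meta files:
makeinstaller -b
```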
Make sure your software is not installed already, then run the package. Did it install OK? Does
the software run? If so then put the package up for testing by a wider audience (as part of
your beta process, for instance). If it's all good release the package to the world. If
something breaks, try tweaking your specfile, or talking to us if more help is needed.
If you still have questions, please read the Packagers Guide, or feel free to contact us.