Beyond Brown

When brown just isn't enough

gcc 6.2 for Ataris brown edition - merry Christmas!

It’s the time of the season for love, caring and, most importantly, presents!

Brown your life

…so why did we come up with this lump of coal instead of something fun? :) Well, read on and you’ll find out, or for the tl;dr among you just skip to the bottom of the post for download links!

Reasons to use/upgrade to gcc 6.2

Here are a few we came up with:

  • link-time whole-program optimisation (inter-module calls & inlining): can have a big impact in reducing the executable’s size.
  • improved optimiser
  • C++14, C++17 support
  • named section support via C and asm
  • proper C++ initialise/finalise sequence
  • GST extended symbols
  • can preview code generation in Godbolt/68k
  • path open for future gcc releases

The problems (AKA why did Vincent stop at gcc 4.6.4?)

These days, the 10-15 people active on Atari 16/32 systems (and compatibles) who cross-build code for the machines use Vincent Rivière’s gcc 4.6.4 port, which is available on many popular platforms. Which is all fine and well, but it raised a couple of questions for us:

  1. Why stop at 4.6.4?
  2. Are there any new features in gcc 5.x and 6.x we could take advantage of?
  3. Do the later gcc versions produce better code?

After poking into things a bit, (1) was quickly answered: the magic of Atari support in gcc comes from binutils. This is a separate package that has all the base tools for creating executables and libraries. In order to get Atari support (i.e. TOS executables), someone had to either write a TOS .PRG creator and send it upstream to binutils, or abuse something existing. Some very clever people noticed similarities between the aout format and TOS’ header, so they patched binutils to generate TOS executables - and of course patched gcc so it recognises Atari as a target. Hooray! The only “snag” was that this patch set had to be ported across every gcc/binutils release.

It was all going swell until the binutils team decided that the aout format is deprecated and they removed it. Ooops!

So the only format supported by modern gcc versions is Linux’s ELF, which is vastly different to PRG. We had a couple of ideas on how to overcome this.

The first one was to compile the latest gcc and use the old patched binutils. We’d simply ask gcc to produce assembly listings instead of object files and feed them to the old binutils. While that sounded like a good idea, in practice it proved next to impossible to implement. The biggest hurdle was that the binutils assembler (gas) changed its syntax quite a bit (for example, register names are now required to be prefixed with % - what we internally called the %%%%%%%%issue), so all listings would have to be patched just to parse. Initially this didn’t seem too hard, but as more complex sources were tested the number of fixes required blew out of proportion. Coupled with the fact that gcc now produces more ELF-centred assembly (different sections from the traditional TEXT/DATA/BSS etc), we quickly stopped considering this path.

So it was time to roll up the sleeves and get to the real work…

The Big Brown Build (AKA Building gcc and libs)

The first bothersome task was automating the gcc build a bit. We built gcc a great many times and had too many false starts and failures trying to find the “proper” way to build it.

Fortunately we had a good head start from Vincent, since he provided a semi-automated script with some sensible options. The Linux From Scratch online book also provided some invaluable hints on how to configure gcc.

After an ungodly amount of time trying out all the seemingly interesting combinations of switches, we settled on a set of chants and incantations that could more or less satisfy our needs.

But compiling gcc is the easy part. A compiler is no good on its own without supporting libraries. gcc comes with its own set of libs (glibc, libstdc++-v3 etc) which seemed to compile OK. Then we needed an Atari-specific library, something that knows about TOS/MiNT system calls and can access them.

MiNTlib seemed the only mature and up-to-date library, so we went with that. Which proved quite a hurdle! Remember that bit above where we mentioned that gas syntax changed? Well, guess how much inline assembly in the old syntax style MiNTlib has! A large amount of time was spent patching the code and creating scripts to auto-patch it so we wouldn’t have to do it by hand each and every time. Another very confusing issue was that gcc changed the default compilation standard from plain old C (gnu89) to gnu11 - lots of “interesting” errors were output, and that also took a good chunk of time to figure out.

But even then it wasn’t done. During building, MiNTlib produces the libraries we need but also tries to compile some tools, i.e. actual executables. We didn’t bother with those yet, so they’re just built as ELF binaries. The biggest problem was that linking the binaries exploded really hard, with ld complaining about hundreds of symbols not found. Another gcc surprise! Somewhere along the way, in their infinite wisdom, they decided that symbol names in object files should no longer begin with an underscore (what we internally called the _____period in the project), as they had for tens of years. Now, try to link a library compiled with leading underscores (glibc) with one that doesn’t use them (MiNTlib)… After a while, rebuilding gcc from scratch stopped being novel ;).

Another MiNTlib peculiarity we observed was that it would build its 020+ libraries assuming that an FPU is installed, something that not all Falcons are equipped with (of course that’s not true for TTs). A new target was added to the build system, and every library set is hopefully now copied into the correct multilib folders.

Having fixed all the above, we continued with the build, only to grind almost immediately to a halt on libstdc++-v3. Setting aside the matter that it took a huge amount of time to even figure out what to build (the joys of multilib), it would crash and burn building the actual code because of some misplaced switches in the generated build files. Also, due to our reduced feature set (exceptions? What are those? :)) some bits exploded very hard and took a substantial amount of time (and alcohol) to trace and fix. Fortunately we didn’t need to make many changes, but it was really unexpected and confusing.

After that it was plain sailing for the rest of the libs (not that there were many left), so we finally had a complete gcc toolchain from start to finish.

…and it would still build ELF executables.

\o/ \o/ \o/


Converting ELF to TOS (AKA brownout)

Initially we were going to use a Python script written by Greasemonkey (see Acknowledgements) which also utilised a custom ld build script. This worked fine for simple cases, but we quickly ran into trouble with more complex (read: C++) projects, so we decided to roll our own.

Well, nearly our own. A quick search for ELF parsers introduced us to ELFIO, a C++ header library which can be easily included into a project. Bundled with the library were a few nice examples so we just grabbed one and started hacking around it. Another library called simpleopt was also utilised to provide easy command line argument parsing.

We intended to call the program elf2st, but that was really confusing since Greasemonkey’s tool is called stelf (imagine the fun we’d have with both tools installed and not remembering the exact name), so brownout it is.

In brief, the tool does the following:

  1. Reads the ELF file and parses all sections using ELFIO.
  2. Marks all sections that are applicable to TOS executables and sorts them depending on whether they are TEXT, DATA or BSS.
  3. Determines the program entry point.
  4. Marks all relocatable addresses in order to produce the TOS relocation table. Cross-section checks are also handled.
  5. Generates the GST symbol table ready for dumping. Extended symbol format is also supported. Local nameless labels are skipped. If enabled, C++ symbol names that contain tons of header characters are stripped.
  6. Dumps the TOS header.
  7. Dumps all TEXT and DATA sections.
  8. Dumps the symbol table.
  9. Sorts all relocations, omits duplicate entries and corrects relocations.
  10. Dumps the relocation table. In the case of no relocations, a longword of 0 is written instead.

Supporting different kinds of C/C++ projects using BrownElf GCC

The move to GCC6 and the move to ELF both introduced some fun ‘changes’ to key aspects of project linking and initialisation. Some of these are more complicated than others, but here’s a rundown of what to expect.

This is a head start only - actual source for the fussy cases will follow in an update release, probably after new year 2016-2017.

  • Can link plain C projects with or without MiNTlib, with minimal fuss. MiNTlib or other LIBC variants shouldn’t need any special sauce.

  • Bare-bones C projects require a bit of extra work.
    Pass -nostdlib -nostartfiles and start the link with crt0.o as the very first object. This implements __start, a call to _crtinit and pterm(), but not a lot more than this.
    You then need to implement your own basepage, Mshrink and stack setup via a _crtinit function in 68k, and then jsr _main with a valid or null argc/argv. You will also need to provide some extra stubs for things like _exit, __cxa_atexit and __dso_handle, most of which you won’t even need to fully implement, depending on how bare your bones are.
    You may require libgcc (-lgcc) to be added to the link line depending on what your code is doing - e.g. for integer multiplies, some softfloat support etc.
    An example of this material will be provided shortly.

  • Can link bare-bones C++ programs - but requires some extra support files in addition to what is described for the C case above. It also requires a more complex link sequence. The extras implement __libc_csu_init(), __cxa_atexit() and __cxa_finalize() functionality needed for the newer style static initialisation sequence. You can’t use C++ ctors/dtors or statics without it.
    The extra files are nearly ready but weren’t settled enough for this xmas release. These will follow shortly.

  • Can link C++ programs with MiNTlib (or another LIBC). It is however necessary to modify crtinit.c to call __libc_csu_init(), in addition to what is described above for the bare-bones C++ case. At least you won’t need to do your own crtinit though (i.e. the basepage/Mshrink/stack stuff). Just add the missing call to the existing one.
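To make the stub requirement for the bare-bones cases more concrete, here is a hedged sketch of what such a stub file might look like. This is not the promised release material - just an illustration, and whether do-nothing bodies are enough depends entirely on your project:

```c
/* Hypothetical stubs.c for a bare-bones (no MiNTlib) build.
   Compile it with the same codegen flags as the rest of the project. */

/* gcc emits references to __dso_handle/__cxa_atexit for static
   destructor bookkeeping; with no static dtors, dummies suffice. */
void *__dso_handle = 0;

int __cxa_atexit(void (*fn)(void *), void *arg, void *dso)
{
    (void)fn; (void)arg; (void)dso;
    return 0;   /* claim the registration succeeded; we never run dtors */
}

/* Called on the way out; a real version would Pterm() via trap #1. */
void _exit(int code)
{
    (void)code;
    for (;;)
        ;
}
```

If the link still complains about helper routines (integer multiply/divide and the like on plain 68000), that is the cue to add -lgcc as mentioned above.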

Library CPU variants

When you select a CPU variant you need to specify the library search path as well. The built-in multilib patterns are not quite right yet, so the installer reorganises them and they are not auto-selected by the compiler.

plain 68000:

        -L/lib/gcc/m68k-ataribrown-elf/6.2.0/m68000 \

68030 (no FPU):

        -L/lib/gcc/m68k-ataribrown-elf/6.2.0/m68020/softfp \

68030 + FPU:

        -L/lib/gcc/m68k-ataribrown-elf/6.2.0/m68020 \


68060:

        -L/lib/gcc/m68k-ataribrown-elf/6.2.0/m68060 \

Just remember to add $(LIBPATHS) to your link line in the correct place.


Linking ELFs to process with brownout requires specific flags:

LDOPTS = -Wl,--gc-sections -Wl,--emit-relocs -Wl,-e__start -Ttext=0

In order to produce library-compatible code, ELF projects also need to include the -fleading-underscore switch at compile time (ELF-based compilers usually omit the leading underscore, including this one).

It’s also recommended that you pass -ffunction-sections and -fdata-sections at compile time for maximum dead-code stripping when using --gc-sections at link time.

Caution: if the entry point is not set properly (-Wl,-e__start), you’ll end up with a 0-byte ELF, because code dependency scanning starts at the entry point.

Other stuff

It is possible to apply link-time optimisation with GCC by adding -flto. When you do this, you must pass all codegen flags to both the compile and link stages so they don’t end up conflicting. You can end up with broken executables if, for example, LTO optimises for the default 020 CPU while the compiled code uses -m68000. Assign all codegen flags to a variable and pass that variable at all stages.

-flto can produce some weird and confusing link errors. These are usually real errors in the project. Try to identify and sort them out without -flto first. Some will only be visible as warnings without -flto, but they do provide clues to help fix the underlying cause.

Use -fomit-frame-pointer for more efficient code - but be aware that trying to perform Super() switches using the osbind/sysbind C wrappers can produce very broken programs in this mode, by confusing the compiler’s stack frame. It’s better to perform supervisor switches in asm and/or via Supexec().

GCC seems to access global variables directly most of the time. Try to use locals as far as possible. Otherwise you can try adding -fPIE to generate register-relative code if globals are a common case in your project. This eliminates all direct global addressing and is generally more efficient than pedantically PC-relative code, but it does eat an extra address register (An), so beware…


BrownElf GCC emits ELF-formatted objects. If you want to use external assemblers like VASM/RMAC, you need them to export ELF-format .o files. AOUT will not work any more.

GCC Inline Assembly

Inline assembly syntax has changed a bit. Registers now require a %% prefix.

GCC 4 seemed not to require this, so some old code will likely need fixing.

Take extra care over clobber lists. You may not get warnings or errors if they are not formatted properly.
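As a hypothetical example (m68k-only, so it won’t assemble on your build host; read_sr is just an illustrative name): old code could write bare register names like move.w sr,d0, while the new gas wants a % register prefix, which must be doubled inside an extended asm template so it isn’t mistaken for an operand reference:

```c
/* Reads the 68k status register. %%sr expands to %sr in the output;
   %0 refers to the first (output) operand. */
static inline unsigned short read_sr(void)
{
    unsigned short v;
    __asm__ volatile (
        "move.w %%sr,%0"
        : "=d" (v)      /* output: any data register */
        :               /* no inputs */
        : "cc"          /* condition codes clobbered - be explicit here */
    );
    return v;
}
```

(Reading SR this way is privileged on the 68010 and later, so do it in supervisor mode on those CPUs.)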

Online version

Too bored to download and test it on your own? Matt Godbolt has you covered! His neat gcc-explorer enables you to write snippets of code and see directly how they translate into assembly language! For reference we also compiled Vincent’s 4.6.4 gcc and Peylow’s version with the extra ABI that passes arguments in registers (we originally weren’t sure this worked, but Peylow has since confirmed it personally: passing -mfastcall enables the ABI).


Acknowledgements (standing on the shoulders of giants, or maybe sideways)

  • Guido Flohr made the original efforts to port gcc 2.x to Ataris and wrote the first binutils patch, on which everyone after based theirs.
  • Patrice Mandin carried the torch for 2.x and 3.x, his work is here.
  • Keith Scroggins ported gcc 4.0.1 to MiNT.
  • Olivier Landemarre made one of the first gcc 4.2 ports.
  • Vincent Rivière pushed Guido’s (and the rest’s) patches to the limit (4.6.4); his gcc archives are the de facto standard cross compiler for Ataris.
  • Miro ‘mikro’ Kropacek has built native versions of gcc 4.6.4 for 68000 / 68020+ / m5475 targets.
  • Armin ‘ardi69’ did some really interesting work akin to ours, but for gcc 4.8.2 - quite close to what we did, without us knowing about it at all!
  • Fredrik ‘Peylow’ Olsson created a gcc 4.6.4 fork which implements the fastcall attribute. Sadly we never figured out how it works :(. If anyone knows, let us know! Update: thanks to mikro for pointing out the -mfastcall switch - check out the online version for an example!
  • Ben ‘Greasemonkey’ Russel created a python script and linkfile for converting elf binaries to TOS.
  • A great page about the ELF format proved invaluable during the development of brownout.
  • elf2flt is a project that’s very close to brownout. It’s made to convert ELF binaries into “flat” format, so with a bit of work brownout could be integrated into that. Any volunteers? :)

Get it here

We will try to provide as many packages as possible for people here, but take note that we don’t have infinite resources, and building and packaging is a bit boring! If anyone builds it for other distros and wants to host the files online, get in touch and we’ll update the list here.

  • MinGW/Msys build: gcc, binutils, mintlib, brownout (this package works under plain Windows; you don’t even need to install Msys/MinGW if you don’t want to, but then you will need some extra DLLs from this package)
  • Cygwin x86 build: gcc, binutils, mintlib, brownout (requires Cygwin of course)
  • Linux Mint 17.2 x64 build (Ubuntu 14.04 compatible, probably): gcc, binutils, mintlib Binaries removed
  • Linux Mint 18 x64 build (Ubuntu 16.04 compatible, probably): gcc, binutils, mintlib Binaries removed
  • MacOS Sierra build courtesy of Troed Sångberg (Troed of SYNC): Click here and read “readme.txt” for instructions.
  • If you want to build from source, the build script is hosted on Bitbucket here. brownout is here. Both are Mercurial repositories, so feel free to clone them or download them as a .zip package from Bitbucket itself. Let us know if you have any issues when you try it on your system. Patches are welcome!

Quick installation instructions:

  • Unpack gcc, binutils and mintlib in the root folder: Not providing binaries any more
    • cd / && sudo tar -jxvf /path/to/your/gcc-6.2-ataribrown-XXX.tar.bz2 Not providing binaries any more
    • cd / && sudo tar -jxvf /path/to/your/binutils-2.27-ataribrown-bin-XXX.tar.bz2 Not providing binaries any more
    • cd ~ && sudo tar -zxvf /path/to/your/mintlib-0.60.1-bin.tar.gz. Now copy folders "include", "lib", "lib020", "lib020_soft", "lib020-60", "lib020-60_soft", "lib040" and "lib4e" to /usr/m68k-ataribrown-elf. Not providing binaries any more
  • Optional but highly recommended: tidy up the default install directories, as both gcc’s and mintlib’s leave a lot to be desired (i.e. mixed directories and names). Not needed any more, the build script takes care of this
  • Compile and link your code with m68k-ataribrown-* programs as usual. This should produce an .elf file.
  • Use brownout to convert the .elf file to TOS executable.

That’s all!

We wish you a Merry Christmas and a happy new year 2017!


  • 29/12/2016: Fastcall ABI figured out - thanks to Mikro!
  • 12/01/2017: Mintlib archive updated to include math.h, please re-download if it’s not present for you.
  • 13/03/2017: Added link to MacOS build.
  • 18/04/2017: Corrected install instructions.
  • 20/09/2017: Linux binaries removed.
  • 19/08/2020: Links updated.

The brown duo