Discussion: GNU/GCC optimizing
Francois LE COAT
2015-09-30 21:45:11 UTC
Hi,

Since 1987 I have been developing a shareware program called Eureka 2.12 on
successive ATARI computers, and since GNU/GCC 2.8.1 was released I have
also used that C compiler under freeMiNT, the free ATARI OS.

My C sources are perfectly compatible with GNU/GCC up to version 3.3.6.
But since the GNU/GCC 4 optimizing compiler became available, I can't build
my sources successfully. Large parts of what is written in the C sources
are completely discarded from the binary, even when I use the "-O0" option.

Is there a way to force GNU/GCC 4 (I'm experimenting with version 4.6.4) to
build strictly what is written in the C code, preventing any optimization?

For the moment the Eureka 2.12 software can be built, but the binary
does not correspond to the sources and totally misbehaves with
GNU/GCC 4, because my sources are misinterpreted. GNU/GCC 3 is correct.

I imagined that the C language offered source compatibility, but since
the fourth version of GNU/GCC that apparently isn't the case for my old
sources. The problem seems to be caused by the compiler itself, and not
by the corresponding libraries, because I even tested GNU/GCC 4 with the
exact same libraries as with GNU/GCC 3 ...

Thanks in advance for helping me.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Miro Kropáček
2015-10-01 06:18:16 UTC
Post by Francois LE COAT
sources successfully. Large parts of what is written in the C sources
are completely discarded from the binary, even when I use the "-O0" option.
Is there a way to force GNU/GCC 4 (I'm experimenting with version 4.6.4) to
build strictly what is written in the C code, preventing any optimization?
What does that even mean? Discarded how? Do you rely on dead code?

The main change in the 4.x series is the introduction of so-called strict aliasing; see for instance http://cellperformance.beyond3d.com/articles/2006/06/understanding-strict-aliasing.html. You can disable it via a command-line switch, but this is recommended only as a temporary workaround.
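
For illustration, a minimal sketch (my own, not from your sources) of the kind of construct that strict aliasing breaks:

    #include <stdio.h>
    #include <stdint.h>

    /* Undefined behaviour: a float object is read through a
       uint32_t pointer.  gcc 3 happened to do what the author
       expected; gcc 4's strict-aliasing rules let it assume the
       two pointers never alias, so it may reorder or drop the
       accesses.  -fno-strict-aliasing restores the old behaviour
       as a stopgap; a union or memcpy() is the portable fix. */
    static uint32_t float_bits(float f)
    {
        return *(uint32_t *)&f;   /* strict-aliasing violation */
    }

    int main(void)
    {
        printf("0x%08lx\n", (unsigned long)float_bits(1.0f));
        return 0;
    }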

Also, a code example from your sources would clarify things.
Francois LE COAT
2015-10-01 19:41:49 UTC
Hi MiKRO,
Post by Miro Kropáček
Post by Francois LE COAT
sources successfully. Large parts of what is written in the C sources
are completely discarded from the binary, even when I use the "-O0" option.
Is there a way to force GNU/GCC 4 (I'm experimenting with version 4.6.4) to
build strictly what is written in the C code, preventing any optimization?
What does that even mean? Discarded how? Do you rely on dead code?
The main change in the 4.x series is the introduction of so-called strict aliasing; see for instance http://cellperformance.beyond3d.com/articles/2006/06/understanding-strict-aliasing.html. You can disable it via a command-line switch, but this is recommended only as a temporary workaround.
Also, a code example from your sources would clarify things.
Well, I took the GNU/GCC 4.6.4 build available on your web page at
<http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>,
specifically the `cc1` binary from that archive, because I have a
Hades060 machine. I replaced the `cc1` binary of my GNU/GCC 3.3.6
configuration, available at <http://eureka.atari.org/gcc3.3.6SDK.zip>,
with your `cc1` binary. So I now have a GNU/GCC 3.3.6 configuration
with the C compiler from version 4.6.4.

With this development configuration I successfully built Eureka 2.12,
with not that many warnings. The problem is that the resulting binary
does not conform to the sources. Please keep in mind that Eureka 2.12
requires GNU/GCC's "-mshort" option, for compatibility with PURE C 1.1.

This weird manipulation was intended to prove that the compatibility
problem from GNU/GCC 3 to GNU/GCC 4 is due only to the `cc1` compiler.
Also, the available 4.6.4 version has no 16-bit libraries.

I also put a "-O0" compile option in the `Makefile`. The problem is
that the resulting binary misbehaves compared to the 3.3.6 build.

I don't know what to do. I also tested the cross-compiler under OS X.
I'm afraid my sources are totally misinterpreted by GNU/GCC 4.

You may have an opinion ...

Thanks for your answer.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Miro Kropáček
2015-10-02 06:12:42 UTC
Post by Francois LE COAT
<http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>,
specifically the `cc1` binary from that archive, because I have a
Hades060 machine. I replaced the `cc1` binary of my GNU/GCC 3.3.6
configuration, available at <http://eureka.atari.org/gcc3.3.6SDK.zip>,
with your `cc1` binary. So I now have a GNU/GCC 3.3.6 configuration
with the C compiler from version 4.6.4.
What if you take the whole package? There may be some internal dependencies; there's no guarantee that `cc1` stays binary compatible across versions.

Also, you still didn't explain what you mean by 'the resulting binary does not conform to the sources'.

But you're right, there's no -mshort libc & friends, only a basic libgcc (for building the FreeMiNT kernel).
Francois LE COAT
2015-10-03 14:07:01 UTC
Hi MiKRO,
Post by Miro Kropáček
Post by Francois LE COAT
<http://mikro.naprvyraz.sk/files/gcc/gcc-4.6.4-m68020-60mint.tar.bz2>,
specifically the `cc1` binary from that archive, because I have a
Hades060 machine. I replaced the `cc1` binary of my GNU/GCC 3.3.6
configuration, available at <http://eureka.atari.org/gcc3.3.6SDK.zip>,
with your `cc1` binary. So I now have a GNU/GCC 3.3.6 configuration
with the C compiler from version 4.6.4.
What if you take the whole package? There may be some internal dependencies; there's no guarantee that `cc1` stays binary compatible across versions.
If I took the whole package, I would get the same result as with the
cross-compilers. That build configuration is not suited to compiling
my Eureka 2.12 software, because there are no 16-bit libraries. Also,
GNU/GCC 4 is an optimizing compiler that misinterprets my sources.

It seems to me that `cc1` stays compatible from one version to the next.

The development configuration I'm telling you about successfully
builds my software, with the correct 16-bit libraries from version 3.3.6.
That's why I can tell that the `cc1` compiler is not backward compatible.
Post by Miro Kropáček
Also, you still didn't explain what you mean by 'the resulting binary does not conform to the sources'.
Well, the startup demo with the spinning hypercube does not play. The
GEM interface seems correct, but if I want to describe a curve, the
curve is not drawn. If I want to draw a surface, the surface is
not drawn. Nothing in the binary behaves as it should. The binary
simply does not correspond to the sources. The program is broken.
Post by Miro Kropáček
But you're right, there's no -mshort libc & friends, only a basic libgcc (for building the FreeMiNT kernel).
The lack of 16-bit libraries is a big defect when I build Eureka 2.12.
It breaks compatibility with earlier ATARI development configurations. I
didn't know the freeMiNT kernel still uses the "-mshort" option. Many
ATARI programs certainly use it, because it is an ATARI convention.

Thanks for helping.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Michael Schwingen
2015-10-03 22:20:57 UTC
Post by Francois LE COAT
Well, the startup demo with the spinning hypercube does not play. The
GEM interface seems correct, but if I want to describe a curve, the
curve is not drawn. If I want to draw a surface, the surface is
not drawn. Nothing in the binary behaves as it should. The binary
simply does not correspond to the sources. The program is broken.
In my experience with optimizing compilers, in most such cases the fault
lies not with the compiler; instead the source code is broken, doing things
that are not allowed by the C standard and relying on undefined behaviour.

That it worked with the older compiler (which had a weaker optimizer) says
nothing about the correctness of the source code.

Did you compile with all warnings enabled, and look at the warnings? I can't
believe old, misbehaving code would compile after a gcc3 -> gcc4 switch
without producing at least some new warnings!
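
One hedged example (illustrative, not taken from your sources) of code that "worked" with gcc 3 but may legally break under gcc 4's optimizer:

    #include <limits.h>
    #include <stdio.h>

    /* Signed integer overflow is undefined in C.  gcc 3's weaker
       optimizer compiled this as a wrapping add, so the check
       "worked"; gcc 4 may assume x + 1 never overflows and fold
       the function to "return 0", silently deleting the test. */
    static int next_would_overflow(int x)
    {
        return x + 1 < x;         /* undefined when x == INT_MAX */
    }

    int main(void)
    {
        printf("%d\n", next_would_overflow(INT_MAX));
        return 0;
    }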

cu
Michael
Francois LE COAT
2015-10-04 13:15:31 UTC
Hi,
Post by Michael Schwingen
Post by Francois LE COAT
Well, the startup demo with the spinning hypercube does not play. The
GEM interface seems correct, but if I want to describe a curve, the
curve is not drawn. If I want to draw a surface, the surface is
not drawn. Nothing in the binary behaves as it should. The binary
simply does not correspond to the sources. The program is broken.
In my experience with optimizing compilers, in most such cases the fault
lies not with the compiler; instead the source code is broken, doing things
that are not allowed by the C standard and relying on undefined behaviour.
That it worked with the older compiler (which had a weaker optimizer) says
nothing about the correctness of the source code.
Did you compile with all warnings enabled, and look at the warnings? I can't
believe old, misbehaving code would compile after a gcc3 -> gcc4 switch
without producing at least some new warnings!
You'll agree that it's very peculiar ... How bizarre that a warning
generates an error when building a C program's sources. If this is not
a warning, it should be reported as an error, don't you think?
I have never seen anywhere in the C standard that a warning must
imperatively be taken into account, on pain of generating an error.
Warnings are usually meant to help produce good code, not errors.
Do strong optimizations mean that warnings are now treated as errors?

Please take into account that I have practiced the C language since 1986,
first Kernighan and Ritchie, then the ANSI C standard. The C standard
must have evolved, because my C sources are now obsolete.

Thanks for your answer.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Michael Schwingen
2015-10-04 18:19:00 UTC
Post by Francois LE COAT
You'll agree that it's very peculiar ... How bizarre that a warning
generates an error when building a C program's sources. If this is not
a warning, it should be reported as an error, don't you think?
No. The C standard gives compiler writers considerable room in how
certain details may be implemented. If your source code uses constructs with
undefined results, the results will be - well, undefined, and they may
change between compiler versions.
Post by Francois LE COAT
Please take into account that I have practiced the C language since 1986,
first Kernighan and Ritchie, then the ANSI C standard. The C standard
must have evolved, because my C sources are now obsolete.
It has - K&R left many things undefined that were better specified in ANSI
C. However, even K&R C had language details that caused undefined behaviour
by definition, and these may now produce unexpected results as improving
compiler optimizations uncover faults that were always there.
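
For instance, this line has always been undefined, even though old compilers happened to produce a consistent result (my own illustration):

    #include <stdio.h>

    int main(void)
    {
        int i = 0, a[2] = { 0, 0 };
        a[i] = i++;   /* undefined even in K&R-era C: i is modified
                         and read without an intervening sequence
                         point, so each compiler generation may
                         legally produce a different result */
        printf("a[0]=%d a[1]=%d i=%d\n", a[0], a[1], i);
        return 0;
    }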

As a start, have a look at
http://blog.regehr.org/archives/213
http://blog.llvm.org/2011/05/what-every-c-programmer-should-know.html

cu
Michael
Francois LE COAT
2015-10-04 20:30:21 UTC
Hi,
Post by Michael Schwingen
Post by Francois LE COAT
You'll agree that it's very peculiar ... How bizarre that a warning
generates an error when building a C program's sources. If this is not
a warning, it should be reported as an error, don't you think?
No. The C standard gives compiler writers considerable room in how
certain details may be implemented. If your source code uses constructs with
undefined results, the results will be - well, undefined, and they may
change between compiler versions.
Post by Francois LE COAT
Please take into account that I have practiced the C language since 1986,
first Kernighan and Ritchie, then the ANSI C standard. The C standard
must have evolved, because my C sources are now obsolete.
It has - K&R left many things undefined that were better specified in ANSI
C. However, even K&R C had language details that caused undefined behaviour
by definition, and these may now produce unexpected results as improving
compiler optimizations uncover faults that were always there.
As a start, have a look at
http://blog.regehr.org/archives/213
http://blog.llvm.org/2011/05/what-every-c-programmer-should-know.html
Well, the same recipe should at least give the same meal. If somebody
cooks a pizza, he is not supposed to end up eating tomato ketchup,
unless he is an extremely bad cook. The recipes are the C language
sources, and the cook is the C compiler.

What should I think of GNU/GCC 4, compared to GNU/GCC 2 and 3,
PURE C 1.1 and the other compilers that build my C program Eureka 2.12,
when the binary it produces is such a messy meal?

Thanks for your answer.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Michael Schwingen
2015-10-06 16:38:12 UTC
Post by Francois LE COAT
Well, the same recipe should at least give the same meal.
Not if the recipe is unclear, like "take a spoonful of whatever red
ingredient you find in the fridge".
Post by Francois LE COAT
What should I think of GNU/GCC 4, compared to GNU/GCC 2 and 3,
PURE C 1.1 and the other compilers that build my C program Eureka 2.12,
when the binary it produces is such a messy meal?
That your source is a messy meal, when judged by the current C standards
that the compiler implements - for good reason (it gives much better
performance when compiling standard-compliant sources). And no, the compiler
is working as expected (by the standard) - it is giving you one kind of
correct result, just not the one you want.

If you insist on keeping your source code as it is, you are limited to old,
badly-optimizing compilers that will by chance give the results you want.

Otherwise, you need to take a look at *where* the problems are (as I said,
compiling with full warnings enabled, then understanding and removing them,
is a good start, but deeper debugging may be required). The links I posted
should provide a good start on which constructs are to be avoided when
using modern compilers.
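
Concretely, an invocation along these lines (illustrative only - the file name is a placeholder, and the exact option set is up to you) makes gcc report most of the dangerous constructs while keeping the old aliasing behaviour as a stopgap:

    m68k-atari-mint-gcc -O2 -Wall -Wextra -Wstrict-aliasing=2 \
        -Wpointer-arith -Wcast-align -fno-strict-aliasing \
        -c eureka.c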

cu
Michael
Francois LE COAT
2015-10-06 19:51:30 UTC
Hi,
Post by Michael Schwingen
Post by Francois LE COAT
Well, the same recipe should at least give the same meal.
Not if the recipe is unclear, like "take a spoonful of whatever red
ingredient you find in the fridge".
Post by Francois LE COAT
What should I think of GNU/GCC 4, compared to GNU/GCC 2 and 3,
PURE C 1.1 and the other compilers that build my C program Eureka 2.12,
when the binary it produces is such a messy meal?
That your source is a messy meal, when judged by the current C standards
that the compiler implements - for good reason (it gives much better
performance when compiling standard-compliant sources). And no, the compiler
is working as expected (by the standard) - it is giving you one kind of
correct result, just not the one you want.
If you insist on keeping your source code as it is, you are limited to old,
badly-optimizing compilers that will by chance give the results you want.
Otherwise, you need to take a look at *where* the problems are (as I said,
compiling with full warnings enabled, then understanding and removing them,
is a good start, but deeper debugging may be required). The links I posted
should provide a good start on which constructs are to be avoided when
using modern compilers.
I think the GNU/GCC 4 compiler is too restrictive. There are a lot of
other C compilers on ATARI computers, and I have tested many of them;
they are not so rigorous. If you want to eliminate any chance of a
misunderstanding with the previous generation of compilers, you can
use the -pedantic option. I think GNU/GCC 4 is naturally "pedantic".

The problem is that it rejects a large amount of old C code,
which becomes "obsolete". Obsolescence is the worst of the catastrophes
that plague the computer industry, because it hands better profits
to the giants of this lucrative business.

I'm surprised that the GNU foundation encourages that kind of business.

Thanks for your answer.

Regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Michael Schwingen
2015-10-07 06:55:12 UTC
Post by Francois LE COAT
I think the GNU/GCC 4 compiler is too restrictive. There are a lot of
other C compilers on ATARI computers, and I have tested many of them;
they are not so rigorous.
Then have a look at gcc 5, which has improved optimization yet again - gcc
4.0 is now 10 years old.

On small ARM systems, I have achieved big reductions in code size by using
the link-time optimization feature of gcc 5.
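
For instance (an illustrative set of commands, with placeholder file names):

    gcc -flto -Os -c a.c
    gcc -flto -Os -c b.c
    gcc -flto -Os -o prog a.o b.o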

cu
Michael
David Wade
2015-10-04 20:20:50 UTC
Post by Francois LE COAT
Hi,
Post by Michael Schwingen
Post by Francois LE COAT
Well, the startup demo with the spinning hypercube does not play. The
GEM interface seems correct, but if I want to describe a curve, the
curve is not drawn. If I want to draw a surface, the surface is
not drawn. Nothing in the binary behaves as it should. The binary
simply does not correspond to the sources. The program is broken.
In my experience with optimizing compilers, in most such cases the fault
lies not with the compiler; instead the source code is broken, doing things
that are not allowed by the C standard and relying on undefined behaviour.
That it worked with the older compiler (which had a weaker optimizer) says
nothing about the correctness of the source code.
Did you compile with all warnings enabled, and look at the warnings? I can't
believe old, misbehaving code would compile after a gcc3 -> gcc4 switch
without producing at least some new warnings!
You'll agree that it's very peculiar ... How bizarre that a warning
generates an error when building a C program's sources.
This is typical of "C" and is one of the problems with the language.
Most compilers do not warn about undefined behaviour. Try posting this
on comp.lang.c and you will be told exactly the same.
Post by Francois LE COAT
If this is not a warning, it should be reported as an error, don't you
think? I have never seen anywhere in the C standard that a warning must
imperatively be taken into account, on pain of generating an error.
Warnings are usually meant to help produce good code, not errors.
Do strong optimizations mean that warnings are now treated as errors?
The "C" standard has changed substancially with the introduction of C11.
I work on the Hercules project which is an IBM Mainframe emulator and we
are having the same problems...
Post by Francois LE COAT
Please take into account that I have practiced the C language since 1986,
first Kernighan and Ritchie, then the ANSI C standard. The C standard
must have evolved, because my C sources are now obsolete.
Your sources possibly need adjustments to cope with the changes...
Post by Francois LE COAT
Thanks for your answer.
Best regards,
Dave Wade
G4UGM
Francois LE COAT
2015-10-04 21:35:50 UTC
Hi,
Post by David Wade
Post by Francois LE COAT
Please take into account that I practice C language since 1986,
first Kernighan and Ritchie, then ANSI C standard. Notice the
C standard must have evoluted because my C sources are now obsolete.
Possibly need adjustments to cope with changes...
I'm afraid that if I make those changes, my Eureka 2.12 sources will only
be compatible with GNU/GCC 4, and not with any other C compiler. On ATARI
computers I have had to cope with Lattice C, Turbo C, PURE C, GNU/GCC
2.x then 3.x, etc. I've never seen a C compiler as rigorous as
GNU/GCC 4. It produces no errors at all on my sources, but the
binary is a complete mess ...

I prefer to think that my sources are obsolete, and that my program runs on the ST :)

Thanks for your answer.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Miro Kropáček
2015-10-05 09:36:00 UTC
Post by Francois LE COAT
I'm afraid that if I make those changes, my Eureka 2.12 sources will only
be compatible with GNU/GCC 4, and not with any other C compiler.
This is not true. On the contrary, you will most likely discover hidden bugs in your code. And of course the source will stay compatible; as mentioned, the compiler is now more strict, but that doesn't mean it doesn't adhere to the C standard.

So it boils down to whether you're willing to change your code, or be stuck with 3.x forever. Obsolete sources require obsolete compilers.
Francois LE COAT
2015-10-08 19:31:00 UTC
Hi MiKRO,
Post by Miro Kropáček
Post by Francois LE COAT
I'm afraid that if I make those changes, my Eureka 2.12 sources will only
be compatible with GNU/GCC 4, and not with any other C compiler.
This is not true. On the contrary, you will most likely discover hidden bugs in your code. And of course the source will stay compatible; as mentioned, the compiler is now more strict, but that doesn't mean it doesn't adhere to the C standard.
So it boils down to whether you're willing to change your code, or be stuck with 3.x forever. Obsolete sources require obsolete compilers.
Well, I remind you that I'm not even supposed to be able to use GNU/GCC 4
to build my Eureka 2.12 program. The great advantage of GNU/GCC 3
is that it understands my sources, producing warnings that it
can tolerate. The other advantage is that it includes the 16-bit
libraries that are mandatory for building my software, for compatibility
with PURE C and, furthermore, binary compatibility with the ATARI ST.

Should I abandon the ability to build ATARI ST software as a
sacrifice to the modernity of GNU/GCC 4, just because it's new? What
improvements does it bring to the m68k-atari-mint target? I suspect
GNU/GCC 4 has more drawbacks than improvements.

Why doesn't it include the 16-bit libraries, breaking backward compatibility?

Thanks for your answer.

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Miro Kropáček
2015-10-09 06:55:46 UTC
Post by Francois LE COAT
The great advantage of GNU/GCC 3 is that it understands my sources,
producing warnings that it can tolerate.
You're missing the point completely but whatever, your choice.
Post by Francois LE COAT
Why doesn't it include the 16-bit libraries, breaking backward compatibility?
This is not the gcc maintainers' decision. The decision was made by Vincent, who disabled it. MintLib hasn't supported -mshort for decades, and the same goes for every lib in the Sparemint RPM packages. As said, -mshort support exists only for the FreeMiNT kernel, which has to support -mshort by definition because the TOS API is 16-bit.

So again, and this is my last post on this topic: you have obsolete sources, and that's that.
Francois LE COAT
2015-10-09 15:25:27 UTC
Hi MiKRO,
Post by Miro Kropáček
Post by Francois LE COAT
The great advantage of GNU/GCC 3 is that it understands my sources,
producing warnings that it can tolerate.
You're missing the point completely but whatever, your choice.
Post by Francois LE COAT
Why doesn't it include the 16-bit libraries, breaking backward compatibility?
This is not the gcc maintainers' decision. The decision was made by Vincent, who disabled it. MintLib hasn't supported -mshort for decades, and the same goes for every lib in the Sparemint RPM packages. As said, -mshort support exists only for the FreeMiNT kernel, which has to support -mshort by definition because the TOS API is 16-bit.
So again, and this is my last post on this topic: you have obsolete sources, and that's that.
I'll add that GNU/GCC 3 is suitable for building ATARI ST software.
GNU/GCC 4 is not, because it is too restrictive about syntax,
and furthermore it doesn't provide the 16-bit libraries required
for ATARI ST software.

I don't want to lose ATARI ST compatibility, so I don't use it.

Thanks for your answers. The OS maintainers are making very strange
decisions, breaking backward compatibility with the ATARI ST!
Most of them probably never developed on ATARI ST hardware :-(

Best regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/
Michael Schwingen
2015-10-10 09:40:57 UTC
Post by Francois LE COAT
I'll add that GNU/GCC 3 is suitable for building ATARI ST software.
GNU/GCC 4 is not, because it is too restrictive about syntax,
and furthermore it doesn't provide the 16-bit libraries required
for ATARI ST software.
Those are not *required*. Operating system calls do not need a special "int"
size from the compiler - they should use uint16_t/uint32_t in the C library's
syscall implementation, which works just fine with any gcc version, and most
OS calls I remember actually use 32-bit ints, not 16-bit. After all, the 68k
CPU is architecturally a 32-bit machine.
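
A sketch of what I mean (my own illustration, not the actual MiNTLib source - the real library may bind this differently): the binding pins the operand widths itself, so the compiler's int size is irrelevant:

    #include <stdint.h>

    /* Hypothetical GEMDOS Fopen binding.  The opcode and the mode
       are pushed as 16-bit words whatever the compiler's int
       width, so -mshort is not needed for correctness here. */
    static int32_t gemdos_fopen(const char *path, int16_t mode)
    {
        register int32_t ret __asm__("d0");
        __asm__ volatile (
            "move.w %2,-(%%sp)\n\t"     /* mode, 16-bit word    */
            "move.l %1,-(%%sp)\n\t"     /* path, 32-bit pointer */
            "move.w #0x3d,-(%%sp)\n\t"  /* Fopen opcode         */
            "trap   #1\n\t"             /* GEMDOS               */
            "addq.l #8,%%sp"            /* pop 2+4+2 bytes      */
            : "=r" (ret)
            : "g" (path), "g" (mode)
            : "d1", "d2", "a0", "a1", "a2", "memory", "cc");
        return ret;
    }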

Good, *portable* C code should work just as well when compiled with 32-bit
ints, although it may run a bit slower and use more memory - how far that is
offset by the better optimizations of the new compiler would need to be
checked in every case.
Post by Francois LE COAT
Thanks for your answers. The OS maintainers are making very strange
decisions, breaking backward compatibility with the ATARI ST!
Which OS maintainers? The last OS version that was maintained was TOS 4.08.

You are talking about one gcc package - you are free to take the source code
and compile your own version with 16-bit integer support. You are also free
to take the 16-bit libraries and adapt them to compile with gcc 4 (or,
better, gcc 5) - it can be done; I use gcc 4 on 16-bit targets (AVR) regularly.

If no one has stepped up and done just that until now, it does not mean it is
impossible, or that someone decreed that 16-bit support should be dropped -
just that no one wanted it enough to invest their own time.

cu
Michael
Francois LE COAT
2015-10-10 11:07:14 UTC
Hi,
Post by Michael Schwingen
Post by Francois LE COAT
I'll add that GNU/GCC 3 is suitable for building ATARI ST software.
GNU/GCC 4 is not, because it is too restrictive about syntax,
and furthermore it doesn't provide the 16-bit libraries required
for ATARI ST software.
Those are not *required*. Operating system calls do not need a special "int"
size from the compiler - they should use uint16_t/uint32_t in the C library's
syscall implementation, which works just fine with any gcc version, and most
OS calls I remember actually use 32-bit ints, not 16-bit. After all, the 68k
CPU is architecturally a 32-bit machine.
Good, *portable* C code should work just as well when compiled with 32-bit
ints, although it may run a bit slower and use more memory - how far that is
offset by the better optimizations of the new compiler would need to be
checked in every case.
Post by Francois LE COAT
Thanks for your answers. The OS maintainers are making very strange
decisions, breaking backward compatibility with the ATARI ST!
Which OS maintainers? The last OS version that was maintained was TOS 4.08.
You are talking about one gcc package - you are free to take the source code
and compile your own version with 16-bit integer support. You are also free
to take the 16-bit libraries and adapt them to compile with gcc 4 (or,
better, gcc 5) - it can be done; I use gcc 4 on 16-bit targets (AVR) regularly.
If no one has stepped up and done just that until now, it does not mean it is
impossible, or that someone decreed that 16-bit support should be dropped -
just that no one wanted it enough to invest their own time.
Sorry, but we're not speaking of contemporary developments, but of
implementation choices made 30 years ago, when the GNU foundation had
just been created and there was no internet. It's very easy for
today's developers to make good implementation choices (use of
integer types, etc.), taking into account the errors of past
programmers. But if today's programmers develop correctly in the
C language, it's because developers like me made errors in the past.
The problem is that programs like Eureka 2.12 are now obsolete, and
they represent a large part of the software developed on ATARI ST hardware.

You would have been very clever if you had given me this advice
30 years ago. For the time being, there's nothing else I can do,
except be very sad that my C sources are not supported anymore, alas.

Thanks for your answer.

Regards,
--
François LE COAT
Author of Eureka 2.12 (2D Graph Describer, 3D Modeller)
http://eureka.atari.org/