Subject: gcc default debugging symbols
To: None <>
From: Jan Schaumann <>
List: current-users
Date: 04/05/2004 14:16:28


Under i386, it appears that gcc, when passed '-g', generates the wrong
debugging symbols by default:

#include <stdio.h>

struct astruct {
  unsigned char c;
  double d;
};

int main() {
        struct astruct x;

        x.c = 'a';
        x.d = 0.184;
        printf("%f\n", x.d);
        return 0;
}
When compiled with '-g' and run under gdb, the program correctly prints
0.184000, but if you break and inspect x, gdb gives:

(gdb) p x
$1 = {c = 97 'a', d = 5.2867121265460283e-315}
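
(In case it matters, this is roughly the session I mean; repro.c is
just my local file name, and line 13 is the printf call, i.e. after
both assignments:)

$ gcc -g -o repro repro.c
$ gdb ./repro
(gdb) break 13
(gdb) run
(gdb) p x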

Now, using '-gstabs', '-gstabs+', or '-gstabs3', it behaves
correctly.  So... what am I doing wrong, or why is gcc not creating the
correct debugging symbols by default?
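
(One sanity check I plan to try, in case gdb is simply reading 'd' at
the wrong offset -- the denormal value above smells like that.  This
little program is mine, not from any documentation; it prints the
layout the compiler actually uses, to compare against what the -g
debug info tells gdb:)

#include <stdio.h>
#include <stddef.h>

struct astruct {
  unsigned char c;
  double d;
};

int main() {
        /* Print the real size and member offsets of the struct. */
        printf("sizeof(struct astruct) = %lu\n",
            (unsigned long)sizeof(struct astruct));
        printf("offsetof(c) = %lu\n",
            (unsigned long)offsetof(struct astruct, c));
        printf("offsetof(d) = %lu\n",
            (unsigned long)offsetof(struct astruct, d));
        return 0;
}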


Probability factor of one to one. We have normality. I repeat, we have
normality. Anything you still can't cope with is therefore your own lookout.
