Subject: gcc default debugging symbols
To: None <current-users@netbsd.org>
From: Jan Schaumann <jschauma@netmeister.org>
List: current-users
Date: 04/05/2004 14:16:28

Hi,

Under i386, it appears that gcc, when passed '-g', generates the wrong
debugging symbols by default:

#include <stdio.h>

struct astruct {
  unsigned char c;
  double d;
};

int main() {
        struct astruct x;

        x.c = 'a';
        x.d = 0.184;
        printf("%f\n", x.d);
        return 0;
}

When the program is compiled with '-g' and run through gdb, it correctly
prints 0.184000, but if you break and inspect x, gdb gives:

(gdb) p x
$1 = {c = 97 'a', d = 5.2867121265460283e-315}
(gdb)
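
In case it matters, the steps to reproduce look roughly like this (the
file name and the exact break point are just examples):

    gcc -g test.c -o test
    gdb ./test
    (gdb) break main
    (gdb) run
    (gdb) next
    (gdb) next
    (gdb) p x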

Using '-gstabs', '-gstabs+', or '-gstabs3' instead, it behaves
correctly.  So... what am I doing wrong, or why is gcc not creating the
correct debugging symbols by default?
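
(For comparison, just changing the compile line to something like

    gcc -gstabs test.c -o test

and repeating the same gdb session shows the expected value for d.)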

-Jan

-- 
Probability factor of one to one. We have normality. I repeat, we have
normality. Anything you still can't cope with is therefore your own lookout.
