Math constant definitions

Hello all, I was going through the cl_platform.h header(s) on my system, and I was surprised to find that the definitions of pi (CL_M_PI and CL_M_PI_F) were apparently in disagreement:

#define  CL_M_PI            3.141592653589793115998
#define  CL_M_PI_F          3.14159274101257f

Moreover, these values differ from the ones found e.g. in the GNU C standard library headers, where we have

# define M_PI		3.14159265358979323846

It is my understanding (and the findings e.g. here seem to agree) that, at the precision of doubles and floats, the definitions are essentially equivalent. However, I believe it is a little confusing to see these values, especially for someone familiar with the ones typically used instead.

So my questions are:

  1. Why was this choice made?
  2. Would it be possible to amend the header files with either the ‘correct’ values or a comment explaining why they are not used?

As far as I can tell, those decimal representations map to the same double-precision floating point value. Both are equally correct.
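A quick way to convince yourself of this is to parse both literals and compare their bit patterns. Here is a minimal standalone sketch (the variable names are just illustrative, and the float pair is included for completeness):

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    /* The two decimal spellings of the double-precision constant. */
    double cl_pi    = 3.141592653589793115998;   /* CL_M_PI */
    double glibc_pi = 3.14159265358979323846;    /* M_PI    */

    /* Compare the underlying IEEE-754 bit patterns. */
    uint64_t a, b;
    memcpy(&a, &cl_pi,    sizeof a);
    memcpy(&b, &glibc_pi, sizeof b);
    printf("doubles: %016llx vs %016llx -> %s\n",
           (unsigned long long)a, (unsigned long long)b,
           a == b ? "identical" : "different");

    /* Same exercise for the single-precision constants. */
    float cl_pi_f    = 3.14159274101257f;        /* CL_M_PI_F   */
    float glibc_pi_f = 3.14159265358979323846f;  /* (float)M_PI */
    uint32_t fa, fb;
    memcpy(&fa, &cl_pi_f,    sizeof fa);
    memcpy(&fb, &glibc_pi_f, sizeof fb);
    printf("floats:  %08x vs %08x -> %s\n",
           fa, fb, fa == fb ? "identical" : "different");

    return 0;
}

Both comparisons should report identical bit patterns, since each pair of literals rounds to the same IEEE-754 value.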

The decimal representation of CL_M_PI was probably obtained by computing the double-precision floating point value that is closest to the true value of pi and then transforming it back to decimal. glibc’s M_PI decimal representation, on the other hand, appears to come directly from the decimal expansion of pi. Both translate to the same double-precision floating point value, so it doesn’t matter.
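For what it’s worth, the header’s exact digits can be reproduced by printing the double nearest to pi with enough significant digits. A small sketch (this assumes the libm acos is correctly rounded at -1, which is typical):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* acos(-1.0) is the double closest to pi on implementations
       that round it correctly. */
    double pi = acos(-1.0);

    /* 22 significant digits reproduce the header's spelling:
       3.141592653589793115998 */
    printf("%.22g\n", pi);
    return 0;
}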

Does that make sense?

Thank you for your reply. Indeed, at the precision at which IEEE floats and doubles operate, the decimal representation given in the header and the one closer to the actual value of pi in arbitrary precision result in exactly the same value. And indeed it doesn’t matter, since the actual bit pattern compiled in will be exactly the same.

However, I do believe that using the binary-to-decimal transformation of the approximation, rather than the truncation of the decimal expansion, can be somewhat confusing when reading the headers. And since it makes absolutely no difference to the computer, I don’t actually understand why the machine-oriented representation was chosen over the one more familiar to humans.