#5631 closed Bugs (fixed)
nvcc's pre-processor breaks boost/lexical_cast.hpp
| Reported by: | | Owned by: | Antony Polukhin |
|---|---|---|---|
| Milestone: | Boost 1.48.0 | Component: | lexical_cast |
| Version: | Boost 1.42.0 | Severity: | Problem |
| Keywords: | nvcc lexical_cast pre-processor | Cc: | |
Description
The nvcc compiler fails to compile a trivial file which just #includes the boost/lexical_cast.hpp header. g++ compiles the file correctly.
```
rjw57@spica:~$ cat foo.cpp
#include <boost/lexical_cast.hpp>
rjw57@spica:~$ cat foo.cu
#include <boost/lexical_cast.hpp>
rjw57@spica:~$ /usr/bin/g++-4.4 -c foo.cpp
rjw57@spica:~$ /usr/bin/nvcc --compiler-bindir=/usr/bin/g++-4.4 -c foo.cu
/usr/include/boost/lexical_cast.hpp(352): error: expected an expression
1 error detected in the compilation of "/tmp/tmpxft_00001520_00000000-4_foo.cpp1.ii".
```
The problem is with nvcc's pre-processor, which can be seen by pre-processing with -E and grepping for `// == 1` in the output:
```
rjw57@spica:~$ /usr/bin/g++-4.4 -E -o foo-gcc.i foo.cpp
rjw57@spica:~$ /usr/bin/nvcc --compiler-bindir=/usr/bin/g++-4.4 -E -o foo-nvcc.i foo.cu
rjw57@spica:~$ ack-grep '\/\/ == 1' foo*.i
foo-nvcc.i
73304: static const std::size_t value = std::numeric_limits<Source>::is_signed + std::numeric_limits<Source>::is_specialized + // == 1 std::numeric_limits<Source>::digits10 * 2;
```
The nvcc pre-processor does not strip comments from the input it hands to the compiler and appears to remove new-lines, so the `// == 1` comment swallows the remainder of the multi-line expression. The gcc pre-processor does strip the comments.
This is possibly a bug in nvcc, but I suggest a workaround could be put into Boost itself without too much pain. I also suspect it would take a standards lawyer to work out who is in the wrong here.
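To make the failure mode concrete, here is a minimal stand-alone sketch of the same pattern (the file name, struct name, and compile-time check are illustrative additions, not taken from the ticket); it mirrors the Boost code quoted below and compiles with g++, while an nvcc affected by the behaviour described above would reportedly reject it:

```cpp
// repro.cu -- hypothetical reproduction of the pattern that trips nvcc:
// a multi-line integral constant expression with a trailing // comment
// on one of its middle lines.  If the pre-processor keeps the comment
// while joining the continuation lines, the comment swallows the rest
// of the initializer and parsing fails with "expected an expression".
#include <cstddef>
#include <limits>

template <class Source>
struct src_length
{
    static const std::size_t value =
        std::numeric_limits<Source>::is_signed +
        std::numeric_limits<Source>::is_specialized + // == 1
        std::numeric_limits<Source>::digits10 * 2;
};

// Use the constant as an array bound so it must be evaluated at compile time.
typedef char value_is_positive[src_length<int>::value > 0 ? 1 : -1];

int main() { return 0; }
```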
The troublesome line (line 345 in Boost 1.42.0) is in the boost/lexical_cast.hpp file. The problem is still present in the SVN trunk.
```cpp
...
template<class Source>
struct lcast_src_length_integral
{
#ifndef BOOST_NO_LIMITS_COMPILE_TIME_CONSTANTS
    BOOST_STATIC_CONSTANT(std::size_t, value =
        std::numeric_limits<Source>::is_signed +
        std::numeric_limits<Source>::is_specialized + // == 1
        std::numeric_limits<Source>::digits10 * 2
    );
#else
    BOOST_STATIC_CONSTANT(std::size_t, value = 156);
    BOOST_STATIC_ASSERT(sizeof(Source) * CHAR_BIT <= 256);
#endif
};
...
```
A suggested workaround is to remove the C++-style comment or replace it with a C-style one.
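For illustration, here is a sketch of what that suggested workaround might look like, with the C++-style comment replaced by a self-terminating C-style one (this is only the proposal from this ticket, not necessarily the change that was actually committed to Boost):

```cpp
#include <climits>                  // CHAR_BIT
#include <cstddef>                  // std::size_t
#include <limits>                   // std::numeric_limits
#include <boost/config.hpp>         // BOOST_STATIC_CONSTANT, BOOST_NO_LIMITS_COMPILE_TIME_CONSTANTS
#include <boost/static_assert.hpp>  // BOOST_STATIC_ASSERT

template<class Source>
struct lcast_src_length_integral
{
#ifndef BOOST_NO_LIMITS_COMPILE_TIME_CONSTANTS
    // A /* ... */ comment terminates itself, so a pre-processor that keeps
    // comments while joining lines can no longer swallow the rest of the
    // expression.
    BOOST_STATIC_CONSTANT(std::size_t, value =
        std::numeric_limits<Source>::is_signed +
        std::numeric_limits<Source>::is_specialized + /* == 1 */
        std::numeric_limits<Source>::digits10 * 2
    );
#else
    // Fallback for compilers without compile-time numeric_limits constants.
    BOOST_STATIC_CONSTANT(std::size_t, value = 156);
    BOOST_STATIC_ASSERT(sizeof(Source) * CHAR_BIT <= 256);
#endif
};
```

Simply deleting the comment, as also suggested above, would of course work just as well.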
Change History (4)
comment:1 by , 11 years ago
My local friendly standards lawyer points out it is a problem with nvcc (comments should be stripped before processing macros, apparently), but nonetheless it seems that boost could defensively code around this issue.
comment:2 by , 11 years ago
| Milestone: | To Be Determined → Boost 1.48.0 |
|---|---|
| Owner: | changed from to |
| Status: | new → assigned |
comment:3 by , 11 years ago
| Resolution: | → fixed |
|---|---|
| Status: | assigned → closed |
comment:4 by , 11 years ago
Many thanks for testing on nvcc!
Sorry for not responding sooner. If you find any new bugs in lexical_cast, please report them and put my name in the "Owned by:" field.
Does nvcc now fail on some other //? Does lexical_cast.hpp now compile on nvcc?
