In my SAX XML parsing callback (Xcode 4, LLVM), I make a lot of calls to this type of code:

static const char* kFoo = "Bar";

void SaxCallBack(char* sax_string, ...)
{
    if (strcmp(sax_string, kFoo, strlen(kFoo)) == 0) {
        // ...
    }
}

Is it safe to assume that strlen(kFoo) is optimized away by the compiler?
If by "LLVM" you mean clang, then yes, you can count on clang -O to optimize the strlen away. Here is what the code for your function looks like:
_SaxCallBack:
Leh_func_begin1:
pushq %rbp
Ltmp0:
movq %rsp, %rbp
Ltmp1:
leaq L_.str1(%rip), %rsi
movl $3, %edx
callq _strncmp
...
I changed the strcmp into strncmp (strcmp does not take a length argument), but the third argument has indeed been replaced by the immediate $3.
Note that gcc 4.2.1 -O3 does not optimize this strlen call, and that you can only expect it to work in the precise conditions of your question (in particular, the string and the call to strlen must be in the same file).
Don't write things like:
static const char* kFoo = "Bar";
You've created a variable named kFoo that points to constant data. The compiler might be able to detect that this variable does not change and optimize it out, but if not, you've bloated your program's data segment.
Also don't write things like:
static const char *const kFoo = "Bar";
Now your variable kFoo is const-qualified and non-modifiable, but if it's used in position-independent code (shared libraries etc.), the contents will still vary at runtime and thus it will add startup and memory cost to your program. Instead, use:
static const char kFoo[] = "Bar";
or even:
#define kFoo "Bar"
Follow-up question:
Have you tested the following?
static std::string const kFoo = "BAR";
void SaxCallBack(char* sax_string,.....)
{
if (sax_string == kFoo)
{
}
}
It's a net win in readability, but I have no idea about the performance cost.
As an alternative, if you must dispatch by yourself, I have found that using a state-machine like approach (with stack) is much better readability-wise, and might also win performance-wise (instead of having a large number of tags to switch on you only have the tags that can be met right now).
In general you can't count on it. However, you could use sizeof and apply it to a string literal. Of course, this means that you can't define kFoo the way it was originally defined.
The following should work on all compilers and at all optimization levels. Note that sizeof(kFoo) counts the terminating NUL, so subtract one to get the string length:

#define kFoo "..."

... strncmp(..., kFoo, sizeof(kFoo) - 1)