I've been working on error messages in Leaf; these use a lot of macros to reduce code size and keep the speed up. I'm not too happy about them. I even started writing an article on how to minimize them, but realized I don't know how. Do any C++ gurus have some ideas?
The Macro
One macro invocation to test for a user-code error looks like this (it checks that an index into a tuple is within range).
CHECK_RETURN( ctx, ex, index < tup->sub.size(), "index-out-of-range" );
The macro looks like this:
#define CHECK_RETURN( ctx_, stmt_, cond_, reason_, ... ) \
if( !(cond_) ) { (ctx_).mark_failed(*(stmt_)).error( *(stmt_), (reason_),\
{ LOGGER_COND(cond_) }, { __VA_ARGS__ } ); \
return (stmt_); }
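For illustration, the invocation above expands to roughly the following at the call site (LOGGER_COND, covered next, would expand further, and the empty __VA_ARGS__ becomes an empty list):
if( !(index < tup->sub.size()) ) {
    (ctx).mark_failed(*(ex)).error( *(ex), ("index-out-of-range"),
        { LOGGER_COND(index < tup->sub.size()) }, {} );
    return (ex);
}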
Where LOGGER_COND adds more info (assume S__LINE__ is the string version of __LINE__):
#define LOGGER_COND(cond_) util::logger::item_debug_loc( __FILE__ "@" S__LINE__ ), \
util::logger::item_debug_cond( #cond_ )
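S__LINE__ itself isn't shown here; a minimal sketch of how such a macro is typically defined, using the standard two-step stringification trick (the STRINGIZE helper names are made up, not Leaf's):
#define S__LINE__ STRINGIZE( __LINE__ )
#define STRINGIZE( x ) STRINGIZE_IMPL( x )
#define STRINGIZE_IMPL( x ) #x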
The Reason
This macro fulfills a few requirements:
- It reduces redundant code (this type of macro is used hundreds of times in the source code)
- The costly bit, creating the items for the logger, is not done unless the condition fails
- It reports common information to the logger
- It marks the compilation failed in a consistent manner
But Yuck!
But I just don't like this. I'm already keeping the macro short, but somehow I feel as though it could get shorter. The need for the deferred evaluation, and the ultimate return statement, prevent me from writing a function that does the same thing -- unless, and this is where I'm hoping, there is a C++14/17 feature that somehow helps me.
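To make the constraint concrete, here is a minimal, self-contained sketch (with stand-in types and hypothetical names, not Leaf's actual API) of how much can move into a function: a lambda handles the deferred evaluation, but the early return and the call-site __FILE__/__LINE__ still force a macro wrapper.
#include <iostream>
#include <string>
#include <vector>

// Stand-in for the logger items; Leaf's real types are richer.
using log_items = std::vector<std::string>;

// The expensive item construction is deferred behind a callable, so it only
// runs once the check has already failed.
template< typename MakeItems >
inline bool report_if_failed( bool ok, char const * file, int line,
    std::string const & reason, MakeItems && make_items )
{
    if( ok )
        return false;
    std::cerr << "error: " << reason << " (" << file << ":" << line << ")\n";
    for( auto const & item : make_items() )
        std::cerr << "  " << item << "\n";
    return true;
}

// Only the parts a function cannot express remain: stringifying the condition,
// capturing the call site, and returning from the enclosing function.
#define CHECK_RETURN( stmt_, cond_, reason_, ... ) \
    if( report_if_failed( (cond_), __FILE__, __LINE__, (reason_), \
            [&]{ return log_items{ #cond_, __VA_ARGS__ }; } ) ) \
        return (stmt_);
That still leaves a macro, just a thinner one, so it only illustrates the constraint rather than removing it.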
Notes
The .../__VA_ARGS__ part in the macro is for places where I pass extra information:
CHECK_RETURN( ctx, field, msym, "unknown-field", util::logger::item_symbol(field->field) );
There are several redundant variants of the macro: some throw, others take slightly different parameters. I'd like a cleaner solution.
Top comments (7)
If you need the LINE (and FILE) you will have to use some top-level macro (or mention those markers explicitly on each call). Sadly, there is no escape from that.
In most compilers you can force inlining, and in modern C++ you can use if constexpr to make sure that dead code is eliminated. Hence you can put the if in a function without loss.
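(For example, forcing inlining typically looks something like this, assuming GCC/Clang and MSVC; the attribute spellings are compiler-specific:)
#if defined(_MSC_VER)
    #define FORCE_INLINE __forceinline
#else
    #define FORCE_INLINE inline __attribute__((always_inline))
#endif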
The var-args are a problem (error-prone and ugly). Maybe you should think differently, like constructing a different object for each combination of info you want to pass.
The condition is not a const-expr but a runtime value.
I'm not so concerned about the VARARGS, and I can basically get rid of it now with {}'d initializer lists. The function takes a std::vector of items. It's just a bit cleaner in the macro without the braces, but mainly it was done that way because the code was first written before I had that option.
Variants of the functions per object type are not an option; it would lead to too many variants. These items are really extra information passed to the logging, varying with each call site.
Within the macro body, you use var-args and a return statement.
With these premises, I don't see how you could win anything unless you can really redo your exception handling approach. You are at a local optimum.
Personally I would not use var-args and a return statement within the macro body, but I have no alternative for FILE/LINE. That alone forces the use of a macro.
Did you consider using exceptions instead of codes? But you would still need a macro to supply the FILE/LINE.
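(A sketch of that exception flavour, reusing S__LINE__ from the post; the macro is still what captures the call site, and the names here are hypothetical:)
#include <stdexcept>
#include <string>

#define CHECK_THROW( cond_, reason_ ) \
    if( !(cond_) ) \
        throw std::runtime_error( std::string(reason_) + " at " __FILE__ "@" S__LINE__ );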
I might indeed be near a local optimum. The requirement for FILE/LINE, and lazy evaluation, are forcing my hand.
I'm converting some of my code to use return values here instead of exceptions, since the exceptions are kind of wrong. I'm basing this on the number of places that need to handle the exceptions. The code has to handle these returns for other reasons anyway (failed is just one status of many).
I guess I'll live with it for now. :(
The first thing I usually do is try to directly convert the macro to a proper function without changing the name and then test. If there's a nested macro in there, convert it (if needed) and then test. This is so you won't have to deal with the calling functions until you're done with the conversion.
When you're done you can just rename the function and all the callers to whatever you want it to be.
Regarding the speed and code size, you can try to inline the new functions to see how much difference it makes.
I've pushed a lot into functions already. I need to get the calling location though, so I have to get access to LINE and FILE somehow. That, along with the other boilerplate, is something I don't really want to repeat.
The conditional expansion of the logged values is essential as well. The rules on eliminating expressions are simply not good enough to guarantee I'd avoid the expensive creation (the optimizer just won't know there are no side effects in the called function, unfortunately).
Use inline functions instead of macros: inline functions can be expanded in place at the point of call, avoiding call overhead. They can be a good alternative to macros because they are more type-safe and less error-prone, and they are easier to debug and maintain.
Use constexpr functions: constexpr functions (a C++11 feature) can be evaluated at compile time when their arguments are known at compile time. They are more type-safe and less error-prone than macros, and they can be more efficient because the evaluation happens at compile time rather than at runtime.
Use template functions: template functions can be specialized for different types and values at compile time. They are more type-safe and less error-prone than macros, and they can be more efficient because the specialization happens at compile time rather than at runtime.
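(As a generic illustration of those last two points, unrelated to Leaf's specific macros: a small constexpr function template that replaces a function-like macro while staying usable at runtime:)
template< typename T >
constexpr T square( T x ) { return x * x; }

static_assert( square( 4 ) == 16, "can be evaluated at compile time" );
int at_runtime = square( 5 );   // still works with runtime values, and is type-safe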