On Tue, Mar 20, 2018 at 03:54:36AM +0200, Michael S. Tsirkin wrote:
> QEMU coding style at the moment asks for all non-system
> include files to be used with #include "foo.h".
> However this rule actually does not make sense and
> creates issues when the included file is generated.
>
> In C, include "file" means look in the current directory,
> then on the include search path. Current directory here
> means the source file directory.
> By comparison, include <file> means look on the include search path.
>
> As generated files are not in the source directory (unless the build
> directory happens to match the source directory), it does not make sense
> to include them with "" - doing so is merely more work for the preprocessor
> and a source of errors if a stale file happens to exist in the source
> directory.
>
> This changes include directives for all generated files, across the
> tree. The idea is to avoid sending a huge amount of email. But when
> merging, the changes will be split with one commit per file, e.g. for
> ease of bisect in case of build failures, and to ease merging.
>
> Note that should some generated files be missed by this tree-wide
> refactoring, it isn't a big deal - this merely maintains the status quo,
> and this can be addressed by a separate patch on top.
>
> Signed-off-by: Michael S. Tsirkin <m...@redhat.com>
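
To make the lookup difference above concrete, here is a minimal C sketch;
the layout and every name in it (foo.c, foo-local.h, generated-foo.h,
srcdir, builddir) are hypothetical and only stand in for the real tree:

    /*
     * Assumed out-of-tree build layout (all names made up):
     *   srcdir/hw/foo/foo.c            - this file
     *   srcdir/hw/foo/foo-local.h      - a header kept next to the source
     *   builddir/trace/generated-foo.h - generated at build time
     * compiled roughly as:
     *   cc -Ibuilddir/trace -c srcdir/hw/foo/foo.c
     */

    /*
     * Before the patch:  #include "generated-foo.h"
     * The preprocessor first looks in srcdir/hw/foo/ (the directory of the
     * including file) and only then walks the -I path.  That first lookup
     * can never succeed for a generated header in an out-of-tree build,
     * and a stale copy left behind in srcdir/hw/foo/ would silently win
     * over the fresh one in builddir/trace/.
     *
     * After the patch, the <> form goes straight to the -I path, which is
     * where generated headers actually live:
     */
    #include <generated-foo.h>

    /* "" stays correct for a header that really sits next to its source. */
    #include "foo-local.h"

    int foo_init(void)
    {
        return 0;
    }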
For the record, the stated advantage of the current rule is that one can
have a local header whose name happens to match a system header. To put
it bluntly, that does not work as intended. For example, if a system
header foo.h somewhere has #include <trace.h>, then the compiler will
happily pull in our own version (since that is on the -I path) and
completely ignore the system one, breaking things in the process.

When does it make sense to use include ""? When the header is a
directory-specific one, located next to the source that uses it. This
approach is enforced by the compiler, and it also helps people know
where to find the header.

--
MST
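
To illustrate the shadowing failure mode described above, a minimal
sketch; every path and name in it (evil.h, /usr/include/trace.h,
srcdir/trace.h, user.c) is made up purely for illustration:

    /*
     * Assumed layout (all names made up):
     *   srcdir/trace.h       - our own header; srcdir is on the -I path
     *   /usr/include/evil.h  - a system header that does: #include <trace.h>
     *   /usr/include/trace.h - the system header evil.h actually wants
     * compiled as:
     *   cc -Isrcdir -c user.c
     */

    /*
     * The nested #include <trace.h> inside evil.h searches the -I path
     * before the standard system directories, so it finds srcdir/trace.h
     * and never reaches /usr/include/trace.h: our header shadows the
     * system one regardless of which include form our own code uses.
     */
    #include <evil.h>

    int main(void)
    {
        return 0;
    }

In other words, spelling our own includes with "" buys no protection
against the name clash, because the -I path is consulted for <> includes
too.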