Precompiled header usage


Andrew Ward

Hi All,
I was wondering if it is possible to use precompiled headers without having
to include a <stdafx.h> or whatever in every source file.
My problem is that I have a project that makes heavy use of a large third
party library, which means that after preprocessing each translation unit is
around 1 MB in size. There are reasons, though, why I cannot use the standard
approach to precompiled headers.

What I would like to do is something like this:
1) create a header that includes all headers from third party library
2) compile this into a .pch file
3) have VS use this .pch file while compiling any source file it encounters

Is this possible?
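
What I have in mind, expressed as cl.exe switches (the file names are
placeholders, and I may well have details wrong):

// all.h - wraps every header from the third-party library
#include <thirdparty_core.h>
#include <thirdparty_extras.h>

// pch_build.cpp - contains only the line below; compiled once with
//   cl /c /Yc"all.h" /Fp"all.pch" pch_build.cpp
// to produce all.pch
#include "all.h"

// Every other .cpp would then be compiled with
//   cl /c /Yu"all.h" /Fp"all.pch" foo.cpp
// Step 3 is the part I am unsure about, since /Yu normally expects
// #include "all.h" to appear in each source file -- perhaps a forced
// include (/FI"all.h") would avoid editing the sources, but I don't know.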

Andy
 

Tanveer Gani [MSFT]

Andrew said:

Hi Brandon,
The problem I am facing is that I cannot add #include "all.h" or whatever to
each source file in the project. About half of the source that is compiled
for the project is C++ generated from XML specifications; that source is all
generated automatically during the build.

An approach like the following would be great for my type of build
environment:

Each time the compiler encounters an #include <xyz.h> directive, it checks a
cache to see whether it has an up-to-date precompiled version of the header,
'xyz.pch'. If one is found it is used; otherwise a new precompiled header
'xyz.pch' is created and put in the cache.

It seems to me that this approach would be more efficient as well, because
each file being compiled would only include the precompiled data that it
needs, instead of the norm of loading a huge precompiled header for every
source file, whether the file needs all the data or not.

Is anything of this sort possible?

Andy,

You seem to have misunderstood the VC++ PCH mechanism. The PCH is not a
more compact form of the header. It's really a dump of the compiler's
symbol table at the #include point in a source file. When compiling a file
with the same #include, the compiler can be made to load the PCH (really
the symbol table), the assumption being that its internal state is then the
same as if the #include had been processed at that point.

You can make the compiler use a different PCH for each compiland, but the
real advantage comes when several compilands share a certain set of files:
then a PCH gets built once and re-used multiple times in a build. Building
a PCH per file will save you time only if you're building the same file
over and over. PCHs are typically very large, running into several MB
for Windows apps; writing a PCH out is expensive, but loading it isn't,
since it's just mapped into memory and paged in as needed. Hence the best
strategy for improving build times is to gather all headers into an "all.h"
header, build a PCH from it, and use that PCH in all the compilands. You'll
have to make changes to your codegen tool for this, but I'm sure you'll be
amazed at the performance gain.
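
Concretely, the change amounts to one umbrella header, one source file to
build the PCH from, and one extra line at the top of every generated file
(the file names here are just placeholders):

// all.h - the umbrella header gathering everything heavyweight
#include <windows.h>
#include "big_thirdparty_library.h"

// all.cpp - contains only this line; compiled once with /Yc"all.h"
// (plus /Fp to name the .pch file) to produce the precompiled header
#include "all.h"

// generated_foo.cpp - the codegen tool now emits this as the first line;
// the file is compiled with /Yu"all.h", so the compiler loads the PCH
// instead of re-parsing all the headers
#include "all.h"
// ...generated code follows...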
 

Andrew Ward

Tanveer Gani said:
[...] Hence the best strategy for improving build times is to gather all
headers into an "all.h" header, build a PCH from it, and use that PCH in all
the compilands. You'll have to make changes to your codegen tool for this,
but I'm sure you'll be amazed at the performance gain.

Hi Tanveer, thanks for your response. There is at least one compiler that
implements its precompiled-header support in the way I described (GCC);
that is why I asked.

The problem with your "all.h" approach is that my software has to be
compiled on many platforms, using different compilers, some of which have no
precompiled-header support. If I were to use the "all.h" method, the compile
times on the compilers without PCH support would skyrocket.

I believe that something like PCH support should not govern the way you
write your code; the fact that MS's PCH support requires you to write your
code in an inefficient way is bad design.
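
Roughly, the per-header scheme I have in mind works like this in GCC (the
file names are made up, and the details vary by GCC version):

// big.h - the expensive header
//   compiled directly, e.g.  g++ -x c++-header big.h
//   which writes big.h.gch next to big.h

// consumer.cpp - unchanged; when GCC reaches this #include it first looks
// for big.h.gch and, if it is present and valid, loads it instead of
// re-parsing big.h
#include "big.h"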


Andy
 

Brandon Bray [MSFT]

Andrew said:
Hi Tanveer, thanks for your response. There is at least one compiler that
implements its precompiled-header support in the way I described (GCC);
that is why I asked.

Hi Andy,
It would be helpful if you were able to provide more detail on what GCC
feature you are using (or provide a link to some place that has
documentation without a license). To my knowledge (from discussions with
people who use and work on GCC), precompiled headers are not yet a feature
of GCC. I could be wrong, and would enjoy learning more about it.
The problem with your "all.h" approach is that my software has to be
compiled on many platforms, using different compilers, some of which have
no precompiled-header support. If I were to use the "all.h" method, the
compile times on the compilers without PCH support would skyrocket.

This isn't necessarily true. If this approach were used under an #ifdef
dependent on _MSC_VER, you could write code that compiles efficiently on
several platforms. There are very few cross-platform libraries that do not
make adjustments for particular compilers.
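
A sketch of what that might look like (the header names are invented):

// all.h - heavy only where a PCH absorbs the cost
#if defined(_MSC_VER)
// On Visual C++ the whole third-party library goes into the PCH.
#include "thirdparty_everything.h"
#endif

// generated_foo.cpp
#include "all.h"                 // effectively empty on non-MSVC compilers
#include "only_what_foo_needs.h" // each file still includes what it uses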
I believe that something like PCH support should not govern the way you
write your code; the fact that MS's PCH support requires you to write your
code in an inefficient way is bad design.

I don't particularly think what Tanveer suggested was a bad design. In some
regards, it prevents the excessive refactoring that often gets developers
into trouble when different definitions are introduced for the same type.
In any case, I hope you can try writing something dependent on _MSC_VER that
allows you to succeed at your task.

Cheerio!
 

David Olsen

Brandon said:
I don't particularly think what Tanveer suggested was a bad design. In some
regards, it prevents the excessive refactoring that often gets developers
into trouble when different definitions are introduced for the same type.
In any case, I hope you can try writing something dependent on _MSC_VER that
allows you to succeed at your task.

I think Microsoft's precompiled headers are a bad design. In my opinion,
a header file should #include exactly what it needs (no more, no less)
so that it can be compiled by itself. This puts all the #include
dependencies exactly where they should be. Precompiled headers
encourage a design where header files include nothing (or very little)
and instead rely on certain things having already been included
earlier in the compilation.
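
For instance, a header written in the self-contained style is compilable on
its own (the names here are made up):

// widget.h - self-contained: pulls in exactly what it uses
#ifndef WIDGET_H
#define WIDGET_H

#include <string>    // std::string used below
#include "rect.h"    // hypothetical header defining Rect

class Widget {
public:
    explicit Widget(const std::string& name);
    Rect bounds() const;
private:
    std::string name_;
};

#endif // WIDGET_H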

It also bugs me that precompiled headers are not just an optimization,
but can change the semantics of the program being compiled. For
example, if a .cpp file starts with:

#define PRECOMPILED_HEADERS_OFF
#include "stdafx.h"

Then PRECOMPILED_HEADERS_OFF will be defined if the project doesn't use
precompiled headers, but won't be defined if precompiled headers are
turned on, because the compiler skips everything up to and including the
#include of the precompiled header and restores its state from the PCH,
so the #define is never seen.

I understand that implementing precompiled headers correctly and
efficiently without the restrictions Microsoft imposes is an extremely
hard problem. (In a previous job I actually looked into doing it.) So
I don't begrudge Microsoft for not doing it. But because they encourage
bad design I don't use precompiled headers in any of my projects until
the compilation times become completely unbearable. (Fortunately, I
almost never have to include windows.h.)
 
