J.B. Moreno
Arne Vajhøj said:
Either a distributed system or frequent updates -- otherwise what does
it matter if they have to download X amount of data, unzip it into Y
number of files or 1 file?
The teams producing different parts of the app become dependent
on each other.
No more so than with static linking -- it adds an additional step, but
everything else is otherwise unchanged.
MSIs, ILMerge, VISE -- lots of tools to turn the deployment into a
single application; there's really no difference between having a
deployment application and having plain XCopy application
deployment.
The difference comes in when updating or extending it -- the initial
download is the same.
Those terms are not that well defined.
But if you think:
large: > 1 million LOC
medium: 100,000 - 1 million LOC
then you are not that far from me.
So maybe 5-50 meg for a medium and 50+ for a large.
On a fast connection, you don't even need to break it up into smaller
pieces for frequent updates. On a slow connection, you better hope you
don't have to do frequent updates, because every portion is going to be
slow...
I am talking development process - not operations.
Then you're mixing things up -- if you're not actually using the
dynamic loading (by which I mean switching to a newer version of the
code without relinking on the user's machine) then it doesn't gain you
anything.
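To make that concrete, here is a minimal sketch in Java of the kind of dynamic loading meant above (all class names are hypothetical): the host program resolves the implementation class by name at runtime, so shipping a newer class file and changing one name swaps the code without relinking the host application.

```java
// Stable contract the host application is linked against.
interface ReportEngine {
    String version();
}

// Two versions of the implementation. In a real deployment each would
// ship as a separate class file (or jar) and only one would be present.
class ReportEngineV1 implements ReportEngine {
    public String version() { return "1.0"; }
}

class ReportEngineV2 implements ReportEngine {
    public String version() { return "2.0"; }
}

class DynamicLoadDemo {
    // Resolve the implementation by name at runtime. In practice the
    // name would come from a config file and the class from a freshly
    // downloaded jar on a URLClassLoader, not from the host's own code.
    static ReportEngine load(String className) throws Exception {
        return (ReportEngine) Class.forName(className)
                .getDeclaredConstructor()
                .newInstance();
    }

    public static void main(String[] args) throws Exception {
        // An "update" here is just pointing the name at the new class;
        // the host binary is untouched.
        ReportEngine engine = load("ReportEngineV2");
        System.out.println(engine.version());
    }
}
```

If you never actually switch implementations this way after deployment, the indirection buys nothing over static linking, which is the point above.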