Jon Skeet [C# MVP]
Peter Olcott said:
It is not taking very much effort to completely eliminate all factors
significant and otherwise. Certainly passing large aggregate data types by value
is significant. I want to make sure that I have a 100% understanding on how to
always avoid passing large aggregate data types by value. Boxing and Unboxing
can be comparable to passing a large aggregate data type by value.
Well, they're not really similar and boxing/unboxing is relatively rare
when generics are available. Even when boxing/unboxing *is* involved,
as shown in the benchmark you were worried about, the cost of the
boxing was negligible compared with the copying involved in the rest of
the benchmark (so neatly sidestepped by the "updated" C++ version which
made the comparison completely irrelevant).
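To make the boxing point concrete, here's a minimal sketch of the difference. The class and variable names are mine, not from the benchmark under discussion: the non-generic ArrayList stores everything as object, so adding an int boxes it and casting it back unboxes it, while the generic List&lt;int&gt; stores the ints directly with no boxing at all.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

class BoxingDemo
{
    static void Main()
    {
        // ArrayList stores object references, so each int is
        // boxed on Add and unboxed on the cast back.
        ArrayList boxed = new ArrayList();
        boxed.Add(42);             // boxing: int -> object
        int a = (int) boxed[0];    // unboxing: object -> int

        // List<int> stores ints directly; no boxing occurs.
        List<int> unboxed = new List<int>();
        unboxed.Add(42);
        int b = unboxed[0];

        Console.WriteLine(a == b); // True
    }
}
```

Even in the boxed case the cost is one small heap allocation per element, which is why it was negligible next to the bulk copying elsewhere in that benchmark.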
Peter Olcott said:
Imagine passing an array with millions of elements by value, instead of by
reference. Imagine further still that the only need of this array was to do a
binary search. Now we have a function that is 100,000-fold slower than necessary
simply because the difference between boxing and unboxing was not completely
understood.
If I do:
int[] x = new int[100000];
DoSomething (x);
how many bytes do you think are copied?
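The answer is just the size of one reference (4 or 8 bytes, depending on platform), never the 400,000 bytes of array data. A quick way to see that the callee receives the same array rather than a copy is to mutate it and observe the change from the caller (the method name here is mine, echoing the snippet above):

```csharp
using System;

class ArrayPassingDemo
{
    // The parameter is a copy of the *reference* to the array,
    // not a copy of the 100,000 elements.
    static void DoSomething(int[] data)
    {
        data[0] = 99; // writes into the caller's array object
    }

    static void Main()
    {
        int[] x = new int[100000];
        DoSomething(x);
        Console.WriteLine(x[0]); // 99 - both references point
                                 // at the same array
    }
}
```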
You seem to imagine that arrays are value types. Arrays are reference
types, so you could never pass the contents by value. That's why I've
said repeatedly that you *really* need to know more about value types
and reference types (and the fact that you very, very rarely *get* big
value types) before going much further. You have latched on to one
particular aspect of .NET and want to go very deeply into it without
getting a reasonable understanding of the rest of it. Getting the
basics right across the board will help you work out where it *is*
worth getting deeper understanding, and help you in achieving that
understanding too.
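A small sketch of the value-type/reference-type distinction being urged here (the struct and class names are illustrative only): assigning a struct copies its entire contents, so the copies are independent, while assigning a class instance copies only the reference, so both variables see the same object.

```csharp
using System;

struct PointStruct { public int X; } // value type: data copied on assignment
class PointClass { public int X; }   // reference type: reference copied

class ValueVsReferenceDemo
{
    static void Main()
    {
        PointStruct s1 = new PointStruct { X = 1 };
        PointStruct s2 = s1;     // full copy of the struct's data
        s2.X = 2;
        Console.WriteLine(s1.X); // 1 - s1 is unaffected

        PointClass c1 = new PointClass { X = 1 };
        PointClass c2 = c1;      // only the reference is copied
        c2.X = 2;
        Console.WriteLine(c1.X); // 2 - same underlying object
    }
}
```

Since well-designed .NET structs are kept small (a few fields at most), passing them by value is cheap; the "large aggregate passed by value" scenario essentially never arises with idiomatic types.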