[...]Meanwhile, I was guessing
that his data is an array of 2's complement numbers being transmitted as
an array of 8-bit bytes. [...]
2's complement is a binary data format for signed numbers. Bytes (that
is, the C# "byte" type) are unsigned.
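The distinction is easy to demonstrate. A quick sketch (in Python for brevity, since the thread itself contains no code): the very same byte patterns yield different numbers depending on whether you read them as unsigned, as C#'s "byte" does, or as two's complement signed, as C#'s "sbyte" does.

```python
# The same three byte patterns, read two ways.
raw = bytes([0xFF, 0x80, 0x7F])

# Unsigned interpretation (what C#'s "byte" gives you):
unsigned = [int.from_bytes(bytes([b]), "big", signed=False) for b in raw]

# Two's-complement signed interpretation (C#'s "sbyte"):
signed = [int.from_bytes(bytes([b]), "big", signed=True) for b in raw]

print(unsigned)  # [255, 128, 127]
print(signed)    # [-1, -128, 127]
```

Note that nothing about the bytes themselves changes; only the rule used to interpret them does.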
[...] So
when the OP says "hex" I interpret that as "binary data", since I think
there is a fair likelihood that that is what he means.
Since "hex" doesn't mean "binary data" (in fact just the opposite),
that's an incredibly odd conclusion to draw.
[...]
Sorry, but ASCII-Hex (or sometimes Hex-ASCII) is the term I have seen
applied to this type of representation. I didn't invent the term and so
feel no need to defend it, but it seemed to be in fairly wide use at
that time and place and nobody was claiming they didn't know what was
meant by it. Apparently you have never encountered it.
No, I haven't. Nor have the major Internet search engines. It's simply
not a standard term for anything. I don't know where you came across
it, but you can't expect to use terms that no one else is using and
still be understood.
More importantly, introducing "ASCII" into a conversation where there's
no indication that ASCII plays any part whatsoever just confuses the issue.
"Hex" is "hex", independent of encoding. Since it's text, at some
point encoding becomes an important question, but it's not part of the
textual representation of numbers.
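To make that concrete (a Python sketch; the particular encodings are just illustrative): the string "FF" names the number 255 no matter how the text itself happens to be encoded.

```python
text = "FF"

# The number the text names is fixed by hexadecimal notation alone:
print(int(text, 16))  # 255

# How the text is encoded is a separate question about the text stream,
# not part of the notation:
print(text.encode("ascii"))     # b'FF'          (2 bytes)
print(text.encode("utf-16-le")) # b'F\x00F\x00'  (4 bytes)
```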
[...]
Whereas to me hexadecimal, octal and binary are all numeric formats.
(And are all essentially binary in the sense I explained above.) And
this is where I get off when it comes to arguing about what "hex" means
- it's pointless.
There's no need for argument. The references are quite clear on the point.
For example, here's the Wikipedia page:
http://en.wikipedia.org/wiki/Hexadecimal
It goes on at length on various topics regarding hexadecimal
representation of numbers, and not once does the question of the actual
computer's internal representation of numbers come up.
You are right that it's pointless to argue about it. You are wrong to
suggest that hexadecimal is anything other than a way in human languages
to write numbers in a specific way. It's certainly not a synonym for
binary data (arranged in bytes or otherwise).
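The distinction is concrete: binary data is a sequence of bytes, while hex is text that can *represent* those bytes. A minimal Python sketch:

```python
data = bytes([0x00, 0x7F, 0x80, 0xFF])  # 4 bytes of binary data
text = data.hex()                       # its hexadecimal representation: text

print(text)                 # '007f80ff' -- an 8-character string, not 4 bytes
print(bytes.fromhex(text))  # converts the text back to the original bytes
```

The two round-trip losslessly, but they are different kinds of thing, which is exactly why conflating them in a question invites confusion.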
[...]
Peter, sometimes it seems like half of ALL your posts address that
very issue. It's partly a difference in approach. You seem to be trying
to bring him up to your standards of discourse so that, in the end, his
problem will be solved and the world will be a better and more precise
place - and that's fine.
You would do well to terminate your reasoning at the point of what you
can actually _observe_. You clearly have no idea what my motivations
are, and are simply compounding your errors by hypothesizing.
It's true that I often ask questioners to improve the precision of their
questions. But that's only because people who ask questions are often so
poor at forming a useful one, and because an imprecise question admits
no useful answer.
You can guess at what the OP means until the cows come home; the fact is,
without all the facts you're just wasting your time. You might stumble
across the right answer once in a while, but if you're answering any
significant volume of questions that way, you're going to be wrong more
often than not when the question is so poorly formed. It's better to
nail down the question first than to grope around for what you
think _might_ be an answer.
Suit yourself, but don't go casting aspersions at those who choose a
more efficient approach.
Pete