Converting byte[] to string


Marius Cabas

Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to a String, but the converted string is altered, so if I generate a new
wave file from that string, the content is altered from its original
state. Below is a code snippet I'm using:

// Here I read the wave file and convert the result to a string
byte[] b = new byte[35000];
FileStream fs = File.OpenRead("test.wav");
int size = fs.Read(b, 0, b.Length);
string dataSample = Encoding.ASCII.GetString(b, 0, size);

// Here I generate a second wave file from the result, but the generated
// file contains noise and disturbances
FileStream f = File.Create("test2.wav");
f.Write(Encoding.ASCII.GetBytes(dataSample), 0, size);

I also tried some other encoding types like UTF7, UTF8, Default and
ISO-8859-1, but the result is the same.
I'm using a function from an assembly which needs a String parameter as
input that should be a wave buffer.
Can somebody help me? What am I doing wrong? Is there another way to read a
binary file into a String object?
 

Jon Skeet [C# MVP]

Marius Cabas said:
Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to String

That's a very bad idea. Strings are for text data. Wave files are
binary data. Just keep it in byte array format.
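
For example, something along these lines (just a sketch, reusing the file
names from your snippet) copies the file without any string conversion, so
nothing gets corrupted:

// Read the wave file into a byte[] - no encoding involved anywhere.
byte[] data;
using (FileStream fs = File.OpenRead("test.wav"))
{
    data = new byte[fs.Length];
    fs.Read(data, 0, data.Length);
}

// Write the same bytes back out; the copy is byte-for-byte identical.
// (For very large files you'd loop until Read returns 0.)
using (FileStream f = File.Create("test2.wav"))
{
    f.Write(data, 0, data.Length);
}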
 

Guest

Marius,

You can use Convert.ToBase64String & Convert.FromBase64String to correctly
encode and decode binary data in a string.
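
For example, a quick sketch of the round trip (the file names are just
placeholders):

byte[] waveBytes = File.ReadAllBytes("test.wav");

// Base64 uses only printable ASCII characters, so every possible byte
// value survives the trip through a string and back.
string encoded = Convert.ToBase64String(waveBytes);
byte[] decoded = Convert.FromBase64String(encoded);

// decoded is byte-for-byte identical to waveBytes.
File.WriteAllBytes("test2.wav", decoded);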

However, I suspect this is not really what you want to do. It looks like the
problem may be in the semantics of the method in the assembly you are trying
to call. Does this really expect a string? And if so, perhaps there is some
documentation somewhere which specifies what format this string should be. As
Jon said in the previous post, strings are not used to store binary data.

If there is still a problem, perhaps you can post some details of the method
& assembly you are trying to call.

Cheers,
Chris.
 

Ignacio Machin \( .NET/ C# MVP \)

Hi,

Why are you converting a wave file that is a BINARY file to text ?

cheers,
 

Marius Cabas

Because I have to pass the binary stream to an SSL socket function that
takes a String object as a parameter.

Ignacio Machin ( .NET/ C# MVP ) said:
Hi,

Why are you converting a wave file that is a BINARY file to text ?

cheers,

 

Marius Cabas

Yeah, I know this, but I have to do it because I'm using an assembly written by
a third party. This assembly contains an SSL socket class that takes a
String object as a parameter. This String object holds the data to send over
TCP/IP via SSL. I have no choice :(

Jon Skeet said:
Marius Cabas said:
Hi, I'm a beginner so don't shoot ;)
I'm reading a wave file into a byte[] and I'm trying to convert the result
to String

That's a very bad idea. Strings are for text data. Wave files are
binary data. Just keep it in byte array format.
 

Jon Skeet [C# MVP]

Marius Cabas said:
Yeah, I know this, but I have to do it because I'm using an assembly written by
a third party. This assembly contains an SSL socket class that takes a
String object as a parameter. This String object holds the data to send over
TCP/IP via SSL. I have no choice :(

Hmm... I would contact the third party and check this. SSL is designed
for streams really - there's no justifiable reason why you *should*
have to specify everything in terms of strings. It's just asking for
trouble.

Are you able to specify the encoding the SSL code will use?
 

Marius Cabas

Jon Skeet said:
Are you able to specify the encoding the SSL code will use?

No, I have no control. I can only connect to a remote host using a port
number and I can set the certificates. Afterwards, I can read and write data
from/to the socket.
 

Jon Skeet [C# MVP]

Marius Cabas said:
No, I have no control. I can only connect to a remote host using a port
number and I can set the certificates. Afterwards, I can read and write data
from/to the socket.

And you can only read/write data from/to the socket in string form?
What a terrible interface.

Basically, you won't be able to transfer binary data correctly unless
you can use something like Base64 encoding at both ends. If you don't
have control over the other end, you're stuffed.
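
If you do control both ends, the idea would look roughly like this
(sslSocket, Write and Read are only stand-ins for whatever the third-party
class actually exposes):

// Sending side: encode the binary data as Base64 before handing it
// to the string-only API (hypothetical method names).
byte[] waveBytes = File.ReadAllBytes("test.wav");
sslSocket.Write(Convert.ToBase64String(waveBytes));

// Receiving side: decode the string back into the original bytes.
string received = sslSocket.Read();
byte[] waveBytesAgain = Convert.FromBase64String(received);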

Is there any way you can ditch this library and use a different one? It
sounds awful...
 
