Dealing with very large binary files

sebastian.harko

Hello,

What's the generally accepted strategy for dealing with very large
binary files in C#?
I have to write a program that reads "multi-frame bitmap" files
which can reach up to one hundred megabytes, so I need to know how to
optimize reading such a file.

Best regards,
Seb
 

Peter Duniho

What's the generally accepted strategy for dealing with very large
binary files in C#?

Very large binary files? Read a little bit at a time.
I have to write a program that reads "multi-frame bitmap" files
which can reach up to one hundred megabytes, so I need to know how to
optimize reading such a file.

Oh. I thought you were asking about "very large binary files". These
days, 100MB isn't really all that big. :)

That said, you are likely to find that the most important issue is to
make sure you read enough data at one time. You can use a
BufferedStream to ensure this, but my experience has been that even
FileStream has some built-in buffering (caching at the OS level) that
results in pretty good performance anyway.
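Pete's point about reading enough data at a time can be sketched roughly as follows; the file name and the 64 KB chunk size below are placeholders, not anything from the original post:

```csharp
using System;
using System.IO;

class ChunkedReader
{
    // Reads a file in fixed-size chunks and returns the total byte count.
    // Only one chunk is ever held in memory at a time.
    public static long ReadInChunks(string path, int chunkSize)
    {
        byte[] buffer = new byte[chunkSize];
        long total = 0;
        // Passing chunkSize as the FileStream buffer size keeps the
        // stream's internal buffering aligned with our reads.
        using (var stream = new FileStream(path, FileMode.Open,
                                           FileAccess.Read, FileShare.Read,
                                           chunkSize))
        {
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process buffer[0..read) here, then move on to the next chunk.
                total += read;
            }
        }
        return total;
    }

    static void Main()
    {
        Console.WriteLine("Read {0} bytes", ReadInChunks("large.bin", 64 * 1024));
    }
}
```

As Pete notes, wrapping the stream in a BufferedStream mainly pays off when the caller makes many small reads; with large reads like the above, FileStream's own buffering is usually enough.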

Do you have a specific issue that is coming up in which performance is
not satisfactory? Generally, it's better not to waste time optimizing
until you know what performance problem you're trying to solve; until
you have an actual performance problem, you can't know what to optimize.

Pete
 

Jonathan Wood

It really depends on what you will do with those files. But you might check
out memory-mapped files.
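For context, the BCL didn't expose memory-mapped files when this thread was written (it took P/Invoke then), but .NET 4.0 later added System.IO.MemoryMappedFiles. A minimal sketch of reading a single byte at an arbitrary offset without loading the whole file:

```csharp
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedExample
{
    // Maps the file and reads one byte at the given offset; the OS pages
    // in only the region that is actually touched.
    public static byte ReadByteAt(string path, long offset)
    {
        using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
        using (var view = mmf.CreateViewAccessor(offset, 1,
                                                 MemoryMappedFileAccess.Read))
        {
            return view.ReadByte(0); // position is relative to the view's offset
        }
    }
}
```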
 

Nicholas Paldino [.NET/C# MVP] (Monday, August 06, 2007 3:22 PM)

Seb,

Generally, for large amounts of data like this, you would process
everything in chunks, processing only what you have to, and then discarding
what you don't need when you move on to the next chunk.

You mentioned that you were dealing with files, so using the
FileStream class and reading in bytes as needed is essential (as opposed
to reading all the bytes in at one time).

Can you process one frame, accumulating the data you need, and then
discard it? This would probably be your best bet.

Of course, the more information you give about how you have to process
the files, the easier it would be to advise on what you are trying to do.
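A frame-at-a-time loop along the lines Nicholas describes might look like this. The frame layout here (a 4-byte little-endian length prefix followed by the frame payload) is purely hypothetical and would be replaced by the real multi-frame bitmap format:

```csharp
using System.IO;

class FrameProcessor
{
    // Walks the file one frame at a time, so only the current frame's
    // bytes are held in memory before being discarded.
    public static int CountFrames(string path)
    {
        int frames = 0;
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            while (reader.BaseStream.Position < reader.BaseStream.Length)
            {
                int length = reader.ReadInt32();          // hypothetical frame header
                byte[] frame = reader.ReadBytes(length);  // one frame's payload
                // ... accumulate whatever you need from 'frame' here ...
                frames++;                                 // then let it go out of scope
            }
        }
        return frames;
    }
}
```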
 

Ignacio Machin (.NET/C# MVP)

Hi,

Hello,

What's the generally accepted strategy for dealing with very large
binary files in C#?

It depends on your scenario and on the kind of processing you need to do.
I have to write a program that reads "multi-frame bitmap" files
which can reach up to one hundred megabytes, so I need to know how to
optimize reading such a file.

And do what with them? As I said, the best way of handling them depends
on your intended action.
 

Momo Vuk

Hello,

If we have a binary file (about 100 MB), how do we read this file quickly? I want to read/write this file. Every 80 bytes is one record (which needs conversion to string, datetime, number, etc.).

I'm looking for the best solution in C# for opening the file (an editor with a grid) and handling the data (changing values, deleting chunks, etc.).

For example:

A1 B2 CC DD FF FF FF FF 01 23 45 FC C1.... F8 - 80 bytes
Every 80 bytes should be shown as one row in the grid.

How do I implement the best solution?
What do you propose?
Could you send me an example of how to do it?

Thanks in advance!
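One way to sketch the fixed-record part of this question (80 bytes per record, one grid row each): the hex formatting below is just a stand-in for the real string/datetime/number conversions, which would come from the file's actual field layout:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;

class RecordReader
{
    public const int RecordSize = 80;

    // Reads fixed 80-byte records and formats each as a hex string,
    // one list entry per grid row. A trailing partial record is skipped.
    public static List<string> ReadRows(string path)
    {
        var rows = new List<string>();
        using (var stream = File.OpenRead(path))
        {
            byte[] record = new byte[RecordSize];
            while (stream.Read(record, 0, RecordSize) == RecordSize)
            {
                var sb = new StringBuilder(RecordSize * 3);
                foreach (byte b in record)
                    sb.AppendFormat("{0:X2} ", b);
                rows.Add(sb.ToString().TrimEnd());
            }
        }
        return rows;
    }
}
```

To write a row back, seek to `rowIndex * RecordSize` with `FileStream.Seek` and overwrite the 80 bytes in place; for a 100 MB file, loading only the rows currently visible in the grid (virtual mode) is likely a better fit than reading everything up front.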