Questions about writing to flash drives


Charlie Hoffpauir

Here's the situation... I have 1.5 GB of files (PDFs, HTML, text, etc.)
to write to multiple flash drives. I have as many as 5 USB ports
available on my computer. I'll probably be using 2 GB USB 2 flash
drives for this. Each copy from my hard drive to a test USB flash
drive takes almost an hour. I suppose I could set up 5 copies (start 1
to the first drive, then start the second to the second drive, etc)
and the hard drive would send the data fast enough to keep up with all
5 copies, but I was wondering if there is a better way to do this? (my
guesstimate is about 5 hours to copy 25 drives using this method) Is
there a way to make one file write to 5 different locations? My
transfers will probably be done by using a batch file to do a Robocopy
copy of my master set, unless I can do an EaseUS partition copy faster,
or find some other way. I'll only need to do this thing once a year,
and for about 25 USB drives, so it's not like a mass production
situation.
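
For what it's worth, the batch file I have in mind is nothing fancy.
Roughly like this, where C:\Master and the drive letters F: through J:
are just placeholders for my master folder and wherever the sticks mount:

@echo off
rem launch one Robocopy job per stick, each in its own window
start "copy to F:" robocopy C:\Master F:\ /E
start "copy to G:" robocopy C:\Master G:\ /E
start "copy to H:" robocopy C:\Master H:\ /E
start "copy to I:" robocopy C:\Master I:\ /E
start "copy to J:" robocopy C:\Master J:\ /E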

Another point, perhaps not pertinent, is that "some" of the files
would be the same every year, but probably "most" of the files would
be new or revised from the previous year. So this means using Robocopy
I could "pre-record" the files that stay the same year-to-year, to
reduce the copy time when I produce the final 25 "finished" copies.
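
My thinking on the "pre-record" step is that the yearly refresh would
just be another Robocopy pass over the same folder, since Robocopy
skips files that already match by size and time stamp -- something
like this, with the paths again only placeholders:

rem refresh a pre-recorded stick: unchanged files are skipped, removed files purged
robocopy C:\Master F:\ /MIR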
 

Charlie Hoffpauir

More questions....

I had my files stored on a USB3 30 GB Flash drive formatted FAT32, and
Windows properties says used space is 1.43 GB and 28.5 GB free. So, to
test how long it would take to record these files to a USB 2, 2GB
drive, I started the file transfer. After an hour I find that the
transfer failed because there wasn't enough space on the 2 GB drive.
In checking it, I find it's formatted FAT, with 1.92 GB used and 0
bytes free. Is it because it's formatted FAT rather than FAT32? I have
another drive I'm getting ready to run the test on, it's 8 GB,
formatted exFAT. If I reformat the 2 GB drive to FAT32 or exFAT,
should I then expect the files to fit?
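
I assume the reformat itself would just be the usual format command,
one or the other of these, with E: standing in for whatever letter the
stick gets:

format E: /FS:FAT32 /Q
format E: /FS:exFAT /Q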

One more point, because many of the files are small html files, there
are a huge number of files altogether. For example, the largest folder
has 50,000 files in it.
 

Paul

Charlie said:
Here's the situation... I have 1.5 GB of files (PDFs, HTML, text, etc.)
to write to multiple flash drives. I have as many as 5 USB ports
available on my computer. I'll probably be using 2 GB USB 2 flash
drives for this. Each copy from my hard drive to a test USB flash
drive takes almost an hour. I suppose I could set up 5 copies (start 1
to the first drive, then start the second to the second drive, etc)
and the hard drive would send the data fast enough to keep up with all
5 copies, but I was wondering if there is a better way to do this? (my
guesstimate is about 5 hours to copy 25 drives using this method) Is
there a way to make one file write to 5 different locations? My
transfers will probably be done by using a batch file to do a Robocopy
copy of my master set, unless I can do an EaseUS partition copy faster,
or find some other way. I'll only need to do this thing once a year,
and for about 25 USB drives, so it's not like a mass production
situation.

Another point, perhaps not pertinent, is that "some" of the files
would be the same every year, but probably "most" of the files would
be new or revised from the previous year. So this means using Robocopy
I could "pre-record" the files that stay the same year-to-year, to
reduce the copy time when I produce the final 25 "finished" copies.

Is the average file size particularly small?

Maybe the transfer is slowed down a bit by doing
small writes, which the USB key doesn't like.

As it is, your transfer rate is abysmally slow!

One way to fill a storage device is to "clone" it
from another storage device. Say, for example,
I define a partition just large enough to hold all the
files, with no slack space. Then I use a "disk dump"
utility to image the sectors in question. When
they're transferred to another USB stick, as a dd
transfer, the new stick then has an exact copy.

The difference between that and robocopy is that the write
operations can be done in larger blocks. The parameters
on the end are adjusted to copy exactly the clone file size.

dd if=C:\clonesource.dd of=\\?\Device\Harddisk2\Partition0 bs=1048576 count=1000

( http://www.chrysocome.net/dd
http://www.chrysocome.net/downloads/dd-0.6beta3.zip )

It would be attempting a low level write of 1MB chunks
at a time. Which might cause whole flash pages to be
erased and rewritten.
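
To make that clone file in the first place, you capture the prepared
master stick with the same tool. Something along these lines, where
Harddisk2 and the size numbers are only an example -- "dd --list"
shows the real device names and byte sizes, and bs times count should
add up to the device size:

dd if=\\?\Device\Harddisk2\Partition0 of=C:\clonesource.dd bs=1048576 count=1000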

Such an approach doesn't solve the head thrashing problem
all that much. But what you have working for you, is the
potential for the system file cache to already have the
needed data in RAM. So at some point, the disk light
may stop flashing, because your computer RAM may have
a copy. I would recommend 4GB of RAM in the computer, if
you have a 32 bit OS, and see if the 3.2GB free memory is
sufficient to keep 2GB of files cached while you're making
all these copies.

*******

If you want a quick experiment, use any ZIP program you
own, select "store" mode or no compression mode. That
will gather up the 1.5GB of loose files and make a single
1.5GB file from it. Then, time how long it takes
to write that big single file to your USB stick. Is that
method significantly faster ? If so, then either distribute
the materials as a ZIP, or, go the extra distance of
using a cloning approach.
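
If you have a command line ZIP tool handy -- 7-Zip, for example --
the whole experiment might look something like this, with the paths
and drive letter as placeholders:

rem build the archive on the hard drive, "store" mode, no compression
7z a -tzip -mx=0 C:\test\master.zip C:\test\master
rem then time the single-file copy to the stick
echo %time%
copy C:\test\master.zip F:\
echo %time%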

I should warn you, that free software for cloning, does
not necessarily like USB keys as either a source or a
destination. I did not pick "dd.exe" for this job out
of idle speculation. If you try to do it some "easy"
way, you may get stopped dead in your tracks. The "dd.exe"
method sucks, but it has the advantage that I know it
will write a USB key.

Doing it with a big ZIP is easy, but your end users
might not like ZIP no matter how prettily it is packaged.
You could have an autorun.inf which runs a ZIP self
extractor, but I'm sure that would scare the socks
off the average recipient. So don't do that... :)
Files should just lay there passively, if you want
to stay in the good graces of your recipients. The
USB key should not be seen to be "attacking" the computer.

Do your benchmarks first, and then get back to me if
you want some more help with dd. I didn't write a
complete recipe for how to use dd, and it's a bit
more complicated than using the ZIP method would be.

Paul
 

Paul

Charlie said:
More questions....

I had my files stored on a USB3 30 GB Flash drive formatted FAT32, and
Windows properties says used space is 1.43 GB and 28.5 GB free. So, to
test how long it would take to record these files to a USB 2, 2GB
drive, I started the file transfer. After an hour I find that the
transfer failed because there wasn't enough space on the 2 GB drive.
In checking it, I find it's formatted FAT, with 1.92 GB used and 0
bytes free. Is it because it's formatted FAT rather than FAT32? I have
another drive I'm getting ready to run the test on, it's 8 GB,
formatted exFAT. If I reformat the 2 GB drive to FAT32 or exFAT,
should I then expect the files to fit?

One more point, because many of the files are small html files, there
are a huge number of files altogether. For example, the largest folder
has 50,000 files in it.

Well, there is your problem right there. A huge number of files,
requiring the poor USB flash drive to do a lot of work for each
one. Think of the pounding the FAT table takes, while those
transfer over.

You can read about exFAT on Wikipedia. It is supposed to be
better for the mechanics of flash. The only problem with the format
is how many computers can read it. WinXP can, if you add the
driver package. Vista+ can as far as I know. But any recipients
with Win98 computers won't be able to read it.

Give that ZIP idea a try. Store all 50000 files and folders
in a single zip. Remember to prepare the ZIP file on your
hard drive. Zip the C:\hugefolder --> C:\hugefolder.zip
Then, when ready, copy C:\hugefolder.zip to Q:\ flash stick.

No matter what format (FAT32/NTFS/exFAT) you use with
the 2GB flash sticks, transfers are bound to go faster
once you use a ZIP file. And, you have the choice
of compressing the contents when making the ZIP,
or just using "store" mode, with no compression.
If the files were PDF only, some PDFs are pretty well
compressed on their own. So not all content is compressible.
I was doing a text file, for testing purposes yesterday, and
with the best compression format available, got it down
to 20% of original size. The file was the "Sent" text
file from Thunderbird (news reader). I needed a file
I knew would compress, for benchmarking purposes. So
not only will the mechanics of flash storage work better
with the single file, but with the moderate compression
available in ZIP format, you can also reduce your transfer
time for the ZIP archive.

The more modern versions of Windows can open ZIP files on
their own, and treat them as folders. Which may be sufficient
for the job. Otherwise the tool from here is
excellent, if your users have a modicum of
computer skills ( http://www.7-zip.org/ ).
It's not that the tool is hard to use, but
that conceptually it takes a few minutes to get
used to the file-explorer-like interface.
Every computer I regularly use, has a copy
of that program, it's that good.

ZIP files can also be made self-extracting. Instead
of hugefolder.zip you get hugefolder.exe. The file
is executable, where the front of the file has a
small amount of executable code, while the back
end of the file holds all the data. This is for
situations where the recipient doesn't have the
decompressor. But such a scheme doesn't necessarily
help a user on Linux, as the code portion might be
for Windows. Tools can still be used to open such
a file, as the Archive Manager could likely handle
it OK.
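
If you did want to try that, 7-Zip can build one directly with its
-sfx switch; the result is a 7z-format archive with a small extractor
stub in front. Roughly, with the paths as placeholders:

7z a -sfx C:\hugefolder.exe C:\hugefolder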

So some thought has to go into knowledge of the audience,
computer platforms involved and so on. I would think
hugefolder.zip should work with most all of them,
and give them hours of fun. Many modern Windows can
open the thing and make it look like a folder. From
which you can extract it.

If you don't have winzip, then that 7-zip program
is free, and has a great many options for
compressing stuff.

Paul
 

Flasherly

On Thu, 04 Sep 2014 16:25:10 -0500, Charlie Hoffpauir wrote:

Discounting the time, since this is something you can walk away from
and come back to when the job's finished, you could have 25 USB ports
going simultaneously (with hubs). Under Windows that would just about
wipe out any further operations, as there would be no more drive
letters available -- short of setting up virtual machines. Then
there's the cumulative overhead on the USB2, or 3, bus as each drive
is added: the sum bandwidth gets divided, so the individual drive
copies get increasingly slower and longer as the aggregate grows.

With five drives, practically speaking, that point -- the USB
bandwidth limit -- may not even be exceeded.

(So happens I was looking over old browsers, yesterday, and it's your
lucky day. Maybe.)

The software part:
http://www.oldapps.com/

A fan of Total Commander here - (it may have gone commercial, but I
see some old versions). TC gives me my fastest copies/copy operations
in my experience. Copy jobs can be initiated, then queued into a
background process, so another identical or similar copy can be
started right behind it - chaining them, for all I know, ad infinitum
(I'm not sure there's a limit on how many "copy windows", if any).

Another one I thought to try - TeraCopy (listed under MISC UTIL). It
has interesting buffering options. Still, it's probably best to get
all your source material into memory -- onto a virtual drive -- prior
to the operation. It looked good from what I saw, an easy install and
simple to run, although there were no realtime transfer speed reports.
I'd rather see those than have to time it myself, and I didn't keep
the program.

You might like TC, though. (Don't worry about Total Commander - it's
a total asskicker of a program that everybody should have available
in their arsenal of file maintenance tools.)
 

Charlie Hoffpauir

Thanks for the comments and suggestions. I'll certainly try the various
suggestions and see what looks best.

A bit more explanation.... This is intended to replace a CD that I've
used in past years. The CD is genealogy data, provided for the past 10
years at our family reunion.... and the data has grown too large for a
CD. Also, one of the "features" I intend to put on the flash drive is
essentially a web page with thousands of small html files. I'm sure
that's what is slowing down the process. In the past, I tried that
feature on the CD, but it ran so slow from the CD that I doubt many
people would use it. Another factor is the computer savvy level of most
of the users.... it was a challenge to get the CD set up so that all
that was required of the user was to insert it. Zipping the files
might work if I can get it to extract automatically to the flash
drive.... I doubt I could get the users to extract to their computer
and then locate the files to run them. The featured program does run
very well from the flash drive.... just about as fast as from my hard
drive. If anyone is curious about the application, I have a slightly
different one on a web page at http://my.rootsmagic.com/CRHoffpauir/ .
The one on the flash drive is larger....

Another consideration... These CDs have been sold by our Family
Organization (a registered non-profit organization) for a nominal
price... that's why I haven't put the data on the web and provided a
link to it..... We could decide to simply forget the sale and give the
data away on the web instead..... it might make it easier for everyone
that way.
 

Flasherly

.... it was a challenge to get the CD set up so that all
that was required of the user was to insert it. Zipping the files
might work if I can get it to extract automatically to the flash
drive.... I doubt I could get the users to extract to their computer
and then locate the files to run them. The featured program does run
very well from the flash drive.... just about as fast as from my hard
drive.


I used to run into that with a "backup routine" I'd include with
computers I sold. I tried various automated methods to simplify it,
but it would nevertheless often turn into a house call to look over
and take care of the computer. Even something as basic as a fully
automated boot CD was apparently not a viable solution. People show,
at least to me, a wide range of levels of complexity and perception
as to what computers ought, or ought not, to do within a personal
framework of ownership.

I could find acceptance in what occurs for people after I make it
happen. That they themselves cannot make it happen may be a foregone
conclusion, if I attempt to show them how to make it happen, or to
find out what is in the way of preventing it from happening. If what
they do not expect becomes a factor of ownership, they may already
have rejected other concepts for fundamental flaws, as not conducive
to their perceived principles and rights of ownership.

It's easy enough to imagine the churn on the likes of an eMachines
from Best Buy, derived from an errant Windows setting, say, where the
disillusionment broadly serves as enough disincentive against buying
another computer, if not a tablet. Especially if a technician is
charging $75 an hour, isn't working under warranty provisions, or for
free because you're a good enough person to be family.

Probably best to "test market" your efforts on a sampling of people,
to judge reactions for acceptance and to find solutions where they're
lacking -- a working, methodical approach. I had to inspire people,
show them the positives that they, too, could work with and apply
beneficially. In that way I was provided with what I needed. I'd only
keep computers long enough to learn them and then sell them, using
the money to buy advancements in technology as they came out.

House calls, as mentioned, became tedious, though. I wrote a letter
to the World Council of IT Professionals and they took my advice and
invented the Cloud.
 

Paul

Charlie said:
Thanks for the comments and suggestions. I'll certainly try the various
suggestions and see what looks best.

A bit more explanation.... This is intended to replace a CD that I've
used in past years. The CD is genealogy data, provided for the past 10
years at our family reunion.... and the data has grown too large for a
CD. Also, one of the "features" I intend to put on the flash drive is
essentially a web page with thousands of small html files. I'm sure
that's what is slowing down the process. In the past, I tried that
feature on the CD, but it ran so slow from the CD that I doubt many
people would use it. Another factor is the computer savvy level of most
of the users.... it was a challenge to get the CD set up so that all
that was required of the user was to insert it. Zipping the files
might work if I can get it to extract automatically to the flash
drive.... I doubt I could get the users to extract to their computer
and then locate the files to run them. The featured program does run
very well from the flash drive.... just about as fast as from my hard
drive. If anyone is curious about the application, I have a slightly
different one on a web page at http://my.rootsmagic.com/CRHoffpauir/ .
The one on the flash drive is larger....

Another consideration... These CDs have been sold by our Family
Organization (a registered non-profit organization) for a nominal
price... that's why I haven't put the data on the web and provided a
link to it..... We could decide to simply forget the sale and give the
data away on the web instead..... it might make it easier for everyone
that way.

If you burned a DVD with the information, that would
give you 4.7GB of room. Instead of the 700MB of the CD.
Burning optical media, using ISO9660 as an intermediary
form, should mean no content dependencies - if there
were a lot of small files, you don't pay a "transfer rate"
price for them.
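
Just as one way of doing that step (not the only one), a command line
tool such as mkisofs can build the intermediary ISO9660 image, which
any burning program can then write to the DVD. The volume label and
paths here are only placeholders:

mkisofs -J -R -V FAMILY2014 -o C:\reunion.iso C:\master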

ZIPping the files, in "Store" mode without compression,
is a quick way to test the "real" transfer rate of
the USB stick. That's just to show the transfer
rate could go faster.

Using "dd" to clone a USB flash drive, is an easy
way to avoid paying a "small file, transfer rate"
price for the files. You could prepare one USB stick
(using as many hours as it takes) and make that USB
stick your "master". Others can be re-generated from
the USB stick with a command like this.

dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk4\Partition0

With the master device Harddisk3 in the computer, you
would first verify with "dd --list" that you have
the correct ID for the thing. The labeling of physical layer
storage in "dd", should be the same order as seen in
Disk Management. Once you've verified that Harddisk3
is the source device, you could plug in empty
Harddisk4, Harddisk5, ... and from separate command
prompt windows, do parallel copies. In the
same form as the example command. In four separate
Command Prompt windows, with four devices, I could do
this (one of these commands per Command Prompt window).

dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk4\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk5\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk6\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk7\Partition0

Given the transfer rate of USB2 and the slow write rate
of commodity USB2 sticks, that would just about use up
the capacity of one USB2 logic block. Some motherboards
(like mine) have two USB2 logic blocks, so about 60MB/sec
total bandwidth for this kind of work.

Due to a bug in the "dd" port, where it doesn't reliably
detect the end of a USB flash stick, you can use
size parameters. For example "dd --list" gives the
actual physical layer storage size in bytes.
The product of the block_size and the count value,
should equal the total device size. That's if
cloning one entire USB flash to another entire USB
flash (same sized USB keys). You glue this onto
the end of the commands above, to help limit
the transfer command to an exact total byte count.
This example assumes the USB flash is 1,048,576,000 bytes.

bs=1048576 count=1000

If the device was older, perhaps you could use
quarter megabyte transfers, using these optional
parameters. The idea is, to attempt to deliver
a large enough block_size so that it matches
the internal preferences of the USB flash chip.
It's a lot better to use a larger block, than
the default of 512 bytes, which would be hard
on the USB flash stick.

bs=262144 count=4000

While there are probably further tweaks possible
with the command, I would just transfer the whole
(master) stick, to multiple slaves, all at the
same time.
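
If opening four Command Prompt windows by hand gets tedious, a small
batch file could fire them all off with "start". Same commands as
above; the Harddisk numbers still have to match what "dd --list"
reports, and the bs/count values (shown for the hypothetical
1,048,576,000 byte stick) still have to match the real device size:

@echo off
rem one dd copy per window, all reading the same master stick (Harddisk3)
start "to Harddisk4" dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk4\Partition0 bs=1048576 count=1000
start "to Harddisk5" dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk5\Partition0 bs=1048576 count=1000
start "to Harddisk6" dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk6\Partition0 bs=1048576 count=1000
start "to Harddisk7" dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk7\Partition0 bs=1048576 count=1000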

I can see that storing a ZIP archive on the USB stick,
might only make guys like me happy. A person who
just wants to see genealogy data, doesn't want to
have to do any post processing. I mean, you could
engineer things, such that the USB stick "unpacked"
itself on the recipient's hard drive, but again,
not everyone will appreciate your largess. Some
people will be security conscious, and want to
"scan" the stick before touching it. Others won't
have a clue, and they'll just want the stick to
do everything for them. No matter what you do,
I'm sure somebody will complain about what they
got with the provided media.

Providing the USB stick, means the recipient is
being given read/write media. Whereas the DVD
or CD is considered read-only (without going
multisession or something). A recipient could
damage their stick, by fooling around with the
content (toss in desktop Trash Can etc). While
there are USB flash sticks with a write protect
switch on them, I think that concept went the
way of the dodo bird, and you likely could
not find one if you needed it. If there was a
switch, you'd flip the switch to read-only,
before placing in the Fedex package.

If you place a USB key in letter-mail, without a box
to protect it, it could get crushed. I used
to work in a Post Office, so I have some
experience crushing stuff for a living :)
No matter what media you use, it should be
protected.

While I haven't had any recent reports,
some shipping paths in the US, use radiation
based scanners on the items being shipped. There
were a couple reports of motherboards arriving
with no BIOS contents in the flash chip. While
I assume they don't use quite the dosage implied
by that any more, it might still be possible for a
USB key to get erased in transit. If a recipient
complains the computer "wants to format the stick",
that's just one possible outcome for the USB stick.
Most AV scanners will be giving that stick the
once-over, when it is inserted, but that should
not damage anything. And some OSes may just
decide to be finicky eaters, and barf when
given the USB stick. So expect some "tech support"
issues with this operation (charge $5 for stick,
provide $100 worth of over-the-phone tech
support - you know the type of people I'm
thinking of).

Paul
 

Charlie Hoffpauir

If you burned a DVD with the information, that would
give you 4.7GB of room. Instead of the 700MB of the CD.
Burning optical media, using ISO9660 as an intermediary
form, should mean no content dependencies - if there
were a lot of small files, you don't pay a "transfer rate"
price for them.

I haven't tried executing the html files from a DVD, but my guess is
that they wouldn't perform much if any faster than from a CD... and
the slowness of the display of the application from a CD was the first
reason we thought of changing to a USB stick.

ZIPping the files, in "Store" mode without compression,
is a quick way to test the "real" transfer rate of
the USB stick. That's just to show the transfer
rate could go faster.

Using "dd" to clone a USB flash drive, is an easy
way to avoid paying a "small file, transfer rate"
price for the files. You could prepare one USB stick
(using as many hours as it takes) and make that USB
stick your "master". Others can be re-generated from
the USB stick with a command like this.

dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk4\Partition0

With the master device Harddisk3 in the computer, you
would first verify with "dd --list" that you have
the correct ID for the thing. The labeling of physical layer
storage in "dd", should be the same order as seen in
Disk Management. Once you've verified that Harddisk3
is the source device, you could plug in empty
Harddisk4, Harddisk5, ... and from separate command
prompt windows, do parallel copies. In the
same form as the example command. In four separate
Command Prompt windows, with four devices, I could do
this (one of these commands per Command Prompt window).

dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk4\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk5\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk6\Partition0
dd if=\\?\Device\Harddisk3\Partition0 of=\\?\Device\Harddisk7\Partition0

Given the transfer rate of USB2 and the slow write rate
of commodity USB2 sticks, that would just about use up
the capacity of one USB2 logic block. Some motherboards
(like mine) have two USB2 logic blocks, so about 60MB/sec
total bandwidth for this kind of work.

Due to a bug in the "dd" port, where it doesn't reliably
detect the end of a USB flash stick, you can use
size parameters. For example "dd --list" gives the
actual physical layer storage size in bytes.
The product of the block_size and the count value,
should equal the total device size. That's if
cloning one entire USB flash to another entire USB
flash (same sized USB keys). You glue this onto
the end of the commands above, to help limit
the transfer command to an exact total byte count.
This example assumes the USB flash is 1,048,576,000 bytes.

bs=1048576 count=1000

If the device was older, perhaps you could use
quarter megabyte transfers, using these optional
parameters. The idea is, to attempt to deliver
a large enough block_size so that it matches
the internal preferences of the USB flash chip.
It's a lot better to use a larger block, than
the default of 512 bytes, which would be hard
on the USB flash stick.

bs=262144 count=4000

While there are probably further tweaks possible
with the command, I would just transfer the whole
(master) stick, to multiple slaves, all at the
same time.


This looks like it would really be the way to go if we decide to use
the USB sticks. This is really good information for me, since I know
next to nothing about Linux (although I have played around with an
installation of Ubuntu on an old computer).

I can see that storing a ZIP archive on the USB stick,
might only make guys like me happy. A person who
just wants to see genealogy data, doesn't want to
have to do any post processing. I mean, you could
engineer things, such that the USB stick "unpacked"
itself on the recipient's hard drive, but again,
not everyone will appreciate your largess. Some
people will be security conscious, and want to
"scan" the stick before touching it. Others won't
have a clue, and they'll just want the stick to
do everything for them. No matter what you do,
I'm sure somebody will complain about what they
got with the provided media.

I think you've described things perfectly. Most of the users are
interested only in looking at the information. No extra processing,
and certainly nothing put on their computer permanently.

Providing the USB stick, means the recipient is
being given read/write media. Whereas the DVD
or CD is considered read-only (without going
multisession or something). A recipient could
damage their stick, by fooling around with the
content (toss in desktop Trash Can etc). While
there are USB flash sticks with a write protect
switch on them, I think that concept went the
way of the dodo bird, and you likely could
not find one if you needed it. If there was a
switch, you'd flip the switch to read-only,
before placing in the Fedex package.

If you place a USB key in letter-mail, without a box
to protect it, it could get crushed. I used
to work in a Post Office, so I have some
experience crushing stuff for a living :)
No matter what media you use, it should be
protected.

While I haven't had any recent reports,
some shipping paths in the US, use radiation
based scanners on the items being shipped. There
were a couple reports of motherboards arriving
with no BIOS contents in the flash chip. While
I assume they don't use quite the dosage implied
by that any more, it might still be possible for a
USB key to get erased in transit. If a recipient
complains the computer "wants to format the stick",
that's just one possible outcome for the USB stick.
Most AV scanners will be giving that stick the
once-over, when it is inserted, but that should
not damage anything. And some OSes may just
decide to be finicky eaters, and barf when
given the USB stick. So expect some "tech support"
issues with this operation (charge $5 for stick,
provide $100 worth of over-the-phone tech
support - you know the type of people I'm
thinking of).


Paul

Most of the CDs have been sold "at" the annual reunion, usually about
20-25 each year. However, I have sold a few via mail, which is no
problem with the CDs. Over the past 10 years, I've mailed about 25 of
them, and never had a problem.... but it does require we increase the
price to $15 each. Mailing the USB sticks is another complication I
hadn't taken into account... and it would probably mean a further
increase in cost.

I hadn't thought much about the possibility of a user accidentally
erasing the USB stick... but that would really be a problem. Before I
realized how long it would take to record all those html files to the
stick, I just imagined "correcting" an erased USB stick by simply
re-recording the data to it from my laptop at the next reunion.
However, now I see that really isn't practical.

The more I know about what is required, the more I'm thinking I've
been looking at this wrong. The main pressure to go to a flash drive
was to enable the use of these html files, which are essentially a
(huge) family tree with photographs. I'm beginning to think we'd be
better served by sticking with the use of the CD for the PDF files,
and simply giving a link to the web page for accessing the html files
with photographs. I can still produce a few copies on flash drives for
any relatives who for whatever reason, don't have web access or only
low speed web access.

I really appreciate all the comments and suggestions... this has been
a good learning experience for me.

Charlie
 

Paul

Charlie said:
Most of the CDs have been sold "at" the annual reunion, usually about
20-25 each year. However, I have sold a few via mail, which is no
problem with the CDs. Over the past 10 years, I've mailed about 25 of
them, and never had a problem.... but it does require we increase the
price to $15 each. Mailing the USB sticks is another complication I
hadn't taken into account... and it would probably mean a further
increase in cost.

I hadn't thought much about the possibility of a user accidentally
erasing the USB stick... but that would really be a problem. Before I
realized how long it would take to record all those html files to the
stick, I just imagined "correcting" an erased USB stick by simply
re-recording the data to it from my laptop at the next reunion.
However, now I see that really isn't practical.

The more I know about what is required, the more I'm thinking I've
been looking at this wrong. The main pressure to go to a flash drive
was to enable the use of these html files, which are essentially a
(huge) family tree with photographs. I'm beginning to think we'd be
better served by sticking with the use of the CD for the PDF files,
and simply giving a link to the web page for accessing the html files
with photographs. I can still produce a few copies on flash drives for
any relatives who for whatever reason, don't have web access or only
low speed web access.

I really appreciate all the comments and suggestions... this has been
a good learning experience for me.

Charlie

The "dd" program is available for Windows, and the syntax to
identify disks is a bit different than Linux. Still, I've used
this a lot for various projects. The only real bug it had,
was not detecting where the end of a USB flash key was.
And thus, the need to explicitly give block_size and count
for such cases.

http://www.chrysocome.net/dd

http://www.chrysocome.net/downloads/dd-0.6beta3.zip

That command is also helpful, if you ever need to back up
a hard drive with damaged file systems. Before you start
using tools to "repair" the disk. What that command doesn't
handle, is bad_blocks that return a CRC error. In those
cases, a special version called dd_rescue is used instead.
So there is a better program for emergencies. The "dd"
source quoted above, is for media where the device has
good blocks on it.

Paul
 
