Uniblue

  • Thread starter: Twayne
Jose said:
I have run many cleaners and now Uniblue. I did not pay for the full
cleaning capability (yes, you have to pay), but I did let it report
its results.

I have my own "clean" registry and several dirty copies that contain
installations of lots of messy things like Visual Studio and SQL
Server, which do not always uninstall cleanly. VS is so poor at
uninstalling that MS even has a KB article on how to uninstall it and
remove all the miscellaneous stuff it leaves behind in the registry.
They know it sucks at uninstalling.

My system runs just fine with all this stuff and it would go unnoticed
if you did not look for it. Things look like they are uninstalled -
no folders, shortcuts, icons, etc., but there sure is a lot of junk
left behind in the registry. I use these dirty registries to see
which cleaners find and report the most junk correctly and which ones
miss it.

Uniblue is just "okay" at finding things, but it lacks some features
of others I prefer. It is also unbelievably slack: it creates its own
registry entries that do not get removed even when you uninstall it,
so it rudely doesn't even clean up after itself. I had to say WTF at
some of the stuff.

Other deficiencies with Uniblue: you have only one option - clean all.
You cannot click and "go to" a registry value and look at it to decide
whether it makes sense or not. You can regedit your way to it from the
report, but that is time consuming. Uniblue missed a lot of
things other cleaners will find. It missed 1834 entries my preferred
inspection tool finds. They are not harmful things but are not
required in the registry. Uniblue is not the worst I have seen
though.

It is like malware scanning programs. Nobody knows everything; some
will miss things others pick up.

My system runs fine with my VS and SQL Server "dirty" registry, and I
can clean it up using the MS method; a few of the registry cleaners I
use will also report the stuff. But the stuff in the registry takes
up space. Somebody has to look at it and decide what to do about it.

I like to find everything and be able to understand it and make my own
decision about what to do about it.

I figure that everything in the registry that is not needed has to be
processed somehow - loaded, parsed and decided upon sooner or later
and maybe only once, but it takes time. It must take longer to load
and sort through a file with a bunch of junk in it than it does to use
a file that has less stuff in it.

I can follow up cleaning with an optimizer, and also defragment the
registry files to get smaller files - 50% smaller in my example
cases. Take my really dirty registry, clean it up, and it is half the
size. Does a smaller file take less time to process and load?
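Jose's question can be sketched as a toy experiment. The real hives
are binary files with their own format, so this is only an
illustration: a flat text stand-in (the file format, key names, and
entry counts below are all invented) parsed line by line, showing that
load time grows with the number of entries.

```python
import os
import tempfile
import time

def make_hive(path, entries):
    # Write a fake, flat "hive": one key=value pair per line.
    # (Invented format - real hives are binary, not text.)
    with open(path, "w") as f:
        for i in range(entries):
            f.write(f"HKLM\\Software\\Fake\\Key{i:06d}=value{i}\n")

def load_hive(path):
    # Parse every line into a dict, the way a naive loader might,
    # and time how long that takes.
    t0 = time.perf_counter()
    with open(path) as f:
        data = dict(line.rstrip("\n").split("=", 1) for line in f)
    return data, time.perf_counter() - t0

dirty = os.path.join(tempfile.gettempdir(), "dirty_hive.txt")
clean = os.path.join(tempfile.gettempdir(), "clean_hive.txt")
make_hive(dirty, 200_000)   # junk-laden hive, twice the entries
make_hive(clean, 100_000)   # "cleaned" hive, half the size

d, t_dirty = load_hive(dirty)
c, t_clean = load_hive(clean)
print(f"dirty: {len(d)} keys in {t_dirty:.3f}s")
print(f"clean: {len(c)} keys in {t_clean:.3f}s")
```

On this toy, the half-size file parses in roughly half the time;
whether the real hive loader behaves the same way is exactly the open
question in this thread.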

I am not afraid to try these things. I have never had a bad outcome
using any registry cleaner, and I can swap a dirty registry in and out
in seconds to test the next cleaner. I don't know why they have such
a bad reputation. I think folks that have bad experiences inflict the
damage on themselves (usually with the process-all buttons). Some
cleaners are actually quite revealing, and to me, anything that is not
necessary but still loads, or even needs a decision made about it,
slows my system down.

However, measured in benchmarks or boot times, the differences in
system performance are not significant to the average user. One of the
dirtiest registries I have increases boot time by about 13 seconds
(and that is moderately dirty). But 13 seconds is a long time to me
and I can prove that I can take 13 seconds off a boot time with a
cleaner registry - every time. The dirty one still boots just fine
and the system runs great, but it is just slower to load.

Is it noticeable and worth it? Probably not to the average user, but
I measure things in time and if I can shave off a second or 5 or 10, I
am pleased with the results.

Uniblue - I'll never use it.

That's a good write-up Jose; thanks much for sharing!

Somehow I didn't catch that it was a pay-for; perhaps I was reading with my
blind eye<g>; wouldn't be the first time.

I wish I'd had the forethought to collect a few dirty registries as you've
done; I create my own by various sometimes nefarious means and
install/uninstall things. Occasionally I'll re-image the drive to a
virgin, just-installed/updated state too, but I still don't save the
registry. I can
see where it makes a great control for evaluations to keep copies around
though.

I've never had a cleaner do any damage either, but I have what I hope
is a healthy paranoia about my production machine being changed on me,
so I think I'll stay with my sandbox. Besides, I have no other use for
that machine, so it's convenient; just re-image and go after pulling
any useful data off it.

I pretty much agree with everything you said actually, though not always for
the same exact reasons. I think the reasons are on the irrelevant side as
long as the same conclusions are reached and the logic holds. I can't think
of anything offhand to add to your post either.

I created a fresh disk image and got as far as starting their online scan
last night but the computer in my head crashed, so I killed it and left it
for today. However, based on your post, the things I've already
noticed, and the lack of anything very positive or negative in my
searches, I'll probably
just skip it now, thanks to you.
I still have a good week of things piled up from my hospital stay that
need going through anyway, so ... .

Regards, & thanks again,

Twayne
 
Jose said:
<snip>

There is an article here on the registry.

"Inside the Registry" By Mark Russinovich

http://technet.microsoft.com/en-us/library/cc750583.aspx

One thing that strikes me, in the description -

"To optimize searches for both values and subkeys, the Configuration
Manager sorts subkey-list cells and value-list cells alphabetically.
Then, the Configuration Manager can perform a binary search when
it looks for a subkey within a list of subkeys."

A binary search is O(log n).

http://en.wikipedia.org/wiki/Binary_search

What that means is, if the registry swells by 5%, the number of binary
steps to find something, hardly changes at all. One extra step would
be required, if the registry doubled in size.
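That growth behavior is easy to check numerically. A small sketch (the
100,000-subkey figure is just an assumed example) of how many
binary-search probes a lookup needs as a sorted subkey list grows:

```python
import math

def lookup_steps(n):
    # Worst-case probes for a binary search over n sorted subkeys.
    return math.ceil(math.log2(n)) if n > 1 else 1

n = 100_000                         # assumed subkey count
print(lookup_steps(n))              # 17
print(lookup_steps(int(n * 1.05)))  # 17 - a 5% swell changes nothing
print(lookup_steps(n * 2))          # 18 - doubling costs one extra probe
```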

It is possible, the binary search step is only a tiny fraction of
other operations needed, so it may not be the dominant thing affecting
performance.

You'll also notice in that article, that memory is used extensively to
hold registry activities, so the disk doesn't have to be consulted for
everything. There is write activity, from things like the lazy writer,
so there is still disk activity. But the disk activity doesn't
necessarily come from an attempt to find a key in the registry.

"When the Configuration Manager reads a Registry hive into memory, it
can choose to read only bins that contain cells (i.e., active bins)
and to ignore empty bins."

That means, once a hive is read into memory, fragmentation is going to
have less of an effect (if it even had an effect in the first place).

Now, if I look at an article like this, this isn't about "cleaning"
the registry. This one is about compaction.

http://www.informit.com/articles/article.aspx?p=1351988&seqNum=2

One of the Auslogics screenshots, shows the registry file shrinking
in size by 10%, and Auslogics says *something* is now 6% faster. I'd
want independent confirmation of that. And also an explanation
as to whether that is a 6% saving only during loading of the file,
or 6% saving at all times (like doing the binary search).

I think a motivated person, a promoter of registry fiddling, should be
able to create a test case such as the following.

1) Prepare a tool to exercise the registry. It would consist of
doing key lookups at random for values known to be in the registry.
(That would test the binary search thing.) A second step would be
to time the addition of a large number of keys. A third step would
be to remove the keys again, timing how long that takes. That
could be done with .reg files (pre-loading them into the file
cache might prevent disk I/O from being a factor).

2) Run the magic tool, whether it is a "cleaner" or a "compactor".

3) Rerun the test in (1) again.

If the testing tool undid all the changes it made, it should be
relatively safe. The testing tool is preferable to doing
something like installing Visual Studio, because that would pollute
the effort, with a lot of file I/O. Whatever method is used to
test the registry, it shouldn't involve a lot of its own file
I/O, so that the testing concentrates on the registry.
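A minimal, platform-neutral sketch of that three-step harness. The
real thing would use the Windows registry API (Python's winreg module,
say); here a sorted in-memory key list stands in for a hive, since the
Configuration Manager keeps subkey lists sorted, and the FakeHive
class and key names are invented for illustration.

```python
import bisect
import random
import time

class FakeHive:
    # Stand-in for a registry hive: a sorted key list, mirroring the
    # sorted subkey lists described in the Russinovich article.
    # On Windows you could swap these methods for winreg calls.
    def __init__(self, keys):
        self.keys = sorted(keys)
    def lookup(self, key):
        i = bisect.bisect_left(self.keys, key)
        return i < len(self.keys) and self.keys[i] == key
    def add(self, key):
        bisect.insort(self.keys, key)
    def remove(self, key):
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            del self.keys[i]

def exercise(hive, extra):
    # The three steps: random lookups, timed adds, timed removes.
    sample = random.sample(hive.keys, 1000)
    t0 = time.perf_counter()
    assert all(hive.lookup(k) for k in sample)
    t_lookup = time.perf_counter() - t0
    t0 = time.perf_counter()
    for k in extra:
        hive.add(k)
    t_add = time.perf_counter() - t0
    t0 = time.perf_counter()
    for k in extra:
        hive.remove(k)
    t_remove = time.perf_counter() - t0
    return t_lookup, t_add, t_remove

hive = FakeHive(f"Key{i:06d}" for i in range(50_000))
extra = [f"Junk{i:06d}" for i in range(5_000)]
before = exercise(hive, extra)
# ... run the cleaner or compactor here, then:
after = exercise(hive, extra)
print("before:", before)
print("after: ", after)
```

Because the harness removes every key it adds, the hive ends in the
state it started in, which is the "relatively safe" property described
above.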

My efforts to find any information of value have been hampered by
all the advertising for registry cleaners. It is pretty hard to
craft a search expression with the word "registry" in it, without
a lot of adverts for cleaners. I'd prefer to find a few more articles,
like that one by Russinovich.

At least that article cleared up one thing for me. The registry isn't
built on a database engine. It looks a lot more "home brew", than I
was expecting. If you look at the impact SQLite has on Firefox, maybe
that is a good thing (SQLite being a database engine, and having
a significant impact on how fast Firefox starts.)

Paul
 
Paul said:
<snip>

Was there a point to all that Paul?
 
Twayne said:
Was there a point to all that Paul?

Yes.

1) Does anyone in these discussions understand how the
registry "database engine" actually works ? And what
percentage of time or compute cycles on average, involve
accesses to that engine ?

2) Is there any article which benchmarks the database
engine, to show how sensitive it is to size, compaction
and the like ? Rather than just accepting what the
registry product claims on its status screen ?

Most of the discussions I see about the Registry, seem
to involve "feel good" comparison. Even if I look for
web articles, they still just accept what the tool
tells them, of some percentage improvement.

http://www.informit.com/articles/article.aspx?p=1351988&seqNum=2

How many problems posted in these groups, come from
people who buy products to "polish" their OS, and then
live to regret it ? At the very least, there should be
an honest expression about what is known or not known.
As in "I use registry cleaners to remove entries that
don't look correctly formed" but "I don't know whether
it makes any difference in the long run". At least then,
the potential person buying these things, has an honest
appraisal of what is known.

As opposed to just accepting every claim made by the
product manufacturer.

*******

One thing which has bothered me about this topic, is
reading articles like the one by Russinovich, where you
can see there was some care and attention to efficiency
in the design. And then, doing a search on the registry
with Regedit, and it takes an eternity. There seems to be
a disconnect between the two (the theory and the observation).
The performance on a search, isn't consistent with something
which is mainly stored in memory. It should run a lot faster
than it does. And I've yet to see explanations for that behavior.
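One plausible explanation for the disconnect (an assumption on my
part, not something the article states): the binary search only helps
when resolving a *known* key path. A Regedit-style Find for a
substring has no sorted order to exploit, so it must visit every key,
which is O(n) no matter how well the hive is organized. A toy
comparison (key names invented):

```python
import bisect

# A sorted list of 100,000 fake key paths stands in for a hive.
keys = sorted(f"Software\\Vendor{i:05d}\\Setting" for i in range(100_000))

def direct_lookup(path):
    # Resolving a known path: binary search, ~log2(n) probes.
    i = bisect.bisect_left(keys, path)
    return i < len(keys) and keys[i] == path

def regedit_find(substring):
    # A Regedit-style Find: must scan every key; no index helps.
    return [k for k in keys if substring in k]

print(direct_lookup("Software\\Vendor00042\\Setting"))  # True
print(len(regedit_find("Vendor099")))                   # 100
```

The direct lookup touches about 17 of the 100,000 keys; the Find
touches all of them, which would fit the slow searches observed.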

Paul
 
Paul said:
<snip>

I'd say the difference is like driving to a known address, vs. driving
around looking for a blue house with white shutters and a detached garage.
 