Copy Unique Records ... Formula?


Ken

I know I can use "Advanced Filter" to copy unique records
to another location of the spreadsheet. Question is ... can
this be achieved with a formula?

Something like this in the target cell ...
UniqueRecs(Range,Criteria)

I have several Excel 2000 templates where detailed data is
pasted from Access into the spreadsheet ... I format this
data ... then to the right of the detail I wish to create
various summarizations. (i.e., I have 10,000 operations,
but only 150 PNs ... the 10,000 records are pasted into
Excel ... Advanced Filter's "Unique records only" gives the
150 PNs, which I can now write various summarization
formulas against.) However, what I would prefer to do is to
go to the target cell & simply have a formula there that
would automatically fill down the unique records found
when the initial paste of the detail occurs.

Can this be achieved??? ... Thanks ... Kha
 

Harlan Grove

See revised Method 1 described at

http://makeashorterlink.com/?P55021C06

and http://makeashorterlink.com/?T22A12286

which takes up the method JMay refers to.

Depending on what the OP's criteria may be, these may be either too much or too
little. Without any criteria other than no duplicates, there's no need for
ancillary calculations. If there were N distinct entries in List, then only N
formulas are required to extract the N distinct entries.

Top cell (N2):
=INDEX(List,1)

Next cell (N3): [array formula]
=INDEX(List,MATCH(0,COUNTIF(N$2:N2,List),0))

Fill N3 down until the formula evaluates to #N/A. If there are criteria
involved, e.g., pulling only entries from List corresponding to values over 50
in Amounts, change the formulas to

Top cell (N2): [array formula]
=INDEX(List,MATCH(TRUE,Amounts>50,0))

Next cell (N3): [array formula]
=INDEX(List,MATCH(0,COUNTIF(N$2:N2,List)+1-(Amounts>50),0))
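To see why the COUNTIF construction works, consider a tiny hypothetical case:
List holds {10;10;25} and N2 already shows 10. In N3, COUNTIF(N$2:N2,List)
evaluates to {1;1;0}, MATCH(0,{1;1;0},0) returns 3, and INDEX(List,3) returns
25. In other words, each cell pulls the first List entry that hasn't already
been extracted above it. In the criteria version, 1-(Amounts>50) adds 1 for
every row failing the test, so only rows that are both unextracted and over 50
can produce the 0 that MATCH is hunting for.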

That said, wrapping Data > Filter > Advanced Filter (Copy to another location,
Unique records only) inside a macro would provide a more efficient runtime
solution, especially if the selection criteria were complex.
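For reference, a bare-bones sketch of such a macro; the source and destination
addresses here are assumptions, not taken from the OP's workbook:

Sub CopyUniquePNs()
    ' Copy the distinct entries from the pasted detail column
    ' (assumed to sit in A1:A10001 with a header in A1) to N1.
    Dim src As Range
    Set src = ActiveSheet.Range("A1:A10001")
    src.AdvancedFilter Action:=xlFilterCopy, _
                       CopyToRange:=ActiveSheet.Range("N1"), _
                       Unique:=True
End Sub

The same AdvancedFilter call also accepts a CriteriaRange argument, so any
selection criteria could be applied in the same pass.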
 

Aladin Akyurek

Harlan Grove said:
See revised Method 1 described at

http://makeashorterlink.com/?P55021C06

and http://makeashorterlink.com/?T22A12286

which takes up the method JMay refers to.

[...]

Method 1 sorts, eliminates duplicates and blanks from a 1D range. It's quite
fast. The link I quoted lists lots of methods. We should also add the
formula set you posted to the collection, along with the Paul Cumbers
method, enhanced for robustness as I suggested. Next step would be some
performance tests.
 

Harlan Grove

Aladin Akyurek wrote:
Method 1 sorts, eliminates duplicates and blanks from a 1D range. It's quite
fast. The link I quoted lists lots of methods. [...]

To be honest, I gave up trying to find Method 1 in the morass of MrExcel. Does
it need only N cells to pull N distinct entries from a list? If not, it may be
faster, but it likely uses more memory, so there's a trade-off.

As for time trials, I'd suspect a macro running an advanced filter extracting
'unique' values would be faster than most formula methods if the list involved
were large (10K or more records).
 

Guest

Hello, while this thread is dated, I have a cent or two to add here.

Ken wrote:
I have several Excel 2000 templates where detailed data is pasted from Access into the spreadsheet ... I format this data ... then to the right of the detail I wish to create various summarizations. (i.e., I have 10,000 operations, but only 150 PNs ... the 10,000 records are pasted into Excel ...

With this in mind, I think your best bet in terms of performance would be to perform the appropriate data streamlining (unique fields or records) in Access. One would need to test to be certain, though ... Assuming you're using Access 2000, I'd pass a little SQL to an ADO Recordset. If your returned recordset is simply grabbing a single field, use 'Select Distinct'. If you're grabbing multiple fields, you'll want to 'Group By' the field which is to be tested. Here's an example procedure which is actually run from Excel with the same concept:

http://www.mrexcel.com/board2/viewtopic.php?t=7389
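In case that link goes stale, here's a rough late-bound sketch of the same idea run from Excel; the database path, table, and field names are invented for illustration:

Sub DistinctPNsFromAccess()
    ' Late binding: no reference to the ADO library is required.
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Data\Production.mdb"        ' hypothetical .mdb
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT DISTINCT PN FROM tblOperations", cn ' hypothetical table/field
    ActiveSheet.Range("N2").CopyFromRecordset rs        ' dump results onto the sheet
    rs.Close
    cn.Close
End Sub

For whole records rather than a single field, swap the DISTINCT for a GROUP BY on the field being tested, as described above.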

But I think you'd be better off running a similarly constructed procedure in Access with early binding, for both performance and risk reduction.

Food for thought; best of luck with your quandary.

Regards
Nate Oliver
 
