Import external data exceeding 65,536 rows


Stacey

I realize Excel has a row limit of 64K here, and I've seen
solutions for reading a text file having > 65,536 rows into Excel via a
macro and adding a new sheet after the 65,536th row. But I am
attempting to retrieve a result set (from MSQuery) which has more rows
than Excel allows on one sheet, and I would like Excel to
auto-generate a new sheet when the result-set import exceeds the
limit. Basically, I'd like to import all rows from the result set
into Excel, using multiple sheets. Does anyone have any sample code
for auto-generating a new sheet based on result-set data (rather than
reading from a text file)?

Thanks,
Stacey
 

Guest

I don't have any sample code to hand, but I can suggest some ideas.

The Office Web Components (OWC) spreadsheet can accommodate 262,144 rows and
18,278 columns. It can port its data to Excel provided that Excel's limits
are not breached. Unfortunately, it does not have the CopyFromRecordset
method, so you'll have to write out your recordset with your own code. OWC is
licensed software but can be downloaded from Microsoft for evaluation.

Instead of MSQuery, you can use ADO to read your source. ADO has a PageSize
(number of rows) property which you can set to 65,536 and then loop through
the PageCount property to retrieve all records.
 

Tim Williams

If you have code for a text file then it could be altered relatively easily
to work with a recordset, but you would have to "hand code" the database
query - I'm not sure MSQuery can handle this itself. You don't mention the DB
being used (assuming that's where the data is from).

One question might be what you intend to do with it once it's in Excel -
when split over multiple sheets it's not going to be very usable.

Tim.
 

Guest

Hi Stacey:

Stacey said:
I realize Excel has a row limitation of 64K here,
Are there more elsewhere?
I am attempting to retrieve a result set (from MSQuery) which has more rows
than Excel allows on one sheet

Excel allows 65K+ rows in each column; you can use the other columns as well.
Basically I'd like to import all rows from the result set
into Excel, using multiple sheets. Does any one have any sample code
for auto generating a new sheet based on resultSet data (rather than
reading from a text file?)

Well yes I do, using ADO, and you could bring it into the same
sheet or wherever you want it.

For example, this puts the first 65K rows of a 3-field query into columns A, B, and C:

'Rs is an ADODB.Recordset opened elsewhere
cnt = Rs.RecordCount    '//get the record count if you need it
MsgBox "Count = " & cnt

Dim irow As Long

irow = 3

With Rs
    .MoveFirst
    Do Until irow = 65536
        Worksheets("Sheet3").Range("A" & irow) = Rs("ProductID")
        Worksheets("Sheet3").Range("B" & irow) = Rs("ProductName")
        Worksheets("Sheet3").Range("C" & irow) = Rs("UnitPrice")
        Debug.Print Rs("ProductName")
        .MoveNext
        irow = irow + 1
    Loop
End With

cnt2 = Rs.AbsolutePosition '// where we are in the recordset
MsgBox "Position = " & cnt2

'/// here we move to D, E, F for the balance of the recordset
'/// note that the recordset pointer remembers its position, or you can
'/// get it with the code above ("Rs.AbsolutePosition")

irow = 3
With Rs
    Do Until .EOF
        Worksheets("Sheet3").Range("D" & irow) = Rs("ProductID")
        Worksheets("Sheet3").Range("E" & irow) = Rs("ProductName")
        Worksheets("Sheet3").Range("F" & irow) = Rs("UnitPrice")
        .MoveNext
        irow = irow + 1
    Loop
End With

That's one way, Stacey.

If it happens to be an Access db, post back and I'll post you
some connection code that you can play around with.

By the way, I brought 88,000 3-field records into columns A B C
and D E F in 64 seconds on an old laptop (1.6 GHz, I think).

Good Luck
TK
 

Jamie Collins

TK said:
This puts the first 65k + 3 field Query into col a b c
<<snipped>>
By the way I brought 88,000 3-field records into columns a b c
and d e f in 64 seconds on an old laptop, 1.6 GHz I think.

Here's an extension to this idea. Assuming the table has a key, you
could sort on it, navigate to row 65,535 (or whatever), set an
appropriate Filter using the key values at rows 1 and 65,535 so the
recordset then contains only 65,535 or fewer rows, use GetRows or
CopyFromRecordset to read all (visible) rows to the worksheet, remove
the Filter, and repeat until EOF. I do not have the code/database to
test (Products in my Northwind has only 77 rows <g>) but I would
expect this to be faster than writing line by line if a suitable key
were available.
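
A rough, untested sketch of that approach, assuming an open client-side
ADODB.Recordset (Rs) whose source has a unique numeric key; the field
name "ID" is just a placeholder for your own key:

Sub CopyInKeyedChunks(Rs As Object)
    Const CHUNK As Long = 65535          'rows per sheet
    Dim loKey As Variant, hiKey As Variant
    Dim ws As Worksheet

    Rs.Sort = "ID"                       'Sort needs a client-side cursor
    Rs.MoveFirst
    Do Until Rs.EOF
        loKey = Rs.Fields("ID").Value
        'Step forward to the last row of this chunk (or the last record)
        Rs.Move CHUNK - 1
        If Rs.EOF Then Rs.MoveLast
        hiKey = Rs.Fields("ID").Value
        'Restrict the recordset to just this key range ...
        Rs.Filter = "ID >= " & loKey & " AND ID <= " & hiKey
        Set ws = Worksheets.Add
        ws.Range("A1").CopyFromRecordset Rs
        '... then lift the filter and continue from the next key up
        Rs.Filter = ""
        Rs.Find "ID > " & hiKey          'EOF when no more records
    Loop
End Sub

Untested, as I say; CopyFromRecordset on a filtered recordset copies only
the rows visible through the filter.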

Jamie.

--
 

Jamie Collins

AA2e72E said:
I don't have any sample code to hand: I can suggest some ideas.

ADO has a PageSize
(number of rows) property which you can set to 65,536 and then loop through
the PageCount property to retrieve all records.

Could you post some code that demonstrates how to do this, please?

Many thanks,
Jamie.

--
 

Guest

Jamie Collins said:
Here's an extension to this idea. Assuming the table has a key, you
could sort on it, navigate to row 65,535 (or whatever), set an
appropriate Filter using the key values at rows 1 and 65,535 so the
recordset then contains only 65,535 or fewer rows, use GetRows or
CopyFromRecordset to read all (visible) rows to the worksheet, remove
the Filter, and repeat until EOF. I do not have the code/database to
test (Products in my Northwind has only 77 rows <g>) but I would
expect this to be faster than writing line by line if a suitable key
were available.

Jamie.

--

Hi Jamie:

I see posts all the time referring to the 65K limit in Excel being
described incorrectly. It is, as you know, a per-column limitation.
I posted the procedure merely to prove that point. By no means
do I advocate it, and I even posted the time it took. My
advice "for what it's worth, if anything" has always been to narrow
the cursor at the db level.

Also, from my experience I don't think the answer is GetRows.
I have timed CopyFromRecordset and GetRows, and CopyFromRecordset
was faster in the test, though you don't have an array to work with
if you want to read it back in.

Good Luck
TK
 

Stacey

Thank you all for the great responses. Your feedback is greatly
appreciated and is definitely getting me on the right track. In
response to some of the open ends I left:

You don't mention the DB
being used (assuming that's where the data is from).
The database is SQL Server. MSQuery is just what Excel "seems" to use to
write the SQL statement with. I'm open to others.
One question might be what you intend to do with it once it's in Excel ? -
when split over multiple sheets it's not going to be very useable.
Hey, I don't make the demands, I just attempt to fill them.
Personally, how is 65K rows readable/usable?! I've used pivot tables
where I can to reduce the number of rows retrieved, hoping that makes
it more useful to the end user.

I'm not familiar with ADO but am seeing it could be very useful here
and am giving it a try.
Thanks again!
Stacey
 

Guest

Jamie:

I thought you might find this interesting.

"the db I used here is trash, just one I use to play around with
to many dups ect to be of value or to share 1 table 3 fields
thats why I couldn't set the value at 65+"

I wrote a simple select query in Access

"SELECT DISTINCTROW TOP 30500 Products.ProductID,
Products.ProductName, Products.UnitPrice FROM Products;"

Called the query from excel with the connection object,
used CopyFromRecordset to write it into a sheet it took

1.3 sec

Using the same select statement with CopyFromRecordset in Excel
to write to a sheet, it took

.59 sec

Using a loop:

strSql = "SELECT Products.ProductID, Products.ProductName, " & _
         "Products.UnitPrice FROM Products;"
With Rs
    .MoveFirst
    Do Until irow = 30502
        Worksheets("Sheet3").Range("A" & irow) = Rs("ProductID")
        Worksheets("Sheet3").Range("B" & irow) = Rs("ProductName")
        Worksheets("Sheet3").Range("C" & irow) = Rs("UnitPrice")
        .MoveNext
        irow = irow + 1
    Loop
End With
14.4 sec

So I guess my premise of doing as much as you can at the db level
doesn't mean much. But then, what the hell, this is just a hobby.
Have a happy Thanksgiving.

TK
 

Jamie Collins

TK said:
I thought you might find this interesting.

I wrote a simple select query in Access

"SELECT DISTINCTROW TOP 30500 Products.ProductID,
Products.ProductName, Products.UnitPrice FROM Products;"

Called the query from excel with the connection object,
used CopyFromRecordset to write it into a sheet it took

1.3 sec

Using the same select statement and CopyFromRecordset in excel
to write to a sheet it took

.59 sec

I agree. Using a 60K-row by 100-column recordset, my results were:

5.438 secs using CopyFromRecordset
9.250 secs using Application.Transpose(rs.GetRows)

BTW when I tried 60K rows by 255 columns it was 14 secs for
CopyFromRecordset whereas GetRows caused an error, 'Not enough storage
is available to complete this operation', for which I assume the
limitation lies with the Transpose() function.
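
For reference, the GetRows route I timed looks roughly like this
(assuming an open recordset rs): GetRows returns a 0-based
(field, record) Variant array, so it has to be transposed into Excel's
(row, column) orientation before writing, which is presumably where the
storage limit bites:

Dim v As Variant
v = rs.GetRows                      'v(field index, record index), 0-based
With Worksheets("Sheet1")
    'Size the target range from the array's dimensions, then transpose
    .Range("A1").Resize(UBound(v, 2) + 1, UBound(v, 1) + 1).Value = _
        Application.Transpose(v)
End With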
Have a happy thanksgiving

We don't celebrate it in my country (UK), we go mad at xmas instead
<g>. But I'll enjoy the day, thanks, and hope you will too.

Jamie.

--
 

Jamie Collins

TK said:
I see posts all the time referring to the 65K limit in Excel being
described incorrectly. It is, as you know, a per-column limitation.
I posted the procedure merely to prove that point.

I don't understand. The column limit (Jet) is 255 and your example
uses 3 columns therefore the recordset would need to be 16+ million
rows to hit the *column* limit. I assume you know about needing to
transpose the GetRows array to match Excel's (row,column) format so
what do you mean?

Thanks again,
Jamie.

--
 

Guest

Here's some code, annotated and tested: I have used a WORKBOOK as the source
data; modify the code to use another source.

Sub Partition2()
    'Get the source data
    Cnn = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
          "Data Source=c:\ajay2.xls;Extended Properties=Excel 8.0;"
    Sql = "SELECT * FROM [Sheet1]"
    Set ADORS = CreateObject("ADODB.RecordSet")
    'Number of records in a single sheet
    ADORS.PageSize = 20000
    ADORS.Open Sql, Cnn, 1, 3    ' adOpenKeyset, adLockOptimistic
    'Create an Excel workbook ...
    Set xl = CreateObject("Excel.Application")
    'Hold the user's default number of sheets
    UserWBS = xl.SheetsInNewWorkbook
    With ADORS
        '... with the right number of worksheets to hold all records
        xl.SheetsInNewWorkbook = .PageCount
        xl.Workbooks.Add
        xl.SheetsInNewWorkbook = UserWBS
        xl.Sheets.Select         'group all sheets so headers go on every one
        xl.Sheets(1).Activate
        'Add field names to row 1 in all sheets
        For i = 0 To .Fields.Count - 1
            xl.Cells(1, i + 1).Select
            xl.Selection.Font.Bold = True
            xl.Selection.FormulaR1C1 = .Fields(i).Name
        Next
        'Populate each sheet in turn
        For i = 1 To .PageCount
            xl.Sheets(i).Select
            xl.Range("A2").CopyFromRecordset ADORS, .PageSize, 256
        Next
        xl.Sheets(1).Select
        xl.Sheets(1).Range("A1").Select
    End With
    'Tidy up
    ADORS.Close
    Set ADORS = Nothing
    'Make Excel visible; the user must decide whether to save it
    xl.Visible = 1
    Set xl = Nothing
End Sub
 

Guest

Jamie

That is my point.
Stacey wrote:
...... Excel has a row limitation of 64K here, and I've seen
solutions for reading a text file having > 65536 rows into Excel via a
macro and adding a new sheet after the 65536th row. But I am
attempting to retrieve a result set (from MSQuery) which has more rows
than Excel allows on one sheet

65K rows per column * 256 columns (I thought it was 256, but who's
counting), or 16,777,216 cells.

Also, you might find this interesting: in playing around with
an 88K recordset, CopyFromRecordset can handle the output.

It does not fail; it merely stops at row 65,536 if you do not redirect
the output, as in the example below.


'/Starts at A10 but does not stop at A1000 (it runs on down the sheet)
Worksheets("Sheet3").Range("A10:A1000").CopyFromRecordset Rs
'/If there are more than 65,536 records you must add a second line
'/Starts at D10 but does not stop at F10000
Worksheets("Sheet3").Range("D10:F10000").CopyFromRecordset Rs

Good Luck
TK
 

Guest

Jamie

TK said:
Jamie

Also you might find this interesting, in playing around with
a 88k recordset CopyFRomRecordset can handle the output.
It does not fail it merely stops at 65,536 if you do not redirect
the output as in the example below.

'/Starts at A10 but does not stop at A10000
Worksheets("Sheet3").Range("A10:A1000").CopyFromRecordset Rs
'/If more than 65,536 records you must add a second line
'/Starts at D10 but does not stop F???
Worksheets("Sheet3").Range("D10:f10000").CopyFromRecordset Rs

The above is not false, it is just not that accurate; in further testing
the following will also bring 88,000 3-field records into columns
A B C and the balance into D E F:

Worksheets("Sheet3").Range("A:A,D:D").CopyFromRecordset Rs

Good Luck
TK
 

Jamie Collins

TK said:
you might find this interesting, in playing around with
a 88k recordset CopyFRomRecordset can handle the output.

It does not fail, it merely stops at 65,536

Ah, this is something I hadn't appreciated, i.e. that the current
record stops at the maximum row limit rather than proceeding to EOF.
Thanks for that.
I thought it was 256 columns but who's counting

Excel has a 256 column limit. The column limit in a Jet *table* is 255
columns and AFAIK you cannot construct a *recordset* with more than
255 columns either when the source is Jet.

Jamie.

--
 
