Speeding up a cross-machine query


We have a number of queries that run across a foreign ODBC data provider
based on Pervasive 7. We manage additional data in our SQL Server database,
and have a number of Access queries that combine data from both using various
JOINs and WHERE filters on linked tables.

I think these queries take too long -- minutes in some cases. In particular,
one recently went from taking perhaps 30 seconds to 4 minutes as a result of
converting it to a pass-through query. I would like to gain a fuller
understanding of these sorts of queries so I can apply my SQL knowledge to
making them run faster.

One solution I have been considering is linking the Pervasive tables
into SQL Server. As I understand it, once you do this, queries are sent to
the query processor in SQL Server, which then applies its own optimizer to
decide which "sub-queries" to run on each database. If so, it would seem that
SQL Server is doing the heavy lifting when it comes to queries, and
anything we send it should do a reasonable job.
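For concreteness, here is roughly what I imagine the setup would look like.
The server name, DSN, and table names below are all placeholders, and I have
not verified the exact provider details for Pervasive 7:

```sql
-- Register the Pervasive system as a linked server over ODBC.
-- MSDASQL is the OLE DB provider for ODBC; 'PervasiveDSN' is a
-- hypothetical system DSN pointing at the Pervasive 7 database.
EXEC sp_addlinkedserver
    @server     = 'PERVASIVE',
    @srvproduct = '',
    @provider   = 'MSDASQL',
    @datasrc    = 'PervasiveDSN';

-- A distributed query joining a local table to a remote one via a
-- four-part name; SQL Server's optimizer decides how to split the work.
SELECT l.OrderID, r.ItemDesc
FROM   dbo.Orders AS l
JOIN   PERVASIVE...Items AS r   -- catalog and schema parts defaulted
       ON l.ItemCode = r.ItemCode;
```

I gather there is also OPENQUERY('PERVASIVE', '...'), which forces the whole
sub-query to run on the remote side, sidestepping the optimizer question.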

Is this essentially correct? If so, what does the system use for statistics?
Is there any downside to this approach? Should I be concerned about
installing ODBC drivers and linking data into SQL Server -- for instance,
what happens to SQL Server if the other system goes down?

The other question I have is about how Access handles these sorts of
queries. I would assume there is some sort of commonality between the various
bits of SQL interpreter code in the MS world (although, perhaps this is
foolish given the history of these products). I would guess that Access does
something similar to SQL Server, optimizing the subqueries, pulling in data
from both sources, and then doing a local join.
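To make the question concrete, take a hypothetical Access query over one SQL
Server linked table and one Pervasive linked table (all names made up):

```sql
-- The query as written in Access, against two linked tables:
SELECT o.OrderID, i.ItemDesc
FROM   dbo_Orders AS o          -- linked from SQL Server
INNER JOIN Items AS i           -- linked from Pervasive
       ON o.ItemCode = i.ItemCode
WHERE  o.OrderDate > #2002-01-01#;

-- My guess is that Jet decomposes this into one fetch per server
-- and then joins the two result sets locally, something like:
--   to SQL Server:  SELECT OrderID, ItemCode FROM Orders
--                   WHERE OrderDate > '2002-01-01'
--   to Pervasive:   SELECT ItemCode, ItemDesc FROM Items
-- ...but that is exactly what I would like to confirm.
```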

If this is the case, it doesn't really answer which is faster. After all, SQL
Server is basically a big join engine, so I would guess it would be faster.
Likewise, its SQL interpreter might be able to do a better job of figuring
out the sub-queries. The long and short of it is that I really don't know
WHAT happens in these cases. Any pointers here?

And finally, is there some way to capture the actual SQL being sent by
Access to the two servers? If so, this might give us clues as to the
problems we're seeing. If there is such a feature, is something similar
available when using links inside SQL Server?
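On the SQL Server side, I know Profiler can show incoming batches; as a quick
sketch, something like the following shows the last statement a given
connection sent (the spid below is a placeholder):

```sql
-- List current connections and note the spid belonging to Access.
EXEC sp_who;

-- Show the last batch received from that connection.
DBCC INPUTBUFFER (53);   -- 53 is a hypothetical spid from sp_who
```

That only covers the SQL Server leg, though; I still don't know how to see
what goes over the wire to Pervasive.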

Maury
 