Localizing SQL Server Data


LeonJ

Hello,

The problem: our user base has grown tremendously, and the very user-friendly
front-end to a SQL Server backend is now taxing our 100 Mbit local network. We are
a seasonal agricultural product distribution company, so our busiest times
create the slowest response times ... not good!

So, I'm dreaming up the quickest fix I can think of. For the slowest and
most-used reports and forms, I think I could fairly quickly move the queries
to get their data from a local db. If I pull the data from the server directly to
local tables (in a separate mdb, of course) and query from them, I may get faster
response times for the report/form and cause less network traffic.

First: is this a good idea? Why or why not?

Leon
 

Jeff Boyce

Leon

I may be missing something...

If you pull ALL the data needed to run a query down to the local level,
won't that be more data on the line than you'd pull if you returned ONLY the
data that meet the criteria in the query?

Regards

Jeff Boyce
Microsoft Office/Access MVP
 

bcap

I think that before solving the problem you'd best find out what the problem
is. What makes you so sure it's the network and not the server? What
version of SQL Server are you using? If it's MSDE or the Desktop Engine you
may be hitting the "rev limiter" which Microsoft built into it to
"encourage" you to spend some money!
 

LeonJ

I understand your point and will need to be careful not to migrate any
reports/forms to this method that are better done on the server because of
the extra data that would be sent.

There are a few mitigating factors. Many of the forms, especially, already
pull all, or almost all, of the data. I didn't mention it, but I think the
processing on the server itself may be overloading it, too (I'm hoping
offloading some of it will help). And my largest tables already have an
'updated' column which is stamped whenever something changes, so I will only need
to pull the changed records for those tables.
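To illustrate the "updated column" idea, here is a minimal sketch in Python, with sqlite3 standing in for both the SQL Server back end and the local mdb; the table and column names (`orders`, `updated`) are made up for the example, not taken from the actual system:

```python
import sqlite3

def pull_changed_rows(server, local, last_sync):
    """Copy only rows whose 'updated' stamp is newer than the last sync.

    'server' and 'local' are open connections; in the real setup the
    server side would be a query against SQL Server and the local side
    a table in a separate .mdb.
    """
    changed = server.execute(
        "SELECT id, name, qty, updated FROM orders WHERE updated > ?",
        (last_sync,),
    ).fetchall()
    # Upsert into the local copy so repeated syncs stay consistent.
    local.executemany(
        "INSERT OR REPLACE INTO orders (id, name, qty, updated) "
        "VALUES (?, ?, ?, ?)",
        changed,
    )
    local.commit()
    return len(changed)

# Demo with in-memory databases and made-up data.
server = sqlite3.connect(":memory:")
local = sqlite3.connect(":memory:")
for con in (server, local):
    con.execute("CREATE TABLE orders "
                "(id INTEGER PRIMARY KEY, name TEXT, qty INTEGER, updated TEXT)")
server.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [(1, "seed", 10, "2005-01-01"),
     (2, "feed", 5, "2005-03-15"),
     (3, "tools", 2, "2005-03-20")],
)
n = pull_changed_rows(server, local, "2005-03-01")
print(n)  # only the two rows changed after 2005-03-01 come over the wire
```

The point is that each sync transfers only the delta, not the whole table, at the cost of remembering the last sync stamp per table.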

Thanks
 

LeonJ

Access 2003 front end, SQL Server 7.0 backend. No $ to upgrade before
November (at the earliest) and I will have tons of questions and/or need to
hire someone when it comes to that process.

My hardware guys have indicated that we are frequently using all available
bandwidth (even after going to all switched connections from hubs last month).
 

Michel Walsh

The best way to limit the data transferred over the wire is to build the
limit into the data request itself, right at the start of the form or report, with an
appropriate WHERE clause (or criteria). After all, that is what even your
'date-stamp/version' solution intends to do: bring ONLY the
data that has been modified, isn't it? So, instead of bringing ALL the data,
use appropriate conditions which will limit the data.

If you have joins with a one-to-many match, you MAY find it faster to bring
the data into local tables (tables in the front end) and then make the
join on those local tables. Indeed, if your join results in most of the
data being repetition (in order to fill the rectangular "look" of a join),
then doing the join on the server means that the same repeated data
will be transmitted many times over the wire; whereas if you make the join
locally, the data is sent WITHOUT any repetition over the wire, so, in the
end, you may save time:
- empty the temp (local) table,
- copy the COLUMNS you want (and only those) from the server into your
local tables, for ONLY the records you know your query is likely to
use. These two limitations may sound like extra work for you, but they are
generally worth every second of your time in FORCING you to impose yourself
to FIND the limitation at THIS level.
- make a local query over the local tables.
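A rough sketch of those steps, again in Python with sqlite3 standing in for the server and the front end (the `customers`/`orders` tables are invented for the example): pull each side of a one-to-many join once, join locally, and the "one" side's columns never cross the wire repeated per detail row.

```python
import sqlite3

server = sqlite3.connect(":memory:")
server.execute("CREATE TABLE customers "
               "(cust_id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
server.execute("CREATE TABLE orders "
               "(order_id INTEGER PRIMARY KEY, cust_id INTEGER, qty INTEGER)")
server.execute("INSERT INTO customers VALUES (1, 'Acme Farms', 'West')")
server.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(i, 1, i * 10) for i in range(1, 6)])

# Joining on the server repeats the customer columns on every detail row:
joined = server.execute(
    "SELECT c.name, c.region, o.order_id, o.qty "
    "FROM customers c JOIN orders o ON o.cust_id = c.cust_id").fetchall()
rows_over_wire_joined = len(joined)       # 5 rows, customer data sent 5 times

# Pulling each table separately sends the customer row only once:
custs = server.execute("SELECT cust_id, name, region FROM customers").fetchall()
ords = server.execute("SELECT order_id, cust_id, qty FROM orders").fetchall()
rows_over_wire_split = len(custs) + len(ords)   # 1 + 5 rows, no repetition

# ...then do the join locally, in the front end:
local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE customers "
              "(cust_id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
local.execute("CREATE TABLE orders "
              "(order_id INTEGER PRIMARY KEY, cust_id INTEGER, qty INTEGER)")
local.executemany("INSERT INTO customers VALUES (?, ?, ?)", custs)
local.executemany("INSERT INTO orders VALUES (?, ?, ?)", ords)
local_join = local.execute(
    "SELECT c.name, o.order_id FROM customers c "
    "JOIN orders o ON o.cust_id = c.cust_id").fetchall()
print(len(local_join))  # same 5 result rows, but fewer rows over the wire
```

The saving grows with the number of detail rows per master row and with the width of the master-side columns.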

Note: that also means you shouldn't use HUGE combo box lists on your forms: more
than 100 items in a combo or list box is about as useful as having no list at
all (do the membership check explicitly, through your code, rather
than relying on NotInList), and *IS* a huge burden on the network. It
is also even more important to keep sub-forms small: instead, have the end
user explicitly open another form to see the extra details, AS REQUIRED,
rather than showing them uselessly (and at a very high cost).

Final note: a large portion of the network bandwidth can be consumed by
listening to audio streamed from the Internet. You should make it policy NOT to
listen to Internet audio/video, especially if the bandwidth is small.



Vanderghast, Access MVP
 

bcap

How many users do you have on this system??? There's no denying that Access
can be quite profligate with its data access, what with subforms and combo
boxes and so forth, but I find it very hard to believe that a database
application is swamping a 100 Mbit network and yet not bringing the server to
its knees.
 

Jeff Boyce

I don't know if this has any bearing on your situation...

A few years back I migrated all Access data to SQL-Server. There were a few
bumps in the process but none were insurmountable.

A few months after the applications were up and running with reasonable
performance, they suddenly took a nose-dive. Forms and reports that used to
open in a second were taking 20 seconds or more.

It took several days of troubleshooting and the involvement of the network
gurus to identify the fact that a newly-installed network-level antivirus
update was manhandling every I/O operation Access used to talk with SQL Server. Once
the applications were "whitelisted", performance resumed as before.

Has your network seen any recent "improvements"?<g>

Good luck!

Regards

Jeff Boyce
Microsoft Office/Access MVP
 

LeonJ

The server's disk light is frequently active, but the server always reports
that there is unused physical memory. So, you may very well be right, but
there's not much I can do about it at this point.

I think my idea of offloading more stuff to local would help in either case.
I use hundreds of passthru queries to stored procedures, many of them very
complex, at this point. So, if I could get some of those to be local, it
could reduce demand on both the net and the server, I think.

A possibly related problem is that something between Access, ODBC, and SQL
Server occasionally causes tables/queries on the server to remain open, which
ends up causing locks in situations where they shouldn't happen (like a passthru
query locking things up after it completes).

Thanks for the interest,

Leon
 

bcap

A trick that I have used in the past is to create local copies, in the
front-end, of the largely-static tables which are often used to populate
combo boxes and list boxes. You can automatically rebuild the contents of
these tables when the application is launched, in order to ensure that they
are up-to-date. This obviously causes a one-off "hit" on the network and
server as the application is launching, but thereafter when forms load they
only need to query their record source from the server, everything else is
local.
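That startup refresh could look something like this sketch (Python with sqlite3 standing in for SQL Server and the front-end tables; the lookup table names are hypothetical):

```python
import sqlite3

LOOKUP_TABLES = ["regions", "product_types"]  # largely-static lists

def refresh_lookups(server, local, tables):
    """Rebuild local copies of static lookup tables at application launch.

    One hit on the network at startup; after that, combo and list boxes
    bind to the local copies and never touch the server.
    """
    for table in tables:
        rows = server.execute(f"SELECT id, label FROM {table}").fetchall()
        local.execute(f"DELETE FROM {table}")   # empty the stale local copy
        local.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    local.commit()

# Demo with in-memory databases and made-up data.
server = sqlite3.connect(":memory:")
local = sqlite3.connect(":memory:")
for con in (server, local):
    for t in LOOKUP_TABLES:
        con.execute(f"CREATE TABLE {t} (id INTEGER PRIMARY KEY, label TEXT)")
server.executemany("INSERT INTO regions VALUES (?, ?)",
                   [(1, "North"), (2, "South")])
server.executemany("INSERT INTO product_types VALUES (?, ?)", [(1, "Seed")])

refresh_lookups(server, local, LOOKUP_TABLES)
print(local.execute("SELECT COUNT(*) FROM regions").fetchone()[0])  # 2
```

In the real application the equivalent would be a delete-and-append pair of queries per lookup table, run from the front end's startup code.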

Executing stored procedures isn't likely to have much impact on the network.
Unless they are returning recordsets, the only network traffic is a tiny
"EXEC" statement and an even-tinier return value. In terms of network
traffic, I'd have thought that the load caused by "localising" the data in
order to perform the same processing locally would make the situation
significantly worse. The server, however, is another matter.

I do feel, though, that if your assessment of the server load is based on
whether the disk light is on then you really have not yet achieved a proper
understanding of the cause(s) of your problem, so any remedial measures you
take are basically guesswork and may well have no effect, or indeed make
matters worse. My first move would be to properly understand what is
happening on the server, by use of the performance monitor, and by examining
the Windows and SQL Server logs. You might have a problem that is very easy
to fix (low disk space, fragmentation, sub-optimal virtual memory settings
or some such). Just as a f'rinstance, the last time a system at one of my
clients ground to a halt it turned out that the McAfee anti-virus updater
process on the database server had gone berserk and gobbled up 2Gb of
virtual memory i.e. all of it! The fix was as simple as restarting the
rogue process (this was after their network/hardware support people had told
them that the problem was an inadequate old server which needed upgrading or
replacing; this was two years ago, it's still running fine today).
 

LeonJ

There is not really a doubt that the server is very busy processing sql
server requests.

But you are right, I can't quantify the exact cause. I really can't
understand why there is available memory reported but significant disk
access. Where do I go from here to find out what's really going on? I
really don't have time to study the topic, or to know which book/website to use
to search for the answer. I also don't know how to approach finding the
right consultant (or whether it's appropriate to ask for help finding one on these
forums).

I think the concept of localizing the combo boxes could make a lot of sense,
but I'm not sure we will gain much.

Thanks for your help
 

bcap

The server being busy is not necessarily a problem; the question is whether
it's flat-out busy - and what exactly that means! One of the first
principles of performance troubleshooting is that at any given time there is
only *one* bottleneck - i.e. there is one resource that is being utilised to
its maximum. The challenge is to find out what it is. Then, having
identified and relieved that bottleneck, you find that either performance is
now acceptable, or you now have to fix a different bottleneck.

The fact that a resource is sometimes used 100% does *not* mean that it is
the bottleneck. In an environment where anything worthwhile is occurring
*any* resource can spike at 100%, this is normal and not a problem. A
resource is only the bottleneck if it is at 100% for extended periods. This
is perhaps something you should discuss with your network people before
concluding that your network is overloaded. I'm not a networking expert so
there isn't really anything else I can tell you about troubleshooting
network performance.

As for the server:

(i) I'm not sure why you feel that disk access is a problem. Your database
resides on the disk and to read from and write to the database requires disk
access, regardless of how much free memory you may or may not have!
(ii) Nor does the availability of physical memory tell us very much. You
don't say how *much* physical memory is available (if it's, say, 5% then you
have a problem!), but in any case memory is far from the only potential
bottleneck. The main resources at work here are: physical memory; virtual
memory; disk; CPU; network. Any of them *could* be the bottleneck.
(iii) You mentioned that you are using SQL Server 7, but this didn't answer
the question about which *edition* of SQL Server 7. If it's MSDE, this has
a built-in "throttle" designed to get you to upgrade to a paid version.
Some people believe (incorrectly) that MSDE supports just 5 concurrent SQL
batches. This is not true, the way the throttle really works is that the
first eight concurrent SQL batches run normally, but as soon as you exceed
eight concurrent batches then disk access is deliberately slowed down so
that performance suffers. You may not have any budget to upgrade (although
this would be cheaper than the consultants for whom you apparently *do* have
budget) but you still need to identify the source of your problem before you
can begin to solve it. If you are hitting the MSDE "throttle" this will be
very clear from the SQL Server logs.
(iv) You don't say which version of Windoze is running on your server, which
means I don't know what tools you have to hand. But, a few things to look
for:
- do you have plenty of free disk space (including on the system drive, if
you have more than one or a partitioned drive)?
- somewhere in the Windoze menus you will most likely find "Disk
Defragmenter". Use it to find out how fragmented your disks are, and if
they need defragmenting then do it. This will slow the server down so you
may want to do it over a weekend. n.b. there's not much point in defragging
if you are low on disk space, get rid of some stuff first so that you have
plenty of space on all disks. If you have a version of Windoze without the
defragmenter, then buy a 3rd-party tool which does the same job.
- depending on the version of Windoze, you can get a lot of useful
information from the Task Manager. You may be able to see whether you are
maxed out on CPU, physical memory, page file utilisation and network
access. Remember, spikes are OK, it's extended periods at or near 100% that
are the problem.
- From the Windoze control panel, you should be able to find your way to the
Performance Monitor, which allows you to plot graphs of the utilisation of
every conceivable system resource.
 
