Microsoft Data Access Application Block !!!!

Sahil Malik

Okay, so I've been going through the Microsoft Data Access Application
Block.

It says that the following are best practices, blah blah... but here are the
obvious problems, at least from my viewpoint.

a) Using SqlCommandBuilder to derive stored proc parameters. I thought
CommandBuilders were evil!!
b) Insisting on a parameter cache - which is in turn stored in a static
Hashtable. Can I use this on a website, in a multithreaded application, or
in a remoting server that accepts many clients? In other words, there's a
potential bottleneck here, plus a scalability issue. (SqlParameterCache -
urrghh!!) Not to mention, since those are stored in a static Hashtable, no
two stored procs of mine could ever share the same parameter name.
c) The ability to pass in a connection - so your data access layer doesn't
really prevent people from leaving connections open?
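For context, point (a) boils down to a call roughly like this - a minimal sketch of what the block does on a cache miss, not its actual code (`DiscoverParameters` is my name, not the block's):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ParameterDiscoverySketch
{
    // Roughly what the app block does when the parameter cache misses:
    // an extra round trip to SQL Server just to learn the proc's signature.
    static SqlParameter[] DiscoverParameters(string connStr, string procName)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            SqlCommand cmd = new SqlCommand(procName, conn);
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            SqlCommandBuilder.DeriveParameters(cmd); // the server round trip
            SqlParameter[] parms = new SqlParameter[cmd.Parameters.Count];
            cmd.Parameters.CopyTo(parms, 0);
            return parms;
        }
    }
}
```

That round trip is exactly why the block caches the result rather than calling DeriveParameters on every execution.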

All in all, so far I think I'm going to write my own data access layer.
Anyone disagree?

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik
http://blogs.apress.com/authors.php?author=Sahil Malik
 
Guest

The Parameter cache is actually quite sweet once you get used to it. By
caching the params, you avoid having to match items up every time. You are
simply adding values to objects that are held in memory. I do not see this as
a problem with websites either, unless you have multiple stored procs per
page.

The bottleneck is a potential issue, depending on how you are connecting to
the remoting server. In most cases, the Parameter Cache will not be a
bottleneck to performance.
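The caching pattern being described looks roughly like this - a sketch, with names of my own choosing, not the block's source. Note the cache key combines connection string and proc name, so two procs sharing a parameter name is a non-issue:

```csharp
using System.Collections;
using System.Data.SqlClient;

class ParamCacheSketch
{
    // Hashtable.Synchronized wraps the table for the one-writer /
    // many-readers pattern a website or remoting server produces.
    static Hashtable cache = Hashtable.Synchronized(new Hashtable());

    public static void CacheParameters(string connStr, string procName,
                                       SqlParameter[] parms)
    {
        // Keyed per connection string + proc, not per parameter name.
        cache[connStr + ":" + procName] = parms;
    }

    public static SqlParameter[] GetCachedParameters(string connStr,
                                                     string procName)
    {
        return (SqlParameter[])cache[connStr + ":" + procName];
    }
}
```

As I recall, the real block also clones the cached array on the way out, so concurrent callers don't stomp on each other's parameter values.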

The connection open and close is quite common, as the actual connection
objects are stored in a pool. It is a little more overhead than controlling
your own connection, but it adds a safety layer. It also allows you to let
things fall out of scope (close connections) without losing the actual
connection object. It is less useful if you use per-user credentials for db
connections, of course.
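The "fall out of scope" idea done deterministically is just the standard using-block pattern - closing the SqlConnection returns the pooled physical connection rather than destroying it (the proc name below is hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

class UsingBlockDemo
{
    static void RunProc(string connStr)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            // "usp_GetOrders" is a made-up example proc name.
            SqlCommand cmd = new SqlCommand("usp_GetOrders", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // consume rows
                }
            }
        } // Dispose() closes the connection here, even on an exception,
          // which simply hands it back to the ADO.NET connection pool.
    }
}
```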

As far as coding your own goes, that is a decision you will have to make.
The App Block works nicely for many applications, but it is rather generic.
From our previous conversations, it appears you have needs beyond what the
app block will do well. If I were you, I would consider what I did for one
client: rip out the pieces of the application block that work for you and
code around the pieces that do not. This speeds up the process while
allowing you much more flexibility. Also, note the App Block is a 1.x
construct; since you are working on 2.0, pieces of the data model (ADO.NET)
change enough to warrant putting the block aside (at least in some
instances).

---

Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************
 
Scott

I agree. We use parameter caching extensively in a large intranet site with
great success. The only modification we made was to allow for clearing the
cache during development. Otherwise, your concerns simply haven't been a
problem.

Scott L.
 
Sahil Malik

Okay, I did some more research on the Data Access Application Block - and I
don't hate it so much anymore.

I learned a few things: a static method's locals live on each calling
thread's own stack, so static methods are not as bad as they are made out
to be. Static module-level variables are a whole other deal, though. So
it's not going to be the bottleneck I thought it would be.

Parameter caching - yes, that is sweet - and the command it executes isn't
that awful. I am probably going to replace
CommandBuilder.DeriveParameters with my own Command instead.
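One way to do that replacement (a sketch under my own naming, not the block's code) is to read the metadata yourself from SQL Server's standard INFORMATION_SCHEMA.PARAMETERS view instead of calling DeriveParameters:

```csharp
using System.Collections;
using System.Data;
using System.Data.SqlClient;

class OwnDeriveSketch
{
    // Hand-rolled alternative to SqlCommandBuilder.DeriveParameters:
    // query the catalog view directly for the proc's parameter names.
    static ArrayList GetParameterNames(string connStr, string procName)
    {
        ArrayList names = new ArrayList();
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            SqlCommand cmd = new SqlCommand(
                "SELECT PARAMETER_NAME FROM INFORMATION_SCHEMA.PARAMETERS " +
                "WHERE SPECIFIC_NAME = @proc ORDER BY ORDINAL_POSITION",
                conn);
            cmd.Parameters.Add("@proc", SqlDbType.NVarChar, 128).Value =
                procName;
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    names.Add(reader.GetString(0));
                }
            }
        }
        return names;
    }
}
```

The same view also exposes DATA_TYPE and CHARACTER_MAXIMUM_LENGTH if you want to build full SqlParameter objects rather than just names.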

I think I need to embellish the DAB with concurrency control and
transactional updates - and I'd have a fantastically good data layer ready
for my project.
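For the transactional-updates part, the plain ADO.NET pattern would be a SqlTransaction spanning the commands - a sketch with hypothetical proc names:

```csharp
using System.Data;
using System.Data.SqlClient;

class TransactionSketch
{
    // Two proc calls that commit or roll back as a unit.
    // usp_DebitAccount / usp_CreditAccount are made-up example names.
    static void TransferFunds(string connStr)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction();
            try
            {
                SqlCommand debit = new SqlCommand("usp_DebitAccount", conn, tx);
                debit.CommandType = CommandType.StoredProcedure;
                debit.ExecuteNonQuery();

                SqlCommand credit = new SqlCommand("usp_CreditAccount", conn, tx);
                credit.CommandType = CommandType.StoredProcedure;
                credit.ExecuteNonQuery();

                tx.Commit(); // both calls succeed, or...
            }
            catch
            {
                tx.Rollback(); // ...neither does.
                throw;
            }
        }
    }
}
```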

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik
 
