Data access pattern

Guest

Hi,
I've been doing data access for several years now, and a pattern I have always used is to encapsulate data access in an object that every programmer must use if she/he wants to reach the data store.

Following this pattern I succeeded in managing the complexity (and the migration across various data access technologies) in a single point.
For me it is a pain that every programmer must open the connection, set some values, and use the ADO object model in slightly different ways (slight differences between ADO, ADO.NET, and ADO.NET 2.0).

Some years ago I saw a project by Atif Aziz called DBMethods that lets you (with some simple modifications) obtain what I want in a clean way.
http://www.raboof.com/Projects/DBMethods/DBMethods.aspx

Am I wrong? Why isn't this concept used in the ADO* models?
 

Guest

Microsoft has released the Data Access Application Block to be a generic DAL.
You can download it for 1.x from the microsoft.com/downloads site. In the 2.0
Framework, more of the model is declarative, which further genericizes much
of the access.

---

Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************
 

Guest

I know this application block, but for me it has the same problem as other
data access objects.
Every time you need to communicate with the store you have to open the
connection, set some properties, and execute the command.

Why aren't all those details hidden from the programmer?
If I (as an 'enterprise business programmer') need the age of a person, I
don't want to write things like:

declare some ADO* objects
open a connection
set some attributes on the ADO* objects
call someExecuteMethod(the key to identify a person)
get the age
clean up the ADO* objects

Instead I need something like:

Call GetThePersonAge(the key to identify a person, age)

With DBMethods everything in GetThePersonAge is automatically generated
(though that isn't the point) from a SQL statement or stored procedure, and it
uses common objects to deal with the ADO* stuff.
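A minimal sketch of what such a wrapper method might look like (the method, stored procedure, and parameter names here are hypothetical illustrations, not the actual DBMethods output):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Hypothetical wrapper: all ADO.NET plumbing lives here, so business
// code only ever calls PersonData.GetThePersonAge(key).
public static class PersonData
{
    // Single point of access: change connection handling here and every
    // caller picks it up, with no changes to business code.
    private static readonly string ConnectionString = "..."; // from config

    public static int GetThePersonAge(int personKey)
    {
        using (SqlConnection connection = new SqlConnection(ConnectionString))
        using (SqlCommand command = new SqlCommand("GetPersonAge", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@PersonKey", personKey);
            connection.Open();
            return (int)command.ExecuteScalar();
        }
    }
}
```

The `using` blocks take care of the cleanup step, so the caller never sees a connection, command, or reader at all.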

Consider, for example, a problem we had with the isolation level that ADO uses
inside COM+ (Serializable). To lower this level you have to set it before
using a connection. The ADO* way, we would need to find every place a
connection is used, add this instruction, and recompile almost everything;
with a single point of access we modify the access code once and everything
is fine.
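With a single access point, that isolation-level fix is a one-line change in one place. A sketch (the wrapper method is hypothetical; the `IsolationLevel` enum and `BeginTransaction` overload are standard ADO.NET):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class DataAccess
{
    // Every query in the application starts its transaction through this
    // method, so overriding the Serializable default happens in one spot.
    public static SqlTransaction BeginWork(SqlConnection connection)
    {
        connection.Open();
        // Lower the isolation level for the whole application here,
        // instead of at every call site.
        return connection.BeginTransaction(IsolationLevel.ReadCommitted);
    }
}
```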

Is this only my problem, or is this pattern useful?

Carlo
 

Kevin Yu [MSFT]

Hi Carlo,

First of all, I would like to confirm my understanding of your issue. From
your description, I understand that you need a DAL that wraps the connection
object inside, instead of letting each programmer open a connection himself.
If there is any misunderstanding, please feel free to let me know.

Since the Application Block for Data Access includes its source code, we can
make some slight changes to it and wrap the connection object inside.

First, we add a new member, which is a static connection reference. Then we
define some properties to set or get the connection. If all the methods use
this connection by default, we can modify each method to use it directly,
instead of letting the callers pass a reference to their own connection
objects.
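A rough sketch of that change (the class and member names are illustrative, not the actual Application Block source):

```csharp
using System.Data.SqlClient;

// Illustrative only: a static connection shared by all helper methods,
// so callers no longer have to create and pass their own SqlConnection.
public sealed class SqlHelper
{
    // The new static member Kevin describes.
    private static SqlConnection connection;

    // Property to set or get the shared connection in one place.
    public static SqlConnection Connection
    {
        get { return connection; }
        set { connection = value; }
    }

    public static object ExecuteScalar(string commandText)
    {
        // Formerly this method took a SqlConnection parameter;
        // now it uses the shared one by default.
        using (SqlCommand command = new SqlCommand(commandText, connection))
        {
            return command.ExecuteScalar();
        }
    }
}
```

Note that a single static connection is not thread-safe as written; a real change to the block would need to guard or pool it.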

HTH.

Kevin Yu
=======
"This posting is provided "AS IS" with no warranties, and confers no
rights."
 

Guest

Hi Kevin,
what I want to achieve is to separate the ADO stuff from the business code.
So I want a method that encapsulates the data access plumbing.

Of course we can modify the application block, but then we are modifying
'someone else's code', with all the associated problems.
If I modify this application block, I have to port those changes whenever a
new version of the application block is released (I think some changes will
be made to use ADO.NET 2.0).

What I wanted to know is why this 'pattern' isn't used in the application
block.

Carlo
 

Shawn B.

I've created a data-access-layer component that has two parts: the base
(which requires only a connection string) and the subclass, which must
provide a connection string in whatever manner it sees fit (usually
transparently).

DataTable result = null;
SqlDb data = new SqlDb();

result = data.ExecuteTable("select * from table");
// data.ExecuteDataSet("...");
// data.ExecuteReader("...");
// data.ExecuteNonQuery("...");
// data.ExecuteScalar("...");

data.AddParameter("@Param1", SqlDbType.Int, ParameterDirection.Input, 4,
param1Value);
data.CommandType = CommandType.StoredProcedure;
data.CommandText = "sp_Proc1";

result = data.ExecuteTable();
// data.ExecuteDataSet();
// data.ExecuteReader();
// data.ExecuteNonQuery();
// data.ExecuteScalar();

Internally, it actually manages opening and closing the connection itself in
one of two ways by using a ConnectionBehavior enum:

ConnectionBehavior.Always;
ConnectionBehavior.OnDemand;

Always means the connection is active from when the object is created until
it is disposed. This is not the default, and I've never had to use this
feature. OnDemand means the connection is opened as late as possible and
closed as early as possible, encapsulating the actual data execution
internally, so you don't have to worry about it.
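The OnDemand behavior Shawn describes might look roughly like this (a sketch under my own assumptions, not his actual code):

```csharp
using System.Data;
using System.Data.SqlClient;

public enum ConnectionBehavior { Always, OnDemand }

public class SqlDb
{
    private SqlConnection connection = new SqlConnection("..."); // from config
    public ConnectionBehavior Behavior = ConnectionBehavior.OnDemand;

    public object ExecuteScalar(string commandText)
    {
        bool openedHere = false;
        if (connection.State != ConnectionState.Open)
        {
            // OnDemand: open as late as possible.
            connection.Open();
            openedHere = true;
        }
        try
        {
            using (SqlCommand command = new SqlCommand(commandText, connection))
            {
                return command.ExecuteScalar();
            }
        }
        finally
        {
            // ...and close as early as possible. Under Always, the
            // connection stays open until the object is disposed.
            if (openedHere && Behavior == ConnectionBehavior.OnDemand)
                connection.Close();
        }
    }
}
```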

Is this the type of thing you are talking about? If so, it is easy to create
one. I've seen just about every publicly available data access
component/block, and more complex alternatives such as ORM wrappers. Each has
a strength and a weakness. The strength of mine is its ease of use; the
weakness is that in its current incarnation it is database (SQL Server)
dependent. The strength of others I've seen is that they can handle any
database; the weakness is that many (not all) of those I've seen are not as
simple to use or implement as mine.

My new version, which targets C# 2.0, is 1,500 lines of code fewer (about
700 lines now), does the same thing, and supports Oracle, OleDb, SQL Server,
and ODBC connections natively. I am working on a "lite" version that
supports only the core execution types and some basic transaction support
(including cross-object-instance transactions), and an "extended" version
that extends that into what is most useful about my current version: it has
an internal "plug-in" mechanism so we can get some very advanced diagnostics
at run time if we need to debug (configurable via an XML doc, and it can be
turned on/off).

The diagnostics plugins I've written include: an exact execution timer
indicating how long the execution took, accurate to about 200ns on my 2GHz
P4; a complete dump of the first 10 (or more, configurable) rows of a data
table, or of all tables in a dataset; a "representation" of the stored
procedure call that you can copy/paste to see exactly what the DAL executed
to get its results, complete with feedback on the return/output params if
there were any; and emails on SPs/SQL queries that failed for some reason or
another (such as SP/table not found, invalid parameter name, etc.).

For readers it is a little more complex. There is a configurable option that
will actually cause it to execute the command the first time (internally) as
a dataset, so it can dump the results, and then execute again as a reader so
you can work with it. Keep in mind, this is only for debugging and not
intended for production use, but it is very useful for debugging. There is a
second (configurable) option that allows us to use a "disconnected" reader
(don't ask about the internals, as it is hard to explain; it's a custom
reader, er, actually a DataSet that "looks" like an IDataReader) that
executes, gets logged, and returns the same instance back to you, except in
this case you are not connected as you would be with a real reader.

The SP diagnostics read can be "filtered" by SP name, by parameter criteria
(whether @Param1 is Int and less than 15, and/or @Param2 is Varchar and
contains "ledger" -- and similar compound conditions), as well as by whether
it is a DataTable, DataSet, Reader, or whatever that is being executed.

All of our SPs have internal diagnostics in them as well. When the DAL
encounters issues, the SP diagnostics tell us one thing: that things work
fine. Then, when we use the DAL dump and the "representation" of the SP
command and params that were executed, we see a different story. In some
cases this has cut our debugging time from hours (which used to happen
frequently) to minutes.

Another plugin uses custom performance counters to track memory usage,
number of open connections, and connections created per second and so on.
We use it to track down memory leaks but it seems pretty solid.

Another tells us if an execution took longer than n seconds to complete (by
default 6 seconds); we then get an email with all relevant information.

Anyway, since I'm not trying to write a book here, I'll end this message,
but if you are interested I can share my new C# 2.0 version with you (it
should work in 2003 since I'm not using any 2.0-specific features -- yet).
The diagnostics aren't there yet because I need to come up with a better
plugin architecture that doesn't load the plugins 15,000 times a minute as
the current one does, but nonetheless...


BTW: I've been using a similar variant of this data access component in an
enterprise application that receives about 15,000 requests per minute during
roughly 10 peak hours of the day, and there have been no major performance
issues that we've needed to concern ourselves with.


Thanks,
Shawn
 

Kevin Yu [MSFT]

Hi Carlo,

The Data Access Application Block is a .NET component that contains
optimized data access code that will help you call stored procedures and
issue SQL text commands against a SQL Server database.

The purpose for the App Block is to encapsulate performance and resource
management best practices for accessing Microsoft SQL Server databases,
which reduce the amount of custom code you need to create, test, and
maintain.

However, the App Block needs to keep its flexibility, because in some cases
more than one connection is used across a project. So it exposes an
interface that lets you set the connection, and wraps all the SqlCommand and
SqlDataAdapter objects inside.

HTH.

Kevin Yu
=======
"This posting is provided "AS IS" with no warranties, and confers no
rights."
 

Guest

Hi Shawn,
interesting way to handle the connection behavior.
For Kevin: shouldn't this 'pattern' be included in the Data Access Block?
As you can guess, I want what *I* consider important included in the
application block.

Anyway, the connection hiding is only part of what I want to achieve.

If you look (if you haven't already) at Aziz's DBMethods, what you call is a
simple method that handles all the 'dirty' ADO.NET work.
The method's source code is automatically generated by a stored procedure
that reads the stored procedure you want to 'wrap' and generates the
appropriate method with every parameter needed.

Shawn, could you please share your 2.0 DAL? (Do you have some place to put
it, or.... my address is foliniDOTcarloATcrevalDOTit.)

Ciao
Carlo
 
