Data Access Layer (DAL) Design - Help please

  • Thread starter Andrew Stanford

Andrew Stanford

Hi,

I am designing an application that has "partitioned" data. For example, most
of the data for a record (e.g. a company record) is stored on an AS400 and
is read-only to the client application, while our custom application stores the
rest of the data we require in SQL Server. The users need live access to the
portion of the data on the AS400 so that what they see is up-to-date. For example,
we might get most information about a company (customer) from the AS400, with
the remaining fields (used by the custom application) stored in SQL Server
(e.g. for a Company record, Name, IDnumber and Address come from the AS400, and
Phone and ContactDetails are retrieved from SQL Server).

I am wondering about the design of the DAL for this application and how to
approach it. I have a native .NET provider for the AS400 and, of course, for
SQL Server. As I see it there are a couple of options:

a/ Set the AS400 up as a linked server in SQL Server and write a distributed
query that joins the tables across the two platforms. From a development
perspective this would basically be a SQL-only DAL, and SQL Server would then
be responsible for passing the request through to the AS400 for processing and
joining the resultset. I am worried about doing it this way as performance may
suffer, since the linked server can only reach the AS400 via OLE DB or ODBC.

b/ Write the DAL so that when a company record is requested it queries both
databases at the same time using their respective native .NET providers. In
other words, the DAL would query both the AS400 and SQL Server for their
respective pieces of data, then merge these into an object that can be sent
back to the business logic layer (BLL). This is basically a manual join, and
while it would involve more programming in the DAL, performance should be as
fast as possible.
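
Roughly what I am picturing for option b/ is something like the sketch below.
(Names like Company, CreateAs400Connection and sqlConnectionString are just
placeholders, and the AS400 side uses the generic ADO.NET interfaces since the
parameter syntax varies by provider.)

using System.Data;
using System.Data.SqlClient;

public Company GetCompany(string idNumber)
{
    Company company = new Company();

    // 1. Read-only portion from the AS400 via its native provider.
    //    CreateAs400Connection() is a placeholder that returns the provider's
    //    IDbConnection implementation.
    using (IDbConnection as400 = CreateAs400Connection())
    {
        IDbCommand cmd = as400.CreateCommand();
        cmd.CommandText =
            "SELECT NAME, IDNUMBER, ADDRESS FROM COMPANYLIB.COMPANY WHERE IDNUMBER = @id";
        IDbDataParameter p = cmd.CreateParameter();
        p.ParameterName = "@id";
        p.Value = idNumber;
        cmd.Parameters.Add(p);

        as400.Open();
        using (IDataReader reader = cmd.ExecuteReader())
        {
            if (reader.Read())
            {
                company.Name     = reader.GetString(0);
                company.IdNumber = reader.GetString(1);
                company.Address  = reader.GetString(2);
            }
        }
    }

    // 2. Application-specific portion from SQL Server.
    using (SqlConnection sql = new SqlConnection(sqlConnectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT Phone, ContactDetails FROM dbo.CompanyExtra WHERE IDNumber = @id", sql))
    {
        cmd.Parameters.AddWithValue("@id", idNumber);
        sql.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            if (reader.Read())
            {
                company.Phone          = reader.GetString(0);
                company.ContactDetails = reader.GetString(1);
            }
        }
    }

    // 3. The merged object goes back to the BLL.
    return company;
}

The two reads could also be kicked off in parallel (e.g. on a worker thread) if
the sequential round trips turn out to be too slow.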

This must be a fairly common problem (merging SQL Server data with AS400,
Oracle, Sybase, etc.) and I wonder if anybody out there has implemented
something like this before, and what the result was.

Thanks in advance,
Andrew.
 

Bob Barrows [MVP]

Andrew said:
[original question quoted in full - snipped]

Errr, you left perhaps the ONLY relevant group out of your crosspost
(I'm adding it in now):
microsoft.public.dotnet.framework.adonet

My answer can only be: test both options and see which one is suitable.
 

sloan

When you have a merge situation like this, it is a good scenario (IMHO) for
using the business layer to merge the results.

One way:

Create a strong DataSet.
I'll use EmployeeDS as an example.

EmployeeDS has two tables, Employee and Department:

EmployeeDS
    Employee
    Department

Let's add some columns:

EmployeeDS
    Employee
        EmployeeID
        LastName
        FirstName
        Height      (from datastore 2)
        Weight      (from datastore 2)
        (EmployeeID also comes back from datastore 2 - don't add it to the
        strong dataset definition a second time, it's already there)
    Department
        DeptID
        DeptName

Ok,

-----start---------------------
Employee
    EmployeeID
    LastName
    FirstName

comes from datastore1.
-----end---------------------

-----start---------------------
Employee
    EmployeeID
    Height
    Weight

Department
    DeptID
    DeptName

comes from datastore2.
-----end---------------------




public EmployeeDS MergeMultipleDataStoresUsingStrongDataSets()
{
    // This is a good "biz rule" method:
    // it takes 2 datasets from 2 different db's and merges them into one.

    EmployeeDataLayer datalayer;
    EmployeeDS returnDataSetWithDataFromMoreThanOneDatabaseDS;

    EmployeeDS ds1;
    datalayer = new EmployeeDataLayer("DataStore1");
    ds1 = datalayer.GetAllEmployees();

    EmployeeDS ds2;
    datalayer = new EmployeeDataLayer("DataStore1"); // see the correction below: should be "DataStore2"
    ds2 = datalayer.GetEmployeeSubsetAndDepartment();

    returnDataSetWithDataFromMoreThanOneDatabaseDS = ds1;
    returnDataSetWithDataFromMoreThanOneDatabaseDS.Merge(ds2);

    return returnDataSetWithDataFromMoreThanOneDatabaseDS;
}


Now, if you had a simpler case where Employee(s) and Department(s) were in
different datastores, dataSet.Merge() would work well.
I have found that .Merge works well on different tables (Employee and
Department), OR on the same table but different rows (EmpID 101-200 comes from
one datastore, EmpID 1001-2001 comes from another datastore). I call this "row
friendly merging". But when you have EmpID, LastName, FirstName in one
datastore, and Height, Weight in another datastore, .Merge doesn't work well.
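
As a tiny illustration of the "row friendly" case with plain DataTables
(made-up rows; the primary key has to be set so Merge can line the rows up, and
DataTable.Merge is the 2.0 flavor - with 1.1 you would merge at the DataSet
level instead):

using System;
using System.Data;

static DataTable BuildEmployeeTable()
{
    DataTable t = new DataTable("Employee");
    t.Columns.Add("EmployeeID", typeof(int));
    t.Columns.Add("LastName", typeof(string));
    t.PrimaryKey = new DataColumn[] { t.Columns["EmployeeID"] };
    return t;
}

static void RowFriendlyMergeDemo()
{
    DataTable fromStore1 = BuildEmployeeTable();
    fromStore1.Rows.Add(101, "Smith");        // EmpIDs 101-200 live in datastore1

    DataTable fromStore2 = BuildEmployeeTable();
    fromStore2.Rows.Add(1001, "Jones");       // EmpIDs 1001-2001 live in datastore2

    fromStore1.Merge(fromStore2);             // different rows, so they simply append
    Console.WriteLine(fromStore1.Rows.Count); // 2
}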


Thus, for the column-split case, you might have to modify the example above to
do something like this:


EmployeeDS ds1;
datalayer = new EmployeeDataLayer("DataStore1");
ds1 = datalayer.GetAllEmployees();

EmployeeDS ds2;
datalayer = new EmployeeDataLayer("DataStore1"); // see the correction below: should be "DataStore2"
ds2 = datalayer.GetEmployeeSubsetAndDepartment();

returnDataSetWithDataFromMoreThanOneDatabaseDS = ds1;

// EmployeeDS.EmployeeRow is a class name auto-generated for you by VS200x
foreach (EmployeeDS.EmployeeRow row in
         returnDataSetWithDataFromMoreThanOneDatabaseDS.Employee.Rows)
{
    int currentEmpID = row.EmployeeID;

    // find the same employee in the second dataset
    DataRow[] ds2Rows = ds2.Employee.Select("EmployeeID = " + currentEmpID.ToString());

    if (ds2Rows.Length > 0) // match!
    {
        // cast the plain DataRow back to the strongly typed row
        EmployeeDS.EmployeeRow secondDataStoreRow = (EmployeeDS.EmployeeRow)ds2Rows[0];

        row.Height = secondDataStoreRow.Height;
        row.Weight = secondDataStoreRow.Weight;
    }
}



Then experiment with how to get the Department rows into
returnDataSetWithDataFromMoreThanOneDatabaseDS.
I don't know offhand if there is an overload like

returnDataSetWithDataFromMoreThanOneDatabaseDS.Merge(ds2.Department);

aka, just merging in the Department rows.

The above is the strong dataset method.

Going to custom business objects is along the same lines.


If you need caching, then take

datalayer = new EmployeeDataLayer("DataStore1");
ds1 = datalayer.GetAllEmployees();

and clean that up a little: call a method which gets fresh data if the cache is
empty, else returns the cached data.
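
Something like this rough sketch, for example (the static field here is just
for illustration and isn't thread-safe; in ASP.NET you would probably use the
Cache object, and the expiry policy is up to you):

// Simple cache-or-fetch wrapper around the data layer call above.
private static EmployeeDS _cachedEmployees; // illustration only - not thread-safe

public EmployeeDS GetAllEmployeesCached()
{
    if (_cachedEmployees == null)
    {
        // cache miss: go get fresh data from datastore 1
        EmployeeDataLayer datalayer = new EmployeeDataLayer("DataStore1");
        _cachedEmployees = datalayer.GetAllEmployees();
    }

    // cache hit (or freshly filled): hand back the cached copy
    return _cachedEmployees;
}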


Those are some ideas.


I would strongly recommend the business logic/layer approach. It gives you
easier deployment, easier maintainability, and better caching options, I
believe.

You can find a skeleton solution/project at:
http://sholliday.spaces.live.com/Blog/cns!A68482B9628A842A!140.entry

This is not specific to your question, but rather an example of an N-layered
application.



The best thing to do is code up a prototype.
You can use Northwind and just pretend that Customers are from one database,
and Orders are from a second database.
If you go with that, you can get my downloadable example, and start a
prototype quickly.
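
For example, a quick-and-dirty prototype along those lines might look like the
sketch below (both connection strings can point at the same Northwind database;
you just treat them as if they were two different sources):

// Rough prototype sketch: pull Customers from "database 1" and Orders from
// "database 2" (both are really Northwind here) and merge them into one DataSet.
using System.Data;
using System.Data.SqlClient;

public DataSet GetCustomersAndOrders(string connString1, string connString2)
{
    DataSet combined = new DataSet();

    using (SqlDataAdapter da1 = new SqlDataAdapter(
        "SELECT CustomerID, CompanyName FROM Customers", connString1))
    {
        da1.Fill(combined, "Customers");      // pretend this is datastore 1
    }

    DataSet orders = new DataSet();
    using (SqlDataAdapter da2 = new SqlDataAdapter(
        "SELECT OrderID, CustomerID, OrderDate FROM Orders", connString2))
    {
        da2.Fill(orders, "Orders");           // pretend this is datastore 2
    }

    combined.Merge(orders);                   // different tables, so Merge is happy
    return combined;
}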
 

sloan

A correction to my earlier code:

EmployeeDS ds2;
datalayer = new EmployeeDataLayer("DataStore1"); // << wrong string param here

should be

EmployeeDS ds2;
datalayer = new EmployeeDataLayer("DataStore2");

I am using EnterpriseLibrary.Data, which allows you to name "instances" of a
connection string. The same idea exists in .NET 2.0 in the <connectionStrings>
section, which again allows each connection string to be named.
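
So inside something like the EmployeeDataLayer constructor, the name just
resolves to a connection string. A rough 2.0-style sketch (EmployeeDataLayer
and the provider names in the config comment are only examples; with
EnterpriseLibrary you would do the equivalent through its named database
instances):

// app.config / web.config (names and providers here are just examples):
// <connectionStrings>
//   <add name="DataStore1" connectionString="..." providerName="System.Data.SqlClient" />
//   <add name="DataStore2" connectionString="..." providerName="IBM.Data.DB2.iSeries" />
// </connectionStrings>

using System.Configuration; // needs a reference to System.Configuration.dll

public class EmployeeDataLayer
{
    private readonly string _connectionString;

    public EmployeeDataLayer(string dataStoreName)
    {
        // e.g. dataStoreName = "DataStore1" or "DataStore2"
        _connectionString =
            ConfigurationManager.ConnectionStrings[dataStoreName].ConnectionString;
    }

    // ... GetAllEmployees(), GetEmployeeSubsetAndDepartment(), etc. would use
    // _connectionString with the appropriate provider.
}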





sloan said:
[previous reply, including the quoted original question, snipped]
 
