Passing Business Objects through nTier Web App

  • Thread starter: Stuart Hilditch
Stuart,

One thing you haven't given any indication of is the scale or requirements
for the project. In practice I think that the appropriate answer to tackling
this sort of problem depends on answers to questions such as:

- How big a project is this?
- How important is flexibility in future?
- Do you need to have the DAL as a full-blown tier (distributable), or is it
simply an internal software layer?
- What is your testing approach - does the DAL need to be replaceable for
testing?

I'd probably come up with very different answers for a 100-user one-off
project than for something that was going to be the key business application
for the next decade. I think all projects should have some sort of
'complexity budget' - where complexity in this case is measured as the ratio
of 'infrastructure' code to useful application logic. As you add structure
(to reduce complexity in your application logic) you add complexity in extra
code. So it's a tradeoff of whether the benefit outweighs the cost.

So for a really simple project where the DAL was an internal layer, I might
just break the layering rule and have the DAL create my business objects. I'd
keep some separation between the database logic and the 'mapping' component,
per laimis's ideas, but I wouldn't worry too much about making it perfect.

Personally I'd prefer this 'impure' layering to having all my
business logic in static methods - that's too high a price to pay. I want to
have my business objects encapsulate data and behaviour, not separate them
out for artificial reasons.

If you want to stick to strict layering, there are two basic approaches:
- pass the data between layers in some simple shared structure, and copy it
into the business objects in the BL
- use some form of inversion of control or factory class to pass the
necessary logic into the DAL, allowing it to create business objects without
knowing about them

In the first case, use DataSets or custom-defined Data Transfer Objects.
This works well for remoting, as well.
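
To illustrate the first option, here's a minimal sketch of the DTO variant
(all type and member names are invented for the example): a plain data
carrier lives in a shared assembly that both layers reference, the DAL fills
and returns it, and the BL copies it into the real business object.

// Shared assembly - a plain data carrier with no behaviour
[Serializable]
public class CustomerDto
{
    public int CustomerId;
    public string Name;
}

// DAL - returns only the shared DTO, never a BLL type
public class CustomerDal
{
    public CustomerDto GetCustomer(int id)
    {
        CustomerDto dto = new CustomerDto();
        // ...populate dto from a DataReader or command...
        return dto;
    }
}

// BLL - the business object copies the DTO's data into itself
public class Customer
{
    private int id;
    private string name;

    public Customer(CustomerDto dto)
    {
        id = dto.CustomerId;
        name = dto.Name;
    }
}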

The inversion of control solutions generally seem too complex for me - if
it's getting that complex then I'd be thinking of a full-blown O/R mapper
instead of a DAL. However, one simple variation that I've considered but not
used in practice is to use strongly typed datasets as the basis for the
business objects in the BL and pass the datasets in to the DAL, in the style
'here's a dataset - please fill it for me'. The DAL works against simple
untyped dataset interfaces, but the dataset infrastructure creates the
appropriate strongly typed objects.
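
Something like this rough sketch (type and column names are invented; it
needs System.Data and System.Data.SqlClient):

// The DAL sees only the untyped DataSet base class, so it needs no
// reference to the BLL assembly
public class CustomerDal
{
    private string connectionString; // e.g. read from config

    public void Fill(DataSet target)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT CustomerId, Name FROM Customers", conn);
            // Fill() asks the dataset itself to create its rows, so a
            // typed dataset ends up holding strongly typed rows
            adapter.Fill(target, "Customers");
        }
    }
}

// In the BLL - CustomersDataSet is a designer-generated typed dataset
CustomersDataSet ds = new CustomersDataSet();
new CustomerDal().Fill(ds);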

Good luck..

Nigel

Stuart Hilditch said:
I did originally use interfaces in the DAL, but it resulted in so much
messy code (some of my objects have 25+ properties) that I decided it was
far cleaner to simply reference the object from the ORL.

- Stu

John B said:
Stuart said:
Hi all,

I am hoping that someone with some experience developing nTier apps can
give me some advice here.

I am writing an nTier web app that began with a Data Access Layer (DAL),
Business Logic Layer (BLL) and User Interface Layer (UIL).

The problem I found with this was circular referencing...

My objects would be defined in the BLL, so let's say for example that I
want to instantiate a new BLL.Customer object in the UIL, and then run
Customer.AddCustomer() which would in turn pass the object into the DAL,
let's call this method DAL.AddCustomer(BLL.Customer myCustomer) which
would insert into the DB.

The problem is that the BLL needs to reference the DAL and the DAL needs
to reference the BLL (to receive the custom business object), hence a
circular referencing error. I understand that I could turn this custom
object into some sort of generic object[] or collection and pass that
instead, or alternatively pass the field values one by one as method
parameters (not practical with 10+ values).

What I did was to create a 4th 'vertical' layer which I called the ORL
(Object Reference Layer), the purpose of which is to allow all other
layers to reference the same objects so they can be passed between
themselves without issues. The drawback is that for this to work
properly you need to have the objects themselves defined in the ORL, but
the methods defined statically in the BLL.

My question is this...

Is this good programming?

Obviously it would be ideal to have the object constructor and instance
methods declared in the same class, but I can't seem to get this to work
effectively any other way.

I would appreciate any advice.

- Stu
Google Dependency Inversion Principle and Interface Segregation
Principle.

I am by no means an expert, but...
In the past I have created another assembly containing an interface that
both the DAL and BLL know about.
This interface pretty much only contains properties for all the db table
columns.
Thus, the DAL classes know about the interface and can interact with that,
and the BLL classes implement this interface (see the sketch below).
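
A minimal sketch of that arrangement (assembly and member names invented):

// Common.dll - referenced by both the DAL and the BLL
public interface ICustomer
{
    int CustomerId { get; set; }
    string Name { get; set; }
    // ...one property per table column
}

// DAL.dll - depends only on Common.dll
public class CustomerDal
{
    public void AddCustomer(ICustomer customer)
    {
        // build the INSERT from the interface's properties;
        // no reference to the BLL assembly is needed
    }
}

// BLL.dll - the business object implements the shared interface
public class Customer : ICustomer
{
    private int customerId;
    private string name;

    public int CustomerId
    {
        get { return customerId; }
        set { customerId = value; }
    }

    public string Name
    {
        get { return name; }
        set { name = value; }
    }
}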

HTH
JB
 
Thanks very much Nigel,

That's great feedback and it's given me something to think about. I do agree
that being limited to static methods only is a price to pay. I think I might
start looking into transferring these objects from the BLL to the DAL by
way of a dataset; it sounds like a practical option.

Thanks again!

Nigel Norris said:
<snip>
 
Stuart Hilditch said:
Hi Michael,

Sounds like you're doing the same thing I am, only using datasets rather than
business objects. I would have thought that your application would take a
massive performance hit by using datasets in this way, especially if you use
a lot of objects. I suppose caching could help with that (if it's an option).

- Stu

Hi Stu,

Why would you think there would be a massive performance hit? Where do you
think that would occur? I know datasets can be slow if they are loaded with
thousands of records, but I'm designing my app to avoid scenarios like
that.

I chose to use datasets for several reasons:

1.) They pass easily through web services. You can pass custom types,
but it requires manually changing the Reference.cs file each time.
2.) They have built-in support for remembering which rows get added,
edited, deleted, etc. This is especially true in a data grid. If you pass
your data object to a data grid, how do you know which rows got updated?
3.) They allow for very easy data binding on Windows forms.
4.) You can give custom sql scripts to a data adapter, then let it do all
of the data manipulation work for you. There is no need to loop through
each record manually and figure out whether it needs to be added, deleted,
etc.
5.) You can use DataSet1.GetChanges() to pass only the changes to
your web service and then to your data layer (see the sketch after this
list).
...
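
For instance, a rough sketch of point 5 (service and adapter names are
invented; GetChanges() returns null when nothing has changed):

// Client side: send just the added/modified/deleted rows
DataSet changes = customersDataSet.GetChanges();
if (changes != null)
{
    customerService.UpdateCustomers(changes);
    customersDataSet.AcceptChanges(); // mark local rows clean after the save
}

// Inside the data layer: a configured data adapter issues the right
// INSERT/UPDATE/DELETE for each row, driven by that row's RowState
adapter.Update(changes.Tables["Customers"]);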

Thanks,

Mike
 
Hi Mike,

I would tend to think they were better in cases where you use thousands of
records.

I agree they are very handy, but there is a real instantiation and marshalling
cost that is incurred every time you create a new object.

From
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnbda/html/BOAGag.asp

"High instantiation and marshalling costs. DataSets result in the creation
of several subobjects (DataTable, DataRow, and DataColumn), which means that
DataSets can take longer to instantiate and marshal than XML strings or
custom entity components. The relative performance of DataSets improves as
the amount of data increases, because the overhead of creating the internal
structure of the DataSet is less significant than the time it takes to
populate the DataSet with data."

I would not use them for individual objects, but for large sets of data they
are great.

When retrieving data from a datastore, a DataReader is the way to go if
performance is a consideration. Check out this benchmark...

http://www.devx.com/vb2themax/Article/19887/1954?pf=true
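
For what it's worth, here's roughly how I'd materialise objects from a
DataReader (class and column names are invented; it needs System.Collections
and System.Data.SqlClient):

public static ArrayList GetCustomers(string connString)
{
    ArrayList customers = new ArrayList();
    using (SqlConnection conn = new SqlConnection(connString))
    {
        SqlCommand cmd = new SqlCommand(
            "SELECT CustomerId, Name FROM Customers", conn);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // copy each row straight into a business object
                Customer c = new Customer();
                c.CustomerId = reader.GetInt32(0);
                c.Name = reader.GetString(1);
                customers.Add(c);
            }
        }
    }
    return customers;
}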

- Stu
 
Hi Stu,

I've read those benchmarks and it did give me some concern. However,
there's something I still don't understand about using custom entities to
store your data. If you bind a grid to your business object, how do you
know which rows get changed? How do you know which rows need to be added
and/or deleted?

Also, do you have to pass your business objects through a web service (I
do)? Web services aren't great for passing custom types like that...

Thanks,

Mike
 
Michael said:
Hi Stu,

I've read those benchmarks and it did give me some concern. However,
there's something I still don't understand about using custom entities to
store your data. If you bind a grid to your business object, how do you
know which rows get changed? How do you know which rows need to be added
and/or deleted?
All of that should be catered for by IBindingList.
I haven't actually implemented deletion, but I have done adds (rough sketch
below).
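
As a minimal .NET 2.0 sketch (assuming BindingList<T>, which implements
IBindingList, is an option for you), you can watch for adds and edits like
this:

BindingList<Customer> customers = new BindingList<Customer>();
customers.ListChanged += new ListChangedEventHandler(OnListChanged);

void OnListChanged(object sender, ListChangedEventArgs e)
{
    // e.ListChangedType says whether a row was added, changed, or deleted;
    // e.NewIndex identifies the affected row. Note that ItemChanged only
    // fires if Customer implements INotifyPropertyChanged.
    Console.WriteLine(e.ListChangedType + " at " + e.NewIndex);
}
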
Also, do you have to pass your business objects through a web service (I
do)? Web services aren't great for passing custom types like that...
I'm not sure.

<snip>

JB
 
Hi Mike,

I think it really depends on the context of the problem. Generally I use a
mix of business objects and datasets. I find that custom entities give me a
lot more control: you can use them as a collection, and there are a number of
interfaces that will give you more control over binding, etc.

I also use datasets when a large quantity of data is moving from my Data
Access Layer to my User Interface Layer relatively unchanged, but I would
usually implement caching in that case. Unfortunately I have not had any
experience with web services, but you can easily serialize business objects,
so I imagine it can be done (rough sketch below).
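
For example, a minimal sketch of serializing a business object the same way
ASP.NET web services do under the covers (assuming Customer has a default
constructor and public properties, which XmlSerializer requires):

// using System.IO; using System.Xml.Serialization;
XmlSerializer serializer = new XmlSerializer(typeof(Customer));
StringWriter writer = new StringWriter();
serializer.Serialize(writer, customer);
string xml = writer.ToString(); // what would travel over the wire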

Hope this helps.

- Stu
 
Stu,

It's true that business objects can be serialized and passed through a web
service. The problem is the default proxy class does not know about your
custom type. This means every time you update your web reference, you have
to go into the Reference.cs file and add a reference to your custom type.
That was definitely the case for .NET 1.1. I haven't verified yet whether or
not that has changed in 2.0.

Mike
 
