9
votes

I designed the data access portion of our framework so that every time a business object (BO) needs to interact with the database, it has to open a connection, invoke the data access layer (to execute the query), and then close the connection. If it needs to run in a transaction, it opens the connection, begins the transaction, invokes the data access layer (to execute the query), commits the transaction, closes the transaction, and finally closes the connection.
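
In rough pseudocode, the transactional case looks something like this (dataAccessLayer, Execute, and query are just placeholders for my own code):

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlTransaction transaction = connection.BeginTransaction())
    {
        // the data access layer runs the query on this connection/transaction
        dataAccessLayer.Execute(query, connection, transaction);
        transaction.Commit();
    } // disposing the transaction ends ("closes") it
} // disposing the connection closes it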

I did it this way in the mindset of "open late, close early"… but what if I needed to call other BOs to submit data in a single transaction? Is there a better way to handle opening and closing connections as well as working with transactions?

I'm a rookie in designing application architecture, so I hope I'm not doing this wrongly… any help is appreciated.


5 Answers

5
votes

If a given business object needs to execute various methods in a transaction, use a TransactionScope like so:

using ( var transactionScope = new TransactionScope() )
{
    this.Save();
    childObjA.Save();
    childObjB.Save();
    childObjC.Save();
    childObjD.Save();

    transactionScope.Complete();
}

If any of the objects throws an exception, Complete() is never called and the transaction is rolled back when the scope is disposed.

See the MSDN reference page for TransactionScope for more.

4
votes

When higher-level abstractions depend on lower-level abstractions (such as business logic classes depending on data connections), it's common to supply the lower-level abstractions through the constructor. The technique is called constructor injection:

public class OrderService
{
    private SqlConnection connection;

    public OrderService(SqlConnection connection)
    {
        if (connection == null)
            throw new ArgumentNullException("connection");
        this.connection = connection;
    }

    // Other methods
}

This then allows you to write code against the services similar to the following:

using (TransactionScope tsc = new TransactionScope())
using (SqlConnection connection = new SqlConnection(...))
{
    connection.Open();
    OrderService os = new OrderService(connection);
    os.ProcessOrder(myOrder);
    ShippingService ss = new ShippingService(connection);
    ss.ShipOrder(myOrder);
    tsc.Complete();
}

This is most likely what you want in the end: the ability to share one connection among many services.

This also helps to decouple your services from the implementation details of the data connection. That way, if you want to do something like change the connection settings under certain circumstances, you don't have to dig into the details of 50 different services; you only have to change the one line of code that creates the connection.

One more thing: If you're going to use the TransactionScope, make sure to add Transaction Binding=Explicit Unbind to the connection string, otherwise it's actually possible to end up with inconsistent data if a transaction times out.
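
For example (server and database names below are just placeholders):

string connectionString =
    "Data Source=myServer;Initial Catalog=myDatabase;" +
    "Integrated Security=SSPI;Transaction Binding=Explicit Unbind";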

2
votes

As mentioned by others, TransactionScope is the way to go.

If you are using SQL Server 2008 and .NET 3.5, I would modify the design to have the business object control the transaction and leave the opening and closing of the connection to the data layer.

With connection pooling on, you will not actually incur the overhead of opening a physical database connection, and your connections will only be open while performing actual work. Since (I assume) you have SQL Server 2008 with .NET 3.5, your transaction will not escalate to a distributed transaction (unless you open multiple connections at the same time), so you get the best of both worlds.

Then you could write your business object like this:

using (TransactionScope transactionScope = new TransactionScope())
{
    DataObject dataObject = new DataObject();
    dataObject.UpdateQuantity(...);

    ShippingManager shippingManager = new ShippingManager();
    shippingManager.ShipOrder(...);

    transactionScope.Complete();
}

This avoids having to pass connection strings around to all business objects and makes coordinating transactions easy.
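
For example, a data layer method might look something like this sketch (the connection string, table, and column names are placeholders); a SqlConnection opened inside an ambient TransactionScope enlists in that transaction automatically:

using System.Data.SqlClient;

public class DataObject
{
    private readonly string connectionString = "..."; // placeholder

    public void UpdateQuantity(int productId, int quantity)
    {
        // The connection is opened and closed here, inside the data layer,
        // and enlists in the ambient transaction (if any) when opened.
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "UPDATE Products SET Quantity = @quantity WHERE ProductId = @productId",
            connection))
        {
            command.Parameters.AddWithValue("@quantity", quantity);
            command.Parameters.AddWithValue("@productId", productId);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}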

Update

The beauty of System.Transactions is that all of the transactions are managed for you irrespective of the connection that you are using. You just declare a TransactionScope, and all database access within that TransactionScope will occur within a single transaction (unless you request otherwise with different TransactionScope settings).
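
For example, the TransactionScopeOption settings let you control how a scope relates to the ambient transaction:

// Required (the default): join the ambient transaction, or create one if none exists.
using (var scope = new TransactionScope(TransactionScopeOption.Required))
{
    // database work here shares the ambient transaction
    scope.Complete();
}

// RequiresNew: always start an independent transaction.
using (var scope = new TransactionScope(TransactionScopeOption.RequiresNew))
{
    scope.Complete();
}

// Suppress: run outside any ambient transaction.
using (var scope = new TransactionScope(TransactionScopeOption.Suppress))
{
}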

In the past (SQL Server 2005 and .NET 2.0), if you opened and closed a connection and then opened and closed another connection (even with the same connection string), the transaction was promoted from a lightweight transaction to a distributed transaction. This was undesirable because performance suffers (communication with MSDTC is out of process, and the two-phase commit protocol adds overhead) and MSDTC can be a pain to configure in many production environments (firewalls and security).

With SQL Server 2008 and .NET 3.5, they added the ability to avoid this promotion when opening and closing multiple connections with the same connection string within a single transaction. For a really good explanation of what they did, see Extending Lightweight Transactions in SqlClient.
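
In other words, a sketch like the following stays a lightweight transaction on SQL Server 2008 with .NET 3.5, but would have been promoted to a distributed transaction on the earlier versions (the connection string is a placeholder):

using (TransactionScope transactionScope = new TransactionScope())
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        // first unit of database work
    } // connection closed and returned to the pool

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        // second unit of database work, same connection string
    }

    transactionScope.Complete();
}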

Update 2

Transactions with Oracle 10g will function properly with TransactionScope. And it looks like ODP.NET supports Lightweight Transactions (which is nice). Unfortunately, I think that the promotion to a distributed transaction will occur with the closing and opening of connections.

If you wish to avoid a distributed transaction you could pass the connection to every method call/Business Object. If you don't want to pass a connection around, you could use a ConnectionScope class which keeps the connection open on the thread. An alternative to that would be to use Enterprise Library 3.0 (and above) Data Access Application Block. The Data Access Block can detect that a transaction is in progress and use the same connection to avoid a distributed transaction.
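
ConnectionScope is not a framework class; a minimal sketch of the idea (one connection kept open per thread) might look like this:

using System;
using System.Data.SqlClient;

public sealed class ConnectionScope : IDisposable
{
    [ThreadStatic]
    private static SqlConnection current;

    private readonly bool ownsConnection;

    public ConnectionScope(string connectionString)
    {
        if (current == null)
        {
            current = new SqlConnection(connectionString);
            current.Open();
            ownsConnection = true;
        }
    }

    // Data access code asks the scope for the thread's open connection.
    public static SqlConnection Current
    {
        get { return current; }
    }

    public void Dispose()
    {
        if (ownsConnection && current != null)
        {
            current.Close();
            current = null;
        }
    }
}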

2
votes

Sounds like you've got the right idea. If multiple BOs need to be involved, then one of them needs to be a "controller" — it should open and close the connection, and pass it to the others. Or some "wrapper" object could handle the connection and pass it to each of the BOs. Your BOs may need to be designed to operate both on their own (handle their own connection), and to accept an existing connection from outside.
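
For example, each BO could expose two overloads, one that manages its own connection and one that accepts a connection owned by the caller (the SQL text and names below are only illustrative):

public void Save()
{
    // Standalone: the BO owns the connection.
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        Save(connection);
    }
}

public void Save(SqlConnection connection)
{
    // Controlled: the caller (the "controller" BO or wrapper) owns the
    // connection and whatever transaction it participates in.
    using (SqlCommand command = new SqlCommand("UPDATE ... SET ...", connection))
    {
        command.ExecuteNonQuery();
    }
}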

1
votes

You are probably looking for the Unit of Work pattern and Registry pattern. These two patterns can work in concert to separate the concerns of finding business objects and tracking them for later commit to your data store as a transaction.
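
A minimal sketch of the Unit of Work contract (the names here are illustrative, not from a particular library):

public interface IUnitOfWork
{
    void RegisterNew(object entity);       // insert on commit
    void RegisterDirty(object entity);     // update on commit
    void RegisterDeleted(object entity);   // delete on commit

    // Pushes all registered changes to the data store in a single transaction.
    void Commit();
}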

I would also look into Object Relational Mapping, or ORM. An ORM is a higher-level composition of the Unit of Work, Registry, Persistence Ignorance, and other patterns that provides a very clean separation of your business logic from your persistence logic. Using an ORM, you can generally eliminate the need to write stored procedures, build a custom DAL, etc. The ORM takes care of your persistence concerns for you, allowing you to focus on the business logic that needs to be done.

Since you are using C# and .NET, I would look into Entity Framework (v4, don't use v1) or LINQ to SQL. Both are OR mappers that come with the .NET Framework from v3.5 onwards. LINQ to SQL is a very simple and well-tooled ORM that should get you going very quickly. Entity Framework is a much richer ORM that is also very well tooled (better than LINQ to SQL) and offers considerably more functionality. There are also third-party ORMs that can do the job, including a free one called NHibernate. While it is not as well tooled as the Microsoft ORMs, NHibernate is a very mature open-source ORM with a large community following.
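
For instance, with LINQ to SQL the DataContext acts as the unit of work; a sketch (OrdersDataContext and the Order entity are hypothetical generated types, and orderId/newQuantity are placeholders):

using (var db = new OrdersDataContext())
{
    Order order = db.Orders.Single(o => o.OrderId == orderId);
    order.Quantity = newQuantity;

    // SubmitChanges sends every pending change to the database in one transaction.
    db.SubmitChanges();
}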

If an ORM is not a possibility, then I would look into Unit of Work, Registry (or Repository), Persistence Ignorance, Separation of Concerns, Single Responsibility, and other related patterns.