Sep 18th, 2011, 11:15 AM
Complex architecture for simple application
As a consultant, I am responsible for designing the architecture of an application for an external company. The requirements for this application are rather simple and the whole thing could easily be solved with a basic web application, one or two incoming web services and a few outgoing document channels.
Things get more complicated because of two non-functional requirements:
1. Said company mandates that all internal applications be offered through an enterprise portal (for UI, security and technical uniformity)
2. Said company mandates that all applications be built using SOA principles so that services may be eventually published on an ESB and reused.
The architecture adapts easily to the portal requirement. The presentation layer will be built as portlets, integrated with the portal theme, and portal security will be reused. No big deal.
The SOA requirement is another story. Reusable services have not yet been identified. The way I see it, there are a few options:
1. Business logic is deployed on the portal and co-located with the presentation layer. No services are exposed and this decision is deferred.
2. Business logic is deployed on a separate server. An API is designed and all services are exposed using a closed protocol (e.g. RMI or Hessian). For services that need to be eventually reused, a SOAP API may be added on top of these services.
3. Business logic is deployed on a separate server. A SOAP API is designed and all services are exposed using this mechanism.
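To make option 2 concrete, here is a minimal sketch (all class and method names are invented for illustration, not from the actual project): the business logic lives behind a plain Java interface that can be exported over RMI or Hessian today, and a SOAP facade can later delegate to the very same interface without touching the logic.

```java
// Hypothetical service API for option 2; all names are illustrative.
public interface InvoiceService {
    String statusOf(long invoiceId);
}

// Business logic, deployed on its own server, exported over a closed
// protocol (RMI, Hessian, ...) by the remoting layer of choice.
class InvoiceServiceImpl implements InvoiceService {
    @Override
    public String statusOf(long invoiceId) {
        // Real logic would hit the persistence layer; stubbed here.
        return invoiceId > 0 ? "OPEN" : "UNKNOWN";
    }
}

// A SOAP API added later is just a thin facade over the same interface.
class InvoiceSoapFacade {
    private final InvoiceService delegate;

    InvoiceSoapFacade(InvoiceService delegate) {
        this.delegate = delegate;
    }

    public String statusOf(long invoiceId) {
        return delegate.statusOf(invoiceId);
    }
}
```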
I want to avoid building something too complicated. I have lived through projects with business delegates, remote facades and DTOs where every single change required modifying several layers. Yet, it feels as if this SOA requirement forcibly pushes me in that direction.
Update: The more I think about it, the more I realize that the complexity arises from the need to design a remoting API. Of course this requires creating interfaces for services, but what about the exchanged entities? Either I go the DTO way and end up with two parallel object hierarchies (one for DTOs and one for the actual entities), or I go the interface way and declare interfaces for all entities that need to transit across servers. Either way, this brings up a whole new set of problems and we will end up writing lots of boilerplate code. And I thought we were through with that era...
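To show what the "two parallel hierarchies" complaint looks like in code, here is a minimal sketch (entity and mapper names are invented): every entity crossing the wire gets a mirror DTO plus hand-written mapping code, and every field change must be made in three places.

```java
// Domain entity (would normally be a JPA-mapped class).
class Customer {
    private final long id;
    private final String name;

    Customer(long id, String name) {
        this.id = id;
        this.name = name;
    }

    long getId() { return id; }
    String getName() { return name; }
}

// Parallel DTO for the remoting layer: the same fields, duplicated by hand.
class CustomerDto implements java.io.Serializable {
    public long id;
    public String name;
}

// The boilerplate mapper every remote call has to go through.
class CustomerMapper {
    static CustomerDto toDto(Customer c) {
        CustomerDto dto = new CustomerDto();
        dto.id = c.getId();
        dto.name = c.getName();
        return dto;
    }
}
```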
What would be the best (or least bad) way to design this?
Last edited by spiff; Sep 18th, 2011 at 12:08 PM.
Sep 20th, 2011, 12:40 AM
Given the non-functional requirements, building loosely coupled business services should be an important objective. I would therefore recommend XML-based services built using Spring's contract-first approach. This top-down approach helps ensure that the services are genuine business services, which is rarely the case when existing Java classes are simply exposed as web services. Using Spring also means you could shift from SOAP to POX (plain old XML) without code changes. These services could be consumed by the portal server and, at a later stage, exposed via the ESB with minimal or no changes.
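As an illustration of contract-first, the schema comes before any Java code: the endpoints are coded against XML messages like the ones below, and the WSDL is generated from the schema. This fragment is an invented example; all namespaces and element names are illustrative only.

```xml
<!-- Hypothetical message contract; the service is designed from this
     schema rather than from existing Java classes. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/orders"
           xmlns:ord="http://example.com/orders"
           elementFormDefault="qualified">
  <xs:element name="OrderStatusRequest">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="orderId" type="xs:long"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
  <xs:element name="OrderStatusResponse">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="status" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```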
Hessian, HttpInvoker, etc. are remoting protocols, but they are not loosely coupled: any change to the interface has an impact on every client using it.
The portal server and the business services can be co-located, or they can be deployed on separate servers. My recommendation would be to co-locate them, since keeping them on separate servers adds remote-call overhead.
Since XML is being used there will be no need for DTOs, although you will still need to populate domain objects from XML and vice versa. Both Spring Web Services and Spring's POX support integrate with XML marshalling frameworks (JAXB, Castor, XStream, etc.), which makes this easier.
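For a feel of what those frameworks absorb, here is the domain-object-to-XML step written by hand with the JDK's own DOM and TrAX APIs (class and element names are invented); a marshalling framework generates the equivalent of this for every type.

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

class OrderStatus {
    final long orderId;
    final String status;

    OrderStatus(long orderId, String status) {
        this.orderId = orderId;
        this.status = status;
    }

    // Hand-written marshalling: the boilerplate a framework would absorb.
    String toXml() {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("OrderStatusResponse");
            Element id = doc.createElement("orderId");
            id.setTextContent(Long.toString(orderId));
            Element st = doc.createElement("status");
            st.setTextContent(status);
            root.appendChild(id);
            root.appendChild(st);
            doc.appendChild(root);

            StringWriter out = new StringWriter();
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new IllegalStateException("Marshalling failed", e);
        }
    }
}
```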
Using an XML-based approach rather than Hessian, etc. might have a negative impact on performance, but I don't expect it to be substantial. For a slight performance hit, you will be able to address the non-functional requirements.