Thursday 26 November 2009

Modeling for Smart Ecosystem Architecture

In a new CBDI Journal report this month I have been looking at the modeling requirements to support our Smart Ecosystem Architecture (SEA) concept.

It is evident there are two major architectural patterns in play – service oriented and event driven. But in addition we need to recognize a new dynamic, which we might characterize as “post enterprise”, including cloud computing, Web 2.0 and so-called smart IT that encourages ecosystem collaborations over internal enterprise processes.

There is a lot of superficial (marketing) noise suggesting a close relationship between Event Driven Architecture (EDA) and SOA. In practice, the two domains are islands of automation. SOA is much more widely used, whereas EDA is still largely restricted to narrowly focused point solutions. This is changing, but slowly, as there are considerable complexities to deal with in an integrated world: not least event data availability, architecture contention and difficulties in testing. Of course, the solution to broader utilization is an integrated modeling approach.
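To make the contrast concrete, here is a minimal sketch of the two interaction styles in plain Java. All names here are hypothetical and purely illustrative: a service-oriented call is a directed request against a known contract, whereas an event-driven interaction decouples producers from consumers via publish-subscribe, which is precisely why the two styles tend to evolve as separate islands.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Service-oriented style: the consumer invokes a known contract directly
// and gets a response back (request-response).
interface OrderService {
    String placeOrder(String item);
}

// Event-driven style: producers publish, subscribers react;
// neither side knows about the other.
class EventBus {
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    void subscribe(Consumer<String> handler) { subscribers.add(handler); }

    void publish(String event) {
        for (Consumer<String> s : subscribers) s.accept(event);
    }
}

public class EdaVsSoa {
    public static void main(String[] args) {
        // SOA: direct, contract-bound invocation.
        OrderService service = item -> "order accepted: " + item;
        System.out.println(service.placeOrder("widget"));

        // EDA: the producer has no knowledge of who, if anyone, consumes the event.
        EventBus bus = new EventBus();
        bus.subscribe(e -> System.out.println("billing saw: " + e));
        bus.subscribe(e -> System.out.println("shipping saw: " + e));
        bus.publish("OrderPlaced:widget");
    }
}
```

The testing difficulty mentioned above falls out of the second half of this sketch: the publisher cannot know, at the point of publication, which effects its event will trigger.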

Similarly, cloud, Web 2.0 and smart IT are separate, narrowly focused domains, and capturing the requirements for each still tends to happen in isolation. However, to recognize the real opportunities in the smart ecosystem requires us to take a more coordinated view of modeling that spans all of these considerations.

CBDI has a vast amount of business modeling guidance and advice; much of it has pioneered thinking around modeling services, events, capabilities, metadata, dynamic business intelligence and ecosystems. See for example the CBDI report series on Business Modeling for SOA, which begins with the event-response concept at a high level as part of business modeling, and the report Event Driven Service Architecture, which examines the convergence of EDA and SOA.

In this new report, Modeling for Smart Ecosystem Architecture, I advise on how to integrate these different perspectives across the broader set of architectural views. We also explore the dual requirement to consider both events and services, and consider some of the meta model impacts.

On that last point it is interesting to note the lack of any 'standard' meta model for EDA, and hence for the EDA/SOA relationship. Whilst the OMG have been working on an RFP for an ‘Event Model and Profile’ (EMP), this has yet to be issued.
Today, the requirements are quite likely to be captured directly into a format prescribed by proprietary event management products used for the EDA implementation.
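Since there is no standard to point at, here is a minimal sketch in Java of the kinds of concepts such a meta model would need to carry. Every name here is hypothetical, drawn from no standard or product, and chosen only to illustrate how an event model might tie back to the service model.

```java
import java.util.List;

// Hypothetical sketch of core concepts a standard EDA meta model
// might define, and how they could bridge to SOA. Illustrative only.
class EventType {
    final String name;           // e.g. "OrderPlaced"
    final String payloadSchema;  // structure of the event data carried
    EventType(String name, String payloadSchema) {
        this.name = name;
        this.payloadSchema = payloadSchema;
    }
}

class EventSource {
    final String name;           // the thing that raises events
    final List<EventType> emits; // the event types it can produce
    EventSource(String name, List<EventType> emits) {
        this.name = name;
        this.emits = emits;
    }
}

// The event-response concept: an event type bound to the service
// operation that handles it, linking the EDA and SOA views.
class EventResponse {
    final EventType trigger;
    final String serviceOperation; // e.g. "FulfilmentService.dispatch"
    EventResponse(EventType trigger, String serviceOperation) {
        this.trigger = trigger;
        this.serviceOperation = serviceOperation;
    }
}
```

Even a toy model like this makes the convergence problem visible: the EventResponse association is exactly the piece that today tends to be buried inside proprietary product configuration rather than captured in a shared model.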

We are currently evaluating how we add the necessary concepts to support EDA and SOA convergence in the SAE Meta Model and our associated UML profile, and discussing how we may contribute to emerging standards activity in the same way as we have done with SoaML.

Tuesday 24 November 2009

The Shape of Business - Drivers for Smart Ecosystems

A report this week from the Confederation of British Industry (CBI) entitled The Shape of Business – The Next 10 Years provides some useful insight into emerging business drivers that reinforce our concepts of smart ecosystems.

Looking at some of the key headlines in the report, you can see the increasing need for organisations to develop ecosystems and add smart behaviour.

Movement to a more collaborative business model. The recession has made businesses much more aware of the complexities and interdependencies in their operations, their financing, supply chains and customers, but they are still not able to fully assess or capture these in their business planning. To gain greater control of these uncertainties, businesses will seek to ‘simplify’ their operations and will enter into more partnerships and joint ventures. In particular, this will be important for businesses moving to a ‘core plus periphery’ model.

= a need to develop ecosystems
= the need for a smarter supply chain, hence smart ecosystem. Though supply chain optimization is hardly new, the emphasis on this is clearly going to grow. There are also additional ways of thinking about the supply chain – e.g. as a source of finance, not just ‘goods’. i.e. don’t invest in stock, and certainly don’t lend money to buy stock, but rely more on the support of participants in the supply chain and on its optimisation to lower inventory and provide 'just in time' fulfilment.

Rationalization to the Core. The recession accelerated the need to address inefficiencies and non-core activities across the enterprise. It has also provided the stimulus for companies to re-think themselves and re-evaluate their future.

= a need for greater collaboration across an ecosystem consisting of an internal core and an external periphery (what Moore might term context)

Technology will enable new ways of working. Businesses will increase their use of social networking techniques to solve problems – many more companies will use Facebook, Twitter and other web 2.0 developments

= making sense out of what is going on in complex social networks is going to require a lot of ‘smart’. It isn’t going to save money if it requires an army of people to monitor and participate in social networks

There were also some interesting comments that identify the need for agility.

For example, there is a current trend of “localism” in some organisations or industry sectors, relocating certain supply chain activities back to the UK. What is evident is that the decisions about what should be done locally or globally, or what should be in-house or outsourced, will be fluid. It is not a one-off decision but one of constant re-evaluation in the face of prevailing conditions. Equally, large organisations may find that the decision varies across different product lines, with no ‘one size fits all’ solution. As such, organisations will need to be very agile if they are to quickly capitalise on changes in those conditions.

Of course it is very difficult to predict what will happen across the next 10 years. How many predicted the current situation a decade ago?

And there is of course the need to survive 2012 first!

Sunday 15 November 2009

Time to Eat the Programmer?

One of my colleagues asked the question of whether the work that we do (at Everware-CBDI) could ride the ‘green’ bandwagon. After all, he suggested, some of our key services in helping customers to transform their existing systems, and reuse software components and software services via CBD and SOA, might be considered ‘green’.

My first reaction was “that’s a bit of a stretch”. However, some of the news recently that has accompanied the publication of the book “Time to Eat the Dog” by Robert and Brenda Vale highlights just how complex the ‘green’ issue is.

For example, the authors point out that the ecological footprint of keeping a medium-sized dog as a pet is in fact greater than that of driving a large SUV 10,000 km a year. This presents a double ecological challenge for the colleague who raised the question, as he has both a Labrador and an SUV.

It struck me therefore that if those authors can determine the effect each pet has on the environment, then similarly we ought to be able to determine the ecological impact of each line of code produced by a programmer.

I don’t intend to do so here (determining the equation would itself be a waste of valuable resources – well my time at least). But it does illustrate just how complex the issues of all things ‘green’ truly are.

Most ‘green’ IT efforts today are focused on reducing energy consumption by using more efficient computers. Similarly, many of the ‘green’ messages focused on the population at large are also focused on reducing energy consumption by using more efficient means of transport.

But if, as demonstrated by the ecological impact of pets, we need to consider much more complex factors in order to truly establish our impact on the environment, then I guess it is just as valid to ask not only how much electricity a CPU uses, but also what the ecological impact of programming, or of other IT development activities, might be.

Perhaps organizations that are serious about ‘green’ IT ought to better consider the options of software ‘recycling’ before they simply create yet another new system. No longer is it just an economic decision of whether it is cheaper to recycle components of an existing system or build a new one; now it is an ecological decision too.

Yes, it is still a “bit of a stretch”. But nevertheless, not perhaps quite as extraneous as you might have first thought.

(you can read more about “Time to Eat the Dog” via New Scientist, BBC News, Guardian Newspaper)

Tuesday 3 November 2009

Using Service Component Architecture (SCA) and SoaML

Service Component Architecture (SCA) brings the formality of SOA contracts and endpoint separation into the implementation layer. For the many organizations undertaking modernization efforts, the standardization of componentization and SOA enablement is an important issue. Whilst SCA usage may not be widespread, it is likely that growing demand for rationalization of the implementation layer will lead to increased adoption.
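As a rough illustration of what SCA formalizes, here is a plain-Java sketch of the separation of a service contract from its implementation, with dependencies wired externally rather than resolved inside the component. No SCA runtime or SCA APIs are used here, and all names are hypothetical; this shows only the underlying idea, not the SCA programming model itself.

```java
// Plain-Java sketch of the separation SCA formalizes: contracts
// (interfaces), implementation components, and external wiring of
// references. Names are illustrative only.
interface QuoteService {
    double quote(String product);
}

interface TaxService {
    double taxOn(double amount);
}

// The component depends only on the TaxService contract; the concrete
// implementation is injected ("wired") from outside, much as an SCA
// composite would declare the wiring rather than the component
// resolving it internally.
class QuoteComponent implements QuoteService {
    private final TaxService tax; // a "reference" in SCA terms
    QuoteComponent(TaxService tax) { this.tax = tax; }

    public double quote(String product) {
        double base = 100.0; // illustrative base price
        return base + tax.taxOn(base);
    }
}

public class Wiring {
    public static void main(String[] args) {
        // The "composite": wiring lives outside the components.
        TaxService flatTax = amount -> amount * 0.2;
        QuoteService quotes = new QuoteComponent(flatTax);
        System.out.println(quotes.quote("widget")); // prints 120.0
    }
}
```

The value SCA adds over hand-rolled wiring like this is that the assembly is declared in a standard, portable form, which is exactly the rationalization of the implementation layer discussed above.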

In a new report I look at SCA, consider how it relates and maps to SoaML, and provide guidance for integrating it into the broader SAE/SoaML picture.

My conclusion is that in some respects the SCA and SoaML initiatives represent something of a missed opportunity. Though the respective groups will claim they address different requirements or audiences, the reality is that it is hard to see why they couldn’t have been based on common concepts where they overlap. One could have extended the other instead of re-inventing. Then the inevitable additional effort, and the loss of integrity through the mappings and transformations from one to the other discussed in this report, could have been avoided. More so given that some organizations participated in both initiatives.

On the positive side, SoaML to SCA transformation tools do exist, and the mapping provided between SCA, SoaML, and CBDI-SAE in the report is at least at a high level relatively straightforward.

SCA is an elegant solution, but somewhat limited by its lifecycle scope and its dependence on the SCA runtime in comparison with SoaML. However, it does offer portability across SCA-compliant platforms, and that is of significant benefit.

SCA does seem to have reached a bit of an impasse. It is there, it works, but it isn’t clear whether adoption is growing, either by end users or in support from more vendors. If you are using or planning on using SCA, please let us know in our CBDI LinkedIn group, where we have created a discussion topic.

An Update to the Example Model based on Version 3 of the CBDI-SAE Meta Model

The October 2009 CBDI Journal contains a follow-up to the previous article presenting the draft version of the SAE Meta Model V3. In this new report John Butler provides an update to the UML Profile for the CBDI-SAE Meta Model V3, focusing on the core areas and those that illustrate alignment with SoaML. Given that worked examples are the best way to understand a meta model, John has updated the example model based on the fictional company Springfield Parcels, Inc. This should allow readers the opportunity to compare and contrast the version 2 meta model with that of version 3.

Monday 2 November 2009

SOA - From Vision to Practice

I have just been reading the September 2009 issue of the Microsoft Architecture Journal which is focused on SOA. The foreword starts off by referring back to the report 'Understanding SOA' that David Sprott and I wrote for the very first issue back in 2004. As editor Diego Dagum puts it, "the main difference with the article that we published in the early days is that, this time, thoughts have emerged as a consequence of a practice; in 2004, thoughts had emerged as a consequence of a vision."

The inevitable "SOA is Dead" discussion is raised in the report Model-Driven SOA with "Oslo". But that report and the others in the journal really serve as reminders that SOA is in reality still in its infancy, and that we are still developing best practice and, in Microsoft's case at least, the modeling tools to support it.

Encapsulating best practice in modeling tools is essential if we are to

  • move beyond perceptions by some that "SOA is a technology"
  • demonstrate business value by making the SOA investment more relevant and 'visible' to business sponsors

That's a key reason why we ourselves have invested so much in our CBDI-SAE Meta Model for SOA and the associated UML profile that enables users to move from business models to implementation in a consistent and traceable manner.

It would be interesting to see how we might implement that in Microsoft Oslo. Oslo offers the promise of "the model is the code". Not surprisingly, Microsoft do have a slightly myopic view of the world that typically extends only as far as their own technologies. Though it may not be a business modeling language, Service Component Architecture (SCA), which I have been looking at in more detail this month, does already provide a model-driven approach to modeling the Service Implementation Architecture, which is a good level of abstraction at which to consider "the model is the code". Unfortunately, SCA isn't implemented on the Microsoft platform. It's a pity that wheel is going to be reinvented, I guess.

Nevertheless, it is good to see Microsoft continue to develop the SOA vision and turn it into practice, and in particular to embed that vision deep into their platform and tools, such that SOA becomes just 'business as usual'.