Tuesday, April 21, 2009

Oracle's Java

Normally when I refer to 'Oracle' on this blog, I mean the 'Enterprise IT Oracle'. This time, however, I am referring to Larry Ellison's Oracle Corporation, because of its acquisition of one of the significant technologies of the 21st century, viz. the Java programming language and its associated paraphernalia.

This is BIG. A large chunk of enterprise IT these days is built on Java technology, which is assumed to be a 'de facto' and 'open' standard. With Oracle's acquisition of Sun, the openness of Java is now in question.

What should organisations with large Java investments do? IBM is one of them: most of IBM's Internet stack under the WebSphere brand is based on Java. What would IBM do?

What would other hardware vendors like HP and Dell do? Do we see three hardware blocks emerging: Sun/Oracle, HP/Microsoft and IBM? Where would Cisco/Google servers fit in this scheme of things?

Continuing this analogy, would we see three software blocks emerging, viz. Java, .NET and possibly IBM's own Java-like technology (maybe a rehashed EGL with its own byte code)?

Will cloud providers be bound to one of these blocks? If these three blocks start erecting Chinese walls around themselves in order to lock customers in, how does one allow seamless movement from one 'cloud' provider to another?

What would be the de-risking strategy for enterprises to avoid lock-in within one of these blocks, even in an on-premise scenario? Does MDA sound attractive now? Is it time that enterprises started focusing on separating their intellectual property from their execution platforms?

Interesting times. Time to activate your enterprise IT Oracle and see which parts of your Enterprise Architecture need a re-look.

Friday, February 27, 2009

Of TLAs and FLAs

In my previous organisation, every employee had a three letter acronym (TLA), made from the employee's first, middle and last names, instead of an employee number. There was an anecdote about it as well. The company's then chief, who is also called the "Father of the Indian Software Industry", had actually ordered a four letter acronym (FLA). But the software used to generate those acronyms produced a nasty one for the chief. (If you know his name, you can imagine what that four letter word would have been.) So he ordered it changed to a three letter one. (BTW, many happy returns of the day, Mr. Kohli.)

The trend in IT appears to be going in reverse. IT has always had a lot of TLAs: ERP, SCM, CRM, EAI, BPM and SOA, to name a few you may have come across. Of late, however, the trend is moving towards FLAs, what with SaaS, PaaS and so on. Yet the most enduring acronym, the one that has survived the test of time, is neither a three letter one nor a four letter one. It is actually a five letter one: RDBMS.

The reason it has survived this long is that it is more than an acronym. It is a well-thought-out piece of technology backed by solid science, not just a label coined by sales and marketing folks or an industry analyst. The technology has proven easy to standardise, extensible, and capable of serving a vast array of requirements, some of which were not even envisaged when it was initially developed.

Sadly, the same cannot be said of all the technologies you see around these days. Many of them are rehashes of old ideas in new packaging. They rely on finding the right audience at the right time to proliferate, and they thrive on the inherent problems they carry to generate more revenue for their owners.

Enterprise architects need the vision to see through the marketing fluff and get down to the bare bones of the technologies they are going to employ. Analysts can help to an extent, but their generic analysis may not be completely applicable to your situation. You need to equip your 'Oracle' function with this capability.

Sunday, December 02, 2007

Model driven development

Todd has asked a few questions about using models during the development of software. Though his questions are asked from a business process modeling perspective, they apply to model driven development as a whole.

Since I have considerable experience in this area, I would like to comment.

In my opinion, modeling does not negate the need for continuous integration or testing. Unless one can prove the models correct with respect to the requirements, using theorem provers or similar technologies, testing is a must. (Writing those verifiable requirements would take you ages, though.) One also needs to define an appropriate unit in the model driven world for large enterprise-class developments, to allow for parallel development. Continuous integration is one of the best practices one would not want to lose when multiple units are involved.
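
To make the point concrete, here is a minimal, self-contained sketch (all names hypothetical, using JUnit 5) of what testing a generated unit in a continuous integration pipeline might look like. The nested Invoice class stands in for an artifact emitted by a model-to-code generator:

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    class GeneratedInvoiceTest {

        // Stand-in for a class emitted by the model-to-code generator.
        static class Invoice {
            private final double amount;
            Invoice(double amount) { this.amount = amount; }
            double totalWithTax(double rate) { return amount * (1 + rate); }
        }

        @Test
        void generatedArtifactStillNeedsRegressionProtection() {
            // A change to the model, or to the generator itself, can break this
            // just as easily as a hand edit to source code would.
            assertEquals(110.0, new Invoice(100.0).totalWithTax(0.10), 1e-9);
        }
    }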

We had defined a full-fledged model driven development methodology, with an elaborationist strategy, for developing enterprise-class components. We modeled the data and behaviour of a component as an object model, which was then elaborated with business logic and rules before being converted into deployable artifacts. We did it this way because business logic and rules were considered too procedural to be abstracted in any usable modeling notation, but that has no bearing on the discussion that follows. The methodology allowed for continuous integration during all phases of development. We defined the component as the unit for build and test; these units could be version controlled and tested as units. Since it was a complete software development methodology, the same models were refined from early analysis to late deployment. The component as a unit, however, made sense only during the build and test phases. For requirements analysis and high level design, different kinds of units were required, because during analysis and design different roles access these artifacts, and their needs differ from those of the people who build and test.
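
As a minimal sketch of the elaborationist flow (class names hypothetical): the generator emits a skeleton from the object model, and a handwritten elaboration supplies the business logic that was considered too procedural to model:

    // Generated from the object model: data and operation signatures only.
    abstract class CustomerBase {
        protected String id;
        protected String name;
        public abstract boolean isCreditWorthy(); // behaviour declared in the model
    }

    // Handwritten elaboration: the business logic and rules layer.
    class Customer extends CustomerBase {
        @Override
        public boolean isCreditWorthy() {
            // A placeholder rule; the real rule lived in the elaboration, not the model.
            return name != null && !name.isEmpty();
        }
    }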

Lesson 1: Units may differ during different phases of the life cycle. This problem is unique to model driven techniques, because in the non-model driven world there is no single unit which goes across all phases of the life cycle. If you are using iterative methods, this problem becomes even trickier to handle.

We found that models have a greater need for completeness than source code, and that cyclic dependencies cause problems. That is, the equivalent of a 'forward declaration' is very difficult to define in the model world, unless you are willing to break the meta model. For example, a class cannot have attributes without their data types being defined, and a data type may itself be a class that depends on the first class being ready. I am sure similar situations arise in business process modeling too. This had a great implication for continuous integration, because these dependencies across units would lock everything into a synchronous step. That is good from a quality perspective, but it is not very pragmatic. We had to devise something similar to a 'forward declaration' for models. I think I can generalise this and say that it will apply to any model driven development which follows continuous integration.
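
Here is a minimal sketch (names hypothetical) of one way such a 'forward declaration' for models can work: an attribute holds only a symbolic reference to its type, and resolution is deferred until the referenced element actually arrives in the repository:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    // A symbolic reference to a type: the model-world 'forward declaration'.
    class TypeRef {
        final String typeName;
        TypeRef(String typeName) { this.typeName = typeName; }
    }

    class ModelClass {
        final String name;
        final Map<String, TypeRef> attributes = new HashMap<>();
        ModelClass(String name) { this.name = name; }
        void addAttribute(String attrName, String typeName) {
            // Legal even if the type's model element has not been created yet.
            attributes.put(attrName, new TypeRef(typeName));
        }
    }

    class ModelRepository {
        private final Map<String, ModelClass> classes = new HashMap<>();
        void register(ModelClass c) { classes.put(c.name, c); }
        // Late resolution lets mutually dependent units progress independently
        // instead of locking each other into a synchronous step.
        Optional<ModelClass> resolve(TypeRef ref) {
            return Optional.ofNullable(classes.get(ref.typeName));
        }
    }

    class Demo {
        public static void main(String[] args) {
            ModelRepository repo = new ModelRepository();
            ModelClass customer = new ModelClass("Customer");
            customer.addAttribute("orders", "Order"); // 'Order' does not exist yet
            repo.register(customer);
            ModelClass order = new ModelClass("Order");
            order.addAttribute("owner", "Customer");  // cycle closed, still fine
            repo.register(order);
            System.out.println(
                repo.resolve(customer.attributes.get("orders")).isPresent()); // true
        }
    }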

We had our own configuration management repository for models, but one could use a standard source control repository, provided the tool vendor allows you to store modeling artifacts in a plain text format. (Some source control tools tolerate binary files as well, but then you cannot do a 'diff' or a 'merge'.) Devising the proper granularity is tricky, and the point above should be kept in mind. Some tools interoperate well with each other and provide a nice experience (e.g. the Rational family of tools); then your configuration management tools can help you do a meaningful 'diff' and 'merge' on models too.
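
For illustration, a hypothetical plain-text serialization of a model fragment might look like the following. With one model element per line and a stable ordering, line-oriented 'diff' and 'merge' become meaningful:

    component CustomerManagement  version 1.3
      class Customer
        attribute id     : Identifier
        attribute name   : Text
        reference orders : Order [0..*]
      class Order
        attribute total  : Money
        reference owner  : Customer [1]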

Lesson 2: An appropriate configuration control tool is needed even in model driven development.

The need for regression testing was higher because of the point above. Every change would ripple to every other part connected with it, marking it as changed. Traditional methods would then blindly mark all those artifacts for regression testing. Again, good from a quality perspective, but not very pragmatic. We had to make some changes to our change management and testing strategy to make it optimal.
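
A minimal sketch (hypothetical model) of the kind of pruning we had to add: instead of blindly marking every transitively connected artifact for regression testing, the ripple is allowed to continue past an artifact only while externally visible contracts keep changing:

    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.Deque;
    import java.util.LinkedHashSet;
    import java.util.List;
    import java.util.Set;

    class Artifact {
        final String name;
        final List<Artifact> dependents = new ArrayList<>();
        boolean contractChanged; // did the externally visible contract change?
        Artifact(String name) { this.name = name; }
    }

    class ImpactAnalysis {
        static Set<Artifact> markForRegression(Artifact changed) {
            Set<Artifact> marked = new LinkedHashSet<>();
            marked.add(changed); // the changed artifact is always retested
            if (!changed.contractChanged) {
                return marked;   // purely internal change: the ripple stops here
            }
            Deque<Artifact> queue = new ArrayDeque<>(changed.dependents);
            while (!queue.isEmpty()) {
                Artifact a = queue.poll();
                // Retest every direct consumer of a changed contract, but walk
                // further only if that consumer's own contract changed too.
                if (marked.add(a) && a.contractChanged) {
                    queue.addAll(a.dependents);
                }
            }
            return marked;
        }
    }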

Lesson 3: Units need to be defined carefully to handle the trade-off between parallelism and testing effort during the build phase.

In short, model driven methods tend to replicate the software development methodology that is used without models. Models provide a way to focus on key abstractions and not get distracted by all the 'noise' (for want of a better word) that goes with working software. That 'noise' can itself be modeled and injected into your models as cross cutting concerns. In fact, based on my experience with this heavy-weight model driven approach, I came up with a lighter approach called 'Code is the model', which can even be generalised to 'Text specification is the model', removing the code vs. model dichotomy as far as software development methodology goes.

Nowadays some modeling tools have their own runtime platforms, so models execute directly on the platform. This avoids a build step, but defining a usable and practical configurable unit is still a must, and defining a versioning policy for that unit, along with a unit and regression testing strategy, cannot be avoided. When multiple such modeling tools, each with its own runtime platform, are used together, they present their own set of challenges in defining testable and configurable units. But that's a topic for another discussion!