Friday, February 27, 2009
Of TLAs and FLAs
In my previous organisation, every employee had a three-letter acronym (TLA) derived from their first, middle and last names, instead of an employee number. There is an anecdote about it as well. The company's then chief, often called the "Father of the Indian Software Industry", had actually asked for a four-letter acronym (FLA). But the software used to generate those acronyms produced a nasty one for the chief. (If you know his name, you can imagine what that four-letter word would have been.) So he ordered it changed to a three-letter one. (By the way, many happy returns of the day, Mr. Kohli.)
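As an aside, the initials scheme is trivial to sketch. Here is a minimal, hypothetical Python version; the real HR software's rules (for missing middle names, clashing initials, and so on) are unknown to me, so the `make_tla` name and the fallback behaviour are my own assumptions:

```python
def make_tla(first: str, middle: str, last: str) -> str:
    """Build a three-letter acronym from the initials of a name.

    Hypothetical sketch - the real system's rules (e.g. for people
    without a middle name, or for clashing initials) are not known.
    """
    parts = [first, middle, last]
    initials = [p[0].upper() for p in parts if p]
    if len(initials) < 3:
        # Assumption: pad with further letters of the last name
        # when a middle name is missing.
        initials += list(last.upper()[1:])
    return "".join(initials[:3])

print(make_tla("John", "Q", "Public"))   # -> JQP
print(make_tla("Ada", "", "Lovelace"))   # -> ALO
```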
Within IT, the trend appears to be going in reverse. IT has always had plenty of TLAs: ERP, SCM, CRM, EAI, BPM and SOA, to name a few you may have come across. Of late, however, the trend is moving towards FLAs, what with SaaS, PaaS and so on. Yet the most enduring acronym, the one that has survived the test of time, is neither a three-letter one nor a four-letter one. It is a five-letter one: RDBMS.
The reason it has survived this long is that it is more than an acronym. It is a well-thought-out piece of technology backed by solid science, not a label coined by sales and marketing people or an industry analyst. The technology has proven easy to standardise, extensible, and capable of serving a vast array of requirements, some of which were not even envisaged when it was initially developed.
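That standardisation is easy to demonstrate: the same declarative SQL runs essentially unchanged across vendors. A quick sketch using Python's built-in sqlite3 module (the table, columns and data are mine, purely illustrative):

```python
import sqlite3

# In-memory database; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (tla TEXT PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO employee (tla, name) VALUES (?, ?)",
    [("JQP", "John Q. Public"), ("ALO", "Ada Lovelace")],
)

# A declarative query - this same SQL would work, near verbatim,
# on essentially any relational database.
rows = conn.execute("SELECT tla FROM employee ORDER BY tla").fetchall()
print([r[0] for r in rows])  # -> ['ALO', 'JQP']
```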
Sadly, the same cannot be said of all the technologies you see around these days. Many are rehashes of old ideas in new packaging. They rely on finding the right audience at the right time to proliferate, and thrive on the inherent problems they carry to generate more revenue for their owners.
Enterprise architects need the vision to see through the marketing fluff and get to the bare bones of the technologies they are going to employ. Analysts can help to an extent, but their generic analysis may not be completely applicable to your situation. You need to equip your 'Oracle' function with this capability.
Friday, February 13, 2009
IBM in the cloud
I read Nick Carr's comments on IBM putting its infrastructure software on Amazon EC2. He compares it with IBM's disastrous decision to leave the IP rights to MS-DOS with, well, Microsoft. Is this recent decision really comparable?
In the case of the PC, in hindsight it appears that IBM had erroneously assumed that its microprocessor-based PC could not be commoditised, whereas Microsoft correctly judged that anyone could assemble a PC using off-the-shelf microprocessors from Intel. So Microsoft retained the IP on the software, which it could use to monopolise the commodity PC market.
So the question in this scenario is: what is likely to be commoditised? Are EC2 services so unique that no one else could replicate them? What is the barrier to entry?
Certainly not technology. If my memory serves me right, IBM itself is the big daddy of virtualisation and what it calls utility computing. It developed the first hypervisor, CP (of CP/CMS, later VM), which ran on its S/xxx mainframes. More recently, I remember using an IBM-provided Linux image on a mainframe in 2001 for a personal project (I was trying to do some code visualisation into a model using open-source tools on that image). IBM had provisioned the image within 24 hours.
So what is it? My guess is that IBM is using EC2 to hook users onto its IP, and will then move them to a different cloud (possibly its own). I think IBM has still not worked out the business model for its own cloud offering for SMBs. Not a bad thing. It will surely standardise the cloud computing space and make enterprise-scale software stacks available even to SMBs. It might hasten the uptake of private cloud offerings too. So everyone will benefit. It will be interesting to see how this plays out over the next few years. I will definitely be watching the progress.
Thursday, February 12, 2009
Darwin and enterprise architecture
Today is the 200th birth anniversary of one of the great thinkers of modern times. Darwin presented us with the theory of evolution, summed up as 'natural selection' or 'survival of the fittest'. We also know that applying Darwinism in all spheres (e.g. the social sphere, a la Nietzsche/Hitler) is not a good idea. Yet that is what tends to happen in enterprise IT. The evolution of enterprise IT tends to follow the evolution of the enterprise itself. As enterprises evolve, different parts of the value chain become important and get more attention (and consequently more resource allocation). This weird sort of natural selection leads to a multiplicity of systems and infrastructure, which tend to overlap or, in rare cases, leave gaps.
Over a period of time the architectural environment, be it business, application or technology architecture, gets contaminated. This leads to a bloated cost base and starts affecting time to market for business change. Approaches such as portfolio rationalisation can restore order temporarily. But to retain some semblance of order at all times, a proper enterprise architecture function must oversee enterprise IT.
What I have seen happen in practice, however, is that the enterprise architecture function is at best tolerated in good times, and given total short shrift in bad times (such as the current ones). Focus moves to delivering change projects faster and cheaper. The old wisdom is easily forgotten: in being penny-wise about short-term project costs and timelines, we ignore the pound-foolishness of a bloated cost base and compromised time to market for future changes. That said, I also agree that business change cannot wait for the right architecture to be put in place first. Businesses normally have a window of opportunity to cash in, and cash in they will - with, without or despite IT.
That is where the concept of an Enterprise IT Oracle comes in. The Enterprise IT Oracle lets IT predict the future as envisaged by the business, and put the right architecture (with appropriate governance) in place even before the need arises. It puts IT in an offside position (to borrow a football term) so that when the business wants to pass the ball, IT is ready to receive it and shoot it into the goal. It is my firm belief that without such an 'Oracle' function, an enterprise architecture function working in reactive mode will never be able to rise to the occasion.
Wednesday, February 04, 2009
SOA - Dead or alive
I doubted the SOA hype years ago in this post about SOA hype. Recently a small storm was raised when Anne Thomas Manes of the Burton Group blogged about the untimely demise of SOA. But curiously, I am not so pessimistic anymore. My feeling is that SOA as an architectural style is now better understood for what it is, rather than as a magic technology silver bullet intended to solve all enterprise IT problems. This short post from Ali is quite insightful in that context. What it suggests is that SOA has moved on to the next step on its evolutionary path. Which is a good sign.
I always believed the true value proposition of SOA was not sharing (a.k.a. reuse) of services, and definitely not cost reduction. Sharing matters to a provider with mass consumers who add further value to the digitised services provided. In an enterprise scenario this may not be of much importance, unless you are in the business of providing digitised services. But sharing was a necessary step before SOA could move on to the next stage of its evolution, where it makes more sense for enterprises that are not in the business of providing digitised services to the masses for further value addition.
This is because sharing of services requires service contract standards (WSDL) and service discovery standards (UDDI), albeit in a weak form. It also needed a mechanism to determine what is to be shared and at what granularity. These standards and technologies are now in place and mature. With them, implementing a shared service is quite straightforward. This also set the stage for the next step of the evolutionary journey.
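To make the contract-plus-discovery idea concrete, here is a toy, in-process sketch in Python. It is not WSDL or UDDI (those are XML wire standards); it only mimics their division of labour: a published contract, a registry to discover it through, and a consumer binding to it at lookup time. Every name below is invented for illustration:

```python
from typing import Callable, Dict

# Toy "registry": maps a service name (the discovery key, as UDDI
# would hold) to a callable honouring an agreed contract (as WSDL
# would describe). Entirely illustrative - no relation to real
# UDDI or WSDL APIs.
registry: Dict[str, Callable[[str], float]] = {}

def publish(name: str, service: Callable[[str], float]) -> None:
    """Provider side: advertise a service under its contract name."""
    registry[name] = service

def discover(name: str) -> Callable[[str], float]:
    """Consumer side: bind to a service by name at lookup time."""
    return registry[name]

# A provider publishes a shared pricing service.
publish("quote.price", lambda sku: 9.99 if sku == "WIDGET" else 0.0)

# A consumer discovers the service and invokes the contract.
price_of = discover("quote.price")
print(price_of("WIDGET"))  # -> 9.99
```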
The next step of the evolution, viz. flexible services where flexibility is in the hands of providers, is progressing even as I blog. The charge is led, incidentally, by progress made in the BPM world. The more we learn about processes, process composition and orchestration, especially in human-task-centric processes, the more progress we will make on the flexibility aspect. This will throw up its own set of standards and technologies, e.g. BPMN, service fabrics based on industry models, etc. Process standards and technologies create the need for limited sharing of services, which can now be provisioned using the standards and tools already in place. At this stage the benefits of SOA start becoming visible to enterprises, showing up as agile and flexible processes: processes that can be measured, monitored and flexed easily. When this becomes mainstream, the stage will be set for the ultimate goal of SOA, viz. flexibility in the hands of the consumer (a.k.a. virtualisation).
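The orchestration idea, composing shared services into a process that can be measured and monitored, can likewise be sketched in a few lines. This is a toy sequential composition with crude step timing, not BPMN or any real BPM engine; all the service and function names below are invented:

```python
import time
from typing import Callable, Dict, List

Service = Callable[[Dict], Dict]

def orchestrate(steps: List[Service], payload: Dict) -> Dict:
    """Run services in sequence, timing each step (toy monitoring)."""
    for step in steps:
        start = time.perf_counter()
        payload = step(payload)
        payload.setdefault("timings", []).append(
            (step.__name__, time.perf_counter() - start)
        )
    return payload

# Invented services making up a simple order process.
def validate(order: Dict) -> Dict:
    order["valid"] = order.get("qty", 0) > 0
    return order

def price(order: Dict) -> Dict:
    order["total"] = order["qty"] * 9.99
    return order

result = orchestrate([validate, price], {"qty": 3})
print(result["valid"], round(result["total"], 2))  # -> True 29.97
```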
I believe that for virtualisation to become mainstream we will need the ability to procure, provision and manage services in a location-independent manner (a.k.a. in the cloud). We will need some form of on-the-fly semantic compatibility guarantees (of the kind expected of the semantic web). At this stage of the evolution we may also see an ITIL-like methodology emerging for business service management. I also believe it is not necessary for all services to be virtualised. Which services need to be virtualised, and in what context, will remain a very specific decision for each organisation. (Otherwise you may end up finding your business disintermediated by the very services you provide.)
So don't let all the doom-and-gloom news get on your nerves. It may just be the current economic climate reflected in people's feelings. It may no longer be possible simply to declare an IT project 'SOA' and get funding from the business. But from an enterprise architecture viewpoint, SOA looks well set to be the de facto architectural style of IT in future.