Readers of my previous post on cloud computing might also be interested in how Vivek Kundra began thinking about the advantages of cloud computing as Chief Technology Officer for Washington, D.C. and implemented Google Apps for employees. Rather than imposing Google Apps (including email) as an outright replacement for Exchange, he began a process of voluntary transfer http://bit.ly/ct0Crr. Back in 2008 Kundra stated: 'Around 5,000 people are actively using it as we speak. We have another 3,000 people going through migration.' This is from 38,000 licenses purchased. An interesting approach to the take-on of new technology.
Vivek Kundra is now the Federal Chief Information Officer of the United States, appointed by President Obama in March 2009.
Wednesday, 24 February 2010
Tuesday, 23 February 2010
A Strategy for Cloud Computing
So you've heard all the hype about Cloud computing and are now thinking about how you might enter the fray, test it out, give it a try? You even have the endorsement of the Cabinet Office and the Government ICT Strategy, not to mention secure networks (PSN) and the G Cloud under development. So which services would it be best to consider?
My colleague Martin Howitt (Enterprise Architect from Devon CC) has very rationally suggested that integration and SOA are the holy grail of cloud computing and that is where we should invest. It's hard to disagree with Martin when looking at it from a purely architectural viewpoint (http://forestandtrees.wordpress.com/author/martinhowitt/).
However, in my experience, which is admittedly public sector biased, this type of decision is more political than technical. In Local Government individual Directors, as heads of profession, hold enormous power and virtually run their Directorates as private companies, with the Local Authority acting as the conglomerate HQ. It's hard to convince all-powerful Directors that they should act for the good of the Local Authority and take all the risk on behalf of the others. The decision really requires a little more subtlety from CIOs keen to move into Cloud computing.
The obvious first step is to use the cloud as a stop gap – use it for development and testing services initially to prove capability, rather than going straight for full application hosting. This takes advantage of Cloud service providers' ability to build platforms quickly, which can speed development and reduce pressure on critical staff. There is no need to procure and pay for server hardware, as you only pay for actual usage. The risks at the development stage are also far lower than losing a production service (and its data). The cloud offers big opportunities for short-term hosting, such as testing upgrades, which only last a short period. Best practice would substantially reduce the number of environments to maintain permanently whilst still retaining the flexibility to restore known end states. That should increase the Cloud's credibility whilst saving funding on new projects.
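To make that cycle concrete, here is a minimal sketch of 'spin up, test, snapshot, tear down'. It assumes AWS EC2 via the boto3 Python SDK purely as a stand-in for whichever provider (or eventual G Cloud service) you choose, and the baseline image ID is a hypothetical placeholder for a known build.

```python
# Minimal sketch: on-demand test environments that only cost money while they run.
# boto3/EC2 is used here as an illustrative stand-in, not a recommendation.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")

BASELINE_IMAGE = "ami-xxxxxxxx"  # hypothetical machine image of a known end state


def start_test_environment(purpose: str) -> str:
    """Spin up a short-lived test server from the baseline image."""
    resp = ec2.run_instances(
        ImageId=BASELINE_IMAGE,
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Purpose", "Value": purpose}],
        }],
    )
    instance_id = resp["Instances"][0]["InstanceId"]
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
    return instance_id


def capture_end_state(instance_id: str, name: str) -> str:
    """Snapshot the environment so the same known end state can be restored later."""
    return ec2.create_image(InstanceId=instance_id, Name=name)["ImageId"]


def tear_down(instance_id: str) -> None:
    """Terminate the server: the meter stops as soon as the upgrade test is done."""
    ec2.terminate_instances(InstanceIds=[instance_id])


if __name__ == "__main__":
    server = start_test_environment("upgrade-test")
    # ... run the upgrade and acceptance tests against `server` here ...
    capture_end_state(server, "post-upgrade-baseline")
    tear_down(server)
```

The point is not the particular SDK but the shape of the workflow: environments exist only for the duration of the test, and a captured image replaces a rack of permanently maintained servers.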
So you've proved the capability of your cloud services provider on development services and are willing to take the next step of hosting applications? My approach would be to look for a utility application: core to my business, unlikely to make any major technology leaps in the near future, and one that my cloud services operator has heaps of experience of. You could use a portfolio analysis to help you decide (these are the Factory systems in McFarlan's IT Portfolio Analysis), but I think there is one application that stands out. Email – a standard application, configuration and hardware that will save bundles of cost. Not only that, but it does not have a clear service owner who could object.
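For anyone not familiar with McFarlan's grid, here is a rough sketch of how I might bucket typical Local Authority systems. The examples are my own hypothetical illustrations rather than anything from McFarlan or a real portfolio, but they show why email, sitting squarely in the Factory quadrant, stands out.

```python
# A rough sketch of McFarlan's strategic grid with hypothetical example systems.
# The strategy in this post works roughly top to bottom: host Factory systems
# first, experiment with Turnaround systems later.
portfolio = {
    # high dependence today, little future change expected: safest cloud candidates
    "Factory": ["email", "payroll", "shared workspaces"],
    # low dependence, low future impact: move if it is cheap, otherwise leave alone
    "Support": ["room booking", "expenses"],
    # high dependence today and high future impact: keep under close control for now
    "Strategic": ["revenues and benefits", "social care case management"],
    # experimental today, potentially vital tomorrow: innovative/Open Source options
    "Turnaround": ["collaboration with the LSP and TSOs", "social media engagement"],
}

for quadrant, systems in portfolio.items():
    print(f"{quadrant}: {', '.join(systems)}")
```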
I would also look to achieve one other major benefit. Every few years, even if you skip a release, you have to upgrade. This is a major headache: you put in lots of effort for something you only perform once. How much easier to pass this over to an experienced provider that has done it before, has the 'T' shirt and probably all the scars. Let them take the pain and roll it out on a phased basis. Undoubtedly, this would be my first choice for hosting a service via the cloud. Next I'd be looking for other similar applications: standard operations (in every sense) where I could cut costs – ISPs and shared workspaces spring to mind.
Having made the leap into the Cloud, with everything now settled and operating well, I'd be looking to see where I might gain some 'advantage'. Obviously, being Local Authorities, we don't actually compete in a commercial sense, but do not be fooled if you are uninitiated. It's a bit like F1: the first person you want to beat is your team mate. (For central government, think departments – DWP and HMRC, for example.) I'd be looking to really exploit collaboration services. Not just as many have already achieved by implementing SharePoint internally, but working externally with the Local Strategic Partnership, incorporating other public sector bodies and, significantly, third sector organisations (TSOs). Major performance benefits could be achieved by sharing project-based information with partners.
Note that I have now moved away from McFarlan's Factory-based systems. We are in the area of Turnaround systems, which are more future-oriented, innovative and experimental. In a word, 'risky'. For this reason I would be less inclined to go with a standard solution and more inclined to favour an innovative or Open Source option. It remains to be seen whether the service providers of the G Cloud will be prepared to take that risk or stick with the traditional 'safe' options. Certainly within SOCITM there are individuals, such as Glenn Wood (Wolverhampton), who are keen to champion the open alternatives as well as traditional suppliers.
Since moving out of direct Local/Central Government I have been struck by the innovation that has taken place in the Social Media arena and how centrally based systems are being left behind by the pace of innovation in these spaces. Frankly, it reminds me of the late '80s, when the first computer networks were challenging the heartlands of the traditional mainframe suppliers. Perhaps this really deserves its own post, but I am still struck by the number of CIOs that are not engaged (they don't even have Twitter accounts) and the fact that we have systems and procedures in place that prohibit their use at work from council/departmental systems.
So, my strategy for the cloud is to start safe with development services, then move into utility applications and finally into new, innovative services. But what about legacy systems, I hear you say. Well, my view is that unless you can justify moving them onto standardised platforms, it will be difficult to develop the business case. That does not mean you can't move towards common (shared) data centres, but the benefits of the cloud require dumping those traditional architectures. And so those unmentionable mainframes in central government chug on, and on, and on…
Labels:
'G' Cloud,
Cloud,
PSN,
Shared Services,
Social Media
Thursday, 4 February 2010
Safer Driving?
This week I had to drive to Cheltenham for the Design Review of the Devon/Torbay Flexible Working solution, which uses the Becrypt Trusted Client solution. I have been advising them on policies and a design that might allow Local Government to use non-council-controlled PCs and laptops to access GCSX (IL2 only). This was to be a big day for the project, and the team were all pretty nervous, never having faced a full design review before.
I suppose I was more focused on the afternoon than on the route I was driving, but I suddenly realised I was travelling on roads I had last used over 30 years ago, when I was still a student. My family still lived in Birmingham then and I often travelled back from Oxford over the Cotswolds to cut out the endless British Leyland lorries on the old A40. No M40 then, and all single carriageway roads!
What struck me, though, was how different the roads were. It was as if someone had been desperate to use up gallons and gallons of paint. Everywhere there was cross-hatching prohibiting overtaking, warnings to slow down for roundabouts, red colouring for villages and so on. I ought to say it was safer, but frankly it was incredibly tiring. It was a bit like a real-life video game with new threats coming at you all the time - except the threats were 'painted', not real. I began to think: what if a real threat occurred - a pedestrian, a cyclist or a child suddenly running out? Would I actually spot them quickly enough given all the other distractions? A good driver like me (?) really didn't need all this interference.
I'd like to think that the engineering professionals would have endless surveys proving that I was actually more alert, driving at a more sensible pace, not trying to pass the lorry in front as soon as the road clears but I have my doubts.
I laughed when I realised the argument was not unlike those used by the critics of the GCSX Code of Connection: it was so much better before all these restrictions and controls; we've had to spend a fortune upgrading, and for what benefit? You know the rant. I suppose I do have some sympathy, even if I did spend three years driving in new standards of Information Assurance and Data Handling for Local Government. Yes, the CoCo probably can be improved to make it more relevant to IL2 regimes. It does need to take account of mobile and flexible working. Are the threats really that relevant to Local Government? There is a big issue with the third sector that LG needs to connect to, and why is Health different?
It is to be hoped that we can make improvements with the PSN CoCo, and there are key meetings coming up to thrash out the issues. The problem is that you do have to have some standards, and that means giving up some of the freedom and flexibility we used to have. Think back: roads were less crowded then, cars were slower, they didn't have safety equipment like air bags or driver aids such as anti-lock brakes, and they were dreadfully uncomfortable. On reflection, I think I'll stick with the cars and roads of today. And the march of Information Assurance? Well, when you consider all the new threats we face today - viruses, phishing, denial of service, identity theft, e-crime - the world has changed.
I wouldn't drive today like I did back then - overtaking lorries in an under-powered 1200cc Ford Cortina with no safety crumple zones to save me if I got it wrong! And so we also have to recognise that Information Assurance is now a fact of life. Even if it doesn't seem quite as much fun as it did in those days!
Labels:
Becrypt,
Code of Connection,
design review,
GCSX,
information assurance,
PSN,
remote access,
security