
10 Risks of Cloud Computing

@thescottydont asks on Facebook:

"True. However, seems like this video is missing some down sides. I ask you, Guru Mickler - are there down sides to Cloud Computing?"

Well, what a great question! And I think that it's only fair to honestly talk about the downsides to Cloud Computing as a small business strategy.

1. Application Maturity.

Today, many of the applications that you'd find hosted online as a cloud option use browser-based interfaces. Because of that, they're relatively simple compared to applications installed on your local hard drive or server. In many cases, they're "no frills" or offer reduced capabilities or functions. This is changing very rapidly, though, as there's a rush to create dynamic and powerful applications that can be distributed within the browser. The maturity and growth of Google's Sites and Docs, or Zoho, over just the last two years is pretty astounding. Cloud applications are quickly becoming more feature-rich, centralized (thereby reducing total cost of ownership), and available on any platform that runs a browser.

2. Application Ease of Use.

There are constraints on web programming that prevent a "fluidity" of movement within the user interface. It's nothing for many users to have multiple applications up at once, copy and embed objects with ease, or use many different interfaces and toolbars to manage the software. The web model forces programmers to be a little more terse: saving records at certain points, or stepping a user through three data-entry screens for what used to take just one. Users "feel" this when they're using web-based applications - there are multiple steps they're forced to walk through to get something done. This can be problematic and time-consuming, and a detractor against adoption.

3. Internet Availability.

If the Internet is down, then your application and its data are unavailable. To some firms, this may be more than an annoyance - it may shut down critical functions. Sometimes when I travel, I go to places with - gasp - no Internet: I'm effectively shut down! Some firms' connectivity may be spotty or unreliable in the first place, which would make them reconsider the cloud strategy. Indeed, the cloud strategy does depend on an "always on" Internet, and in many areas of the country, that reliability simply isn't available. The consideration here is how often your Internet connection goes down compared to a local server - which fails far less frequently - and what the potential likelihood and cost of no Internet access looks like.
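
As a rough way to frame that comparison, here's a minimal back-of-the-envelope sketch in Python; the availability percentages are illustrative placeholders, not measured figures:

```python
# Rough downtime comparison: hours of outage per year at a given availability.
# The availability percentages below are illustrative placeholders, not vendor figures.

HOURS_PER_YEAR = 24 * 365

def downtime_hours(availability_percent: float) -> float:
    """Expected hours of downtime per year for a given availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_percent / 100)

for label, availability in [("Business DSL/cable link", 99.0),
                            ("Hosted cloud application", 99.9)]:
    print(f"{label}: ~{downtime_hours(availability):.0f} hours of downtime per year")
```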

4. Latency.

Broadband pricing is dropping dramatically, but the risk is still there: yes, you could have a slow Internet connection. We're measuring cable bandwidth for consumers and businesses now at between 50-100 Mbps, and large companies - of course - can afford fiber and optical carrier solutions that push bandwidth into the gigabit range. Others, though, may still rely on T1s or xDSL. These slower connections diminish the reliability and usefulness of the cloud, especially when a "mature" cloud application expects to use a lot of bandwidth to provide more ease of use.
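
To put those connection types in perspective, here's a small sketch of how long a single file transfer takes at different speeds; the 100 MB file size is just an example:

```python
# Rough transfer-time comparison across connection speeds.
# Speeds are in megabits per second (Mbps); the 100 MB file size is an example.

FILE_SIZE_MB = 100  # megabytes

connections = {
    "T1 (1.5 Mbps)": 1.5,
    "xDSL (6 Mbps)": 6.0,
    "Cable (50 Mbps)": 50.0,
}

for name, mbps in connections.items():
    seconds = (FILE_SIZE_MB * 8) / mbps  # 8 bits per byte
    print(f"{name}: ~{seconds:.0f} seconds to move a {FILE_SIZE_MB} MB file")
```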

5. Data Location.

Many people get jittery if their data isn't right next to them. Like, in the other room, or, near their feet on a microcomputer's hard disk. The presence and location of data is important to some; it shouldn't be ignored or discounted. Companies that want to have exclusive control over their data and intellectual property would choose in-house servers and likely avoid the cloud.


6. Data Integration.

Data is somewhat stuck in the cloud. Now, there are programming interfaces that developers can use to integrate one source of data with another - those standards are well-defined these days and power a lot of Internet integration - but the lay user just doesn't understand how to use these APIs (Application Programming Interfaces) or the consequences of using them. If your Twitter feed is your property, how can you capture your "property" and push the feed into another "property" online? How can you integrate that? Well, this isn't easy to do right now. Facebook has an interesting model of applications that end-users can install to create integrated pockets of data... that could be a model we'll see adopted by the cloud.
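
For the curious, the sketch below shows roughly what that kind of integration looks like under the hood: pull entries from one service's API and repost them to another. The URLs, field names, and lack of authentication are hypothetical simplifications, not real endpoints:

```python
# Minimal sketch of API-to-API integration: pull items from one service, push to another.
# Both URLs and the JSON field names are hypothetical placeholders for illustration.

import json
import urllib.request

SOURCE_FEED_URL = "https://source.example.com/api/my-feed.json"   # hypothetical
DESTINATION_URL = "https://destination.example.com/api/posts"      # hypothetical

# 1. Pull: fetch the feed as JSON from the source service.
with urllib.request.urlopen(SOURCE_FEED_URL) as response:
    feed_items = json.load(response)

# 2. Push: repost each item to the destination service.
for item in feed_items:
    payload = json.dumps({"title": item["title"], "body": item["text"]}).encode("utf-8")
    request = urllib.request.Request(
        DESTINATION_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # real services would also require authentication
```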

7. Security Risk.

Some would suggest that your data is at risk if it's not in your general proximity. If you can't touch it, it's insecure. I would dispute that. Having data closer to us on microcomputers hasn't "increased" security or reduced the number of identity thefts, compromised PCs, or botnets. It is precisely because microcomputers aren't professionally managed that they're more at risk than cloud-based systems, which are professionally managed by an army of engineers. We live with data hosted by others all the time (the IRS, banking institutions, corporations, universities...) and we come to trust their expertise in securing it. That trust will come hard for some; others may flat-out reject the idea that somebody can manage security better than they can, and at a lower cost.

8. Privacy Risk.

Yes, Google can see your data when it's on their cloud. Now, they "tokenize" and "stem" the data, meaning that they convert your data into abstract ideas and symbols, then aggregate it, so they're not really "reading" your files or email. They're just counting up the symbols they see, or abstractly relating one idea to another. Of course, a lot of people don't understand that, but services like Google aren't prying into your data; they are, though, abstracting it. Some folks may not even like that. Now, online privacy is a sensitive subject that consumers have become very aware of, and cloud-based services are working overtime to create useful models for securing privacy. This is still in development, and will likely be so for many years, especially as federal law catches up to a universal definition of "what is private".
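
To make "tokenize" and "stem" concrete, here's a toy sketch of the idea - split text into tokens, crudely strip suffixes, and keep only aggregate counts rather than the original sentences. This illustrates the general concept, not how Google actually implements it:

```python
# Toy illustration of tokenizing, stemming, and aggregating text into counts.
# Not Google's implementation - just the general concept of abstracting content.

from collections import Counter
import re

def tokenize(text: str) -> list[str]:
    """Split text into lowercase word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def naive_stem(token: str) -> str:
    """Crudely strip common suffixes so related words map to one symbol."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

message = "Shipping the shipped packages; the package ships today."
counts = Counter(naive_stem(t) for t in tokenize(message))
print(counts)  # aggregate symbol counts only - the original sentence isn't retained
```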

9. Interoperability Risk.

Because data is stored away from your own PC and servers, integrating that data - either within the cloud, with other clouds, or with microcomputer applications - becomes a big challenge. Mail merge, for example: easily done in Office. You point Word to a data source, run a wizard, and you have envelopes and labels. Not so fast or easy in the cloud. Nor is simply copying objects like graphs from one application to another. Again, the solutions are getting better, and Zoho - for example - provides a lot of interactivity between its modules, but you kind of have to be a programmer to understand how to use them! So, that can be a detractor.
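
For a sense of what "be a programmer" means here, this is a minimal do-it-yourself mail merge - a contact list and a letter template combined in a few lines of Python. The names and letter text are made up:

```python
# Minimal do-it-yourself mail merge: combine a contact list with a letter template.
# The contacts and template text below are made-up examples.

from string import Template

contacts = [
    {"name": "Pat Jones", "company": "Acme Hardware"},
    {"name": "Sam Lee", "company": "Riverside Books"},
]

letter = Template("Dear $name,\n\nThanks for your business with $company.\n")

for contact in contacts:
    print(letter.substitute(contact))
    print("-" * 40)
```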

10. Learning Curve.

Users - conceptually - understand their local computer, their desktop, and their hard disk. They understand a jump drive. They understand Word and how its tools work. However, if you throw a general-use person into a new, browser-based interface that is comparatively slower than a local application, where data is stored somewhere they don't comprehend (not like on a C: drive) and they don't know how to find it again, well, they get jumpy. That makes for a steep learning curve. Again, cloud applications are getting more sophisticated all the time, but the learning curve shouldn't be ignored: none of this happens seamlessly out of the box.

Conclusion

When considering the cloud, a small business needs to adequately weigh the pros and cons. It needs to see where it could gain cloud advantages while keeping the cloud as transparent as possible to the end user (like Google Apps integration with Microsoft Outlook). It needs to see where more advanced users on staff could be the pioneers, train others how to use this stuff, and get people excited about using it. The small business should recognize the economies of scale in cloud computing and the rewards that come from immediate capability, lower expenses, and shifted management obligations, while appropriately weighing security and privacy concerns. Really, it's all about strategy: some pieces of the cloud may work where others might not, and the landscape is constantly evolving, so the small business should be ready for it, planning on change and planning to take advantage of new features as they become available.

I hope that answered your question.

R


5 Rules of Thumb for Windows 7

Microsoft will release its latest version of Windows this week. If you're a small to mid-range business, here are five rules of thumb if you're considering an upgrade.

1. Don't.

Well, don't "upgrade". If you're technically capable, install the operating system clean instead of upgrading and restore your data from a backup. This may help avoid a lengthy upgrade process; some instances of upgrades from Vista to Windows 7 are reported to take upwards of six hours to complete. You can install the o/s cleanly and avoid many migration hassles. In fact, you may wish to just purchase a machine with Windows 7 already loaded on it and forget about the upgrade - especially if you're running Windows XP. There's no upgrade path from XP so you're going to have to install clean anyway.

2. Backups.

"Don't get cocky, kid." It was good advice from Han, and you can use it, too. You need to realize that an operating system upgrade isn't the same risk as installing a hotfix or a service pack. This is a major upgrade that may prevent your system from restarting correctly, and may even contribute to data loss. Don't be a fool: back up your critical files prior to the upgrade. In fact, now would be a great time to review why anything "critical" is on a PC anyway and try to relocate that data to your network. Start to consider your PC as a disposable asset whereas the real treasure of information is available on network resources. Even better, start thinking about what you can do to get rid of that server... but that's a topic for another day.

3. Compatibility.

Windows 7 is perfectly compatible with your existing Microsoft networking solutions. What you may run into are problems with legacy software applications and hardware drivers. You may want to make a list of critical applications that you just can't live without and check with each vendor on whether they've been tested on, and can run on, Windows 7. Believe you me: you don't want to be in a situation where you've gone through the trouble of upgrading and suddenly can't run your mission-critical apps. As for hardware requirements, the kernel is the same as Vista's, so if your machine ran Vista fine, you can expect the same performance and response from Windows 7 - let's dispel any idea that you'll get a performance boost out of this.

4. What are your Reasons?

I think you need to be very clear - with yourself and your stakeholders - about why an upgrade to Windows 7 is necessary. What's the strategic, compelling reason for an upgrade "right now"? Is it timing? Is it compatibility with upcoming vendor solutions? Is now a better time than, say, 2nd QTR 2010? What motivates you to introduce the new window dressing now? Certainly, for the small to mid-range business, it's hard to see a strategic reason to upgrade to the next release of Windows very rapidly. Windows 7 will not change the way that you get your work done. It won't change the way you see the Internet. It won't change the way you collaborate and use data. It will change the way you view files on your hard disk, how your applications perform, and how you access the settings on your computer. Think very carefully about your reasons for introducing Windows 7 to you and your team right now.

5. Consider a Test.

When you buy a car, you want to test it. You want to drive it around the block a few times and make sure that it meets your needs. The same is true with 7. Don't take the hype for granted: install the upgrade on a separate, non-production workstation and see how you like the look and feel. Give it a whirl. Get some feedback from others in your workplace. They can sit behind the wheel and give it a shot, too. You can decide then if you really want to deal with the learning curve now, or, put it off until a later date.

Just a few practical pieces of advice before you take the plunge. Listen, be practical: think about a strategy. Who will you upgrade? When? Why? What's the purpose? When could you upgrade all of your machines, not just one, so as to reduce the complexity of your technology environment? When is the best time to facilitate training with your staff? Which members of your staff would be good candidates for beta testing the product on a spare machine? What's your company going to do if the PCs don't start back up the way they should? How about a means to recover lost data? Don't look at this as something your company is doing for kicks and giggles: take it seriously, and have a plan.

R


2010: Server Upgrade Choices for the SMB

Many small to mid-range companies upgraded their server assets with the release of Windows Server 2003. The 2003 server was very popular and contributed to Microsoft retaining an 88% server market share, fending off Mac and Linux in this space for most of the decade. However, as it's now five years old, the 2003 R1 Server product is entering its extended support phase, meaning direct support costs from Microsoft will go up and Microsoft will begin releasing only limited updates for licensing, compatibility, and new features. Hardware warranties on this equipment have long expired, and many of the hard disks/arrays in these units will be reaching their MTBF (Mean Time Between Failures) ratings this year; their likelihood of crashing is much higher. So, small to mid-range businesses should soon be thinking about their strategy for retiring the legacy asset.
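
As a rough illustration of why those MTBF ratings matter, here's a small sketch using the common exponential-failure approximation; the MTBF value and drive count are made-up examples, and this simple model actually understates the risk for drives entering their wear-out years:

```python
# Rough illustration: probability a drive fails within a period, given its MTBF rating,
# using the common exponential-failure approximation. Figures below are example values.

import math

MTBF_HOURS = 500_000      # example manufacturer MTBF rating for one drive
HOURS_PER_YEAR = 24 * 365
DRIVES_IN_ARRAY = 4       # example small RAID array

def p_failure(hours: float, mtbf: float) -> float:
    """Probability of at least one failure for a single drive within 'hours'."""
    return 1 - math.exp(-hours / mtbf)

single = p_failure(HOURS_PER_YEAR, MTBF_HOURS)
any_of_array = 1 - (1 - single) ** DRIVES_IN_ARRAY

print(f"One drive, one year:      {single:.1%}")
print(f"Any of {DRIVES_IN_ARRAY} drives, one year: {any_of_array:.1%}")
```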

Surely, the state of the economy will likely push these decisions out for another 12-18 months as business owners stay put with what they've got and contend with larger problems. Analysts are seeing the results of fear and uncertainty in Microsoft's last earnings report from June 2009: sales are down 17 percent from the previous quarter and profits are down 30 percent for the year. Okay, so you're a small business: what are your options?

It used to be that I had only three answers to that question: one, do nothing - ride out the extended support period for the next five years and assume increasing risk of hardware failure; two, upgrade the existing platform and the operating system to the current o/s release; three, replace the server entirely. This year, I'm really happy to say that the small business has many more choices.

Windows Server Upgrade and Asset Replacement.

The default position for most will be upgrading or replacing their Windows Server 2003 with Windows Server 2008 - whether they do it now or wait for Windows Server 2008 R2, scheduled to be released Q4 2010. Companies that would do this are tied to the Microsoft Windows product: the software solutions they own are locally-installed apps that require a centralized Windows server on their LAN to function, and these companies are forced to upgrade to maintain support on those legacy applications. Or, maybe they'll stay on the Windows platform because they know Microsoft and trust it. Either way, it's the default choice.

Doing Nothing and Staying Put.

An alternative is to do nothing. I think a lot of SMB (small to mid-range business) consumers are going to stay with Windows Server 2003 for as long as possible; similar to what we're likely to see with Windows XP, Windows Server 2003 will be one of those microcomputer software products with an extreme, unheard-of longevity, encouraged by apathy toward Microsoft's licensing costs and complexity. Business owners will find no compelling functional reason to upgrade, may be risk-averse with the economy the way it is, may distrust other choices presented to them, or simply have shallow pockets.


Open Source.

The risk-tolerant SMB may still see the need for an in-house server because of privacy concerns. They don't want anybody else holding their files, mail, or mission-critical data; they want to hold this stuff close to their chest and mistrust cloud or hosting options. On the other hand, this consumer isn't married to Microsoft: they just need a network appliance - something that can perform backups and disaster recovery, centralize security, manage database, file, and print services, and route email. Linux - particularly the Ubuntu distribution - is a good choice here, either as an alternative to Windows on existing hardware or as the o/s for a new machine. Up-front licensing costs would be much lower, and TCO (Total Cost of Ownership) would be comparable to a Windows Server. If the SMB is looking for a reliable appliance - just something that runs in the background and provides basic network services - and doesn't have a dependency upon Windows in their application portfolio, Open Source is a credible option in 2010.

Private Hosting.

Some companies just can't escape the need for a server - a Windows server or otherwise. They have mission-critical apps that are specific to their industry, and they need that functionality. On the other hand, they don't want to be saddled with the cost and expense of owning a server; they want to get out of "ownership" and into "rental" or "leasing". They want to "rent" their capabilities, not "own" them, and cap their costs at a fixed-term, fixed-rate subscription expense that scales with their needs. Virtualization and terminal services make this option pretty attractive in terms of cost: some ROI projections see this as a 15-20% savings, plus the added benefit of having your data and applications accessible anywhere you are and transferring the risk to a 2nd party.

Abandoning the Server - Cloud Computing.

Trendier still is the fashionable idea of ditching the server entirely. Cloud computing is a more risk-tolerant model where we'd transfer your data and services to a 2nd party provider. Google, for example, could host and manage your email, your files, information security, and disaster recovery. The investment would be made in migrating the data away from your servers to the 2nd party, then in training and configuration expenses to get applications and devices to use the 2nd party. The ROI is fairly material: 30-50% savings as compared to owning your own server. Data is available anywhere, there are no licensing or up-front capital expenses (usage is billed by subscription, per user), and the risk for managing your applications is transferred to the 2nd party provider. Again, instead of "owning" capability, the small business is "renting" capability, allowing for zero time to maturity and low barriers to entry - out of the gate, the small business can have the same technical capabilities as more mature competitors who paid a premium over the years developing their IT infrastructure.
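
To make the "own versus rent" math concrete, here's a minimal three-year cost sketch comparing a server purchase against a per-user subscription; every dollar figure is a placeholder you'd replace with your own quotes:

```python
# Rough three-year cost comparison: owning a server vs. renting a per-user cloud service.
# Every dollar figure below is a placeholder - substitute your own quotes.

YEARS = 3
USERS = 10

# "Own" model: up-front capital plus ongoing licensing, maintenance, and support.
server_hardware = 4000
server_licenses = 3000
annual_maintenance_and_support = 2500
own_total = server_hardware + server_licenses + annual_maintenance_and_support * YEARS

# "Rent" model: subscription billed per user, per year; no up-front capital expense.
subscription_per_user_per_year = 200
migration_and_training = 2000
rent_total = subscription_per_user_per_year * USERS * YEARS + migration_and_training

print(f"Own a server, {YEARS} years:   ${own_total:,}")
print(f"Rent the cloud, {YEARS} years: ${rent_total:,}")
print(f"Difference: ${own_total - rent_total:,} ({(own_total - rent_total) / own_total:.0%} lower)")
```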

So the small biz has a lot to consider over the next twelve months - especially if you tack on the problems associated with workstation upgrades and Windows 7. Here's a real chance, I think, for competitive advantage: businesses can stay the course, own their assets, and pay a premium for services that their competitors can acquire at costs up to 50-percent lower; or they can strategically adopt open, hosted, or cloud solutions that take advantage of mobile computing, low licensing and maintenance expenses, and risk transfer; or they can do nothing - stay put and hope for the best. I think, in today's economy, staying put is exactly the opposite of what somebody would want to do; strategically applying technology to dramatically reduce costs and liability... in some shape or fashion... would be the better option.

R
