
Will Cloud Computing be the Sub-Prime of IT?


Sub-prime lending enabled large numbers of borrowers, lenders, and investors to participate in markets and transactions they often did not fully understand. In many (perhaps most) cases they did not understand the risks involved, did not understand the contracts they entered into, and were completely unprepared when those risks became realities.

Similarly, public cloud computing services are enabling large numbers of new customers, service providers, and investors to make low-cost entries into complex transactions with many poorly understood or entirely unknown risks.

The often low upfront costs combined with rapid activation processes for public cloud services are enticing to many cost-conscious organizations. However, many of these services have complex pay-as-you-go usage rates which can result in surprisingly high fees as the services are rolled out to more users and become a key component of those users' regular workflows.

Many public cloud services start out with low introductory rates which go up over time.  The pricing plans rely on the same psychology as introductory cable subscriptions and adjustable-rate mortgages.

Additionally, there is often an inexpensive package rate which provides modest service usage allowances. Like many current cell phone data plans, once those usage limits are reached, additional fees automatically accumulate (see the cost sketch after this list) for:

  • CPU usage – sometimes measured in seconds or microseconds per processor core, but often priced as any portion of an hour.
  • Memory usage – measured by the amount of RAM allocated to a server or application instance.
  • Storage usage – usually measured by GB of disk space utilized, but sometimes still priced by the MB.  Capacity is sometimes charged even if it is only allocated and never utilized.
  • Data transfer – often measured in GB inbound and/or outbound from the service. Many providers may charge data transfer fees for moving data between server (or service) instances within the same account.
  • IO – this is often nebulous and difficult to estimate in advance.  In the simplest definition, IO stands for Input and Output.  Many technology professionals get mired in long debates about how to measure or forecast IOs and which computer activities should count.  The term is often applied to reading data from disk into memory, or writing data from memory to disk.  If a service plan includes charges for IOs, it’s important that the customer understand what they could be charged for.  A misbehaving application, operating system, or hardware component can cause significant amounts of IO activity very quickly.
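
To illustrate how these metered line items can compound, here is a minimal cost sketch in Python. Every rate and usage figure in it is hypothetical and purely for illustration; real providers publish their own (usually more complicated) rate tables, so treat this as a template for estimating, not a quote.

# Rough monthly cost estimate for a pay-as-you-go public cloud plan.
# All rates and usage figures below are hypothetical, for illustration only;
# real providers publish their own (often more complicated) rate tables.

HYPOTHETICAL_RATES = {
    "cpu_hour":    0.10,   # $ per core-hour (any partial hour billed as a full hour)
    "ram_gb_hour": 0.01,   # $ per GB of RAM allocated, per hour
    "storage_gb":  0.12,   # $ per GB allocated per month (even if never utilized)
    "transfer_gb": 0.09,   # $ per GB of outbound data transfer
    "io_million":  0.20,   # $ per million IO operations
}

def estimate_monthly_bill(cpu_core_hours, ram_gb_hours, storage_gb,
                          transfer_gb, io_ops, rates=HYPOTHETICAL_RATES):
    """Return (total, line_items) for one month of usage beyond the base plan."""
    line_items = {
        "CPU":      cpu_core_hours * rates["cpu_hour"],
        "Memory":   ram_gb_hours * rates["ram_gb_hour"],
        "Storage":  storage_gb * rates["storage_gb"],
        "Transfer": transfer_gb * rates["transfer_gb"],
        "IO":       (io_ops / 1_000_000) * rates["io_million"],
    }
    return sum(line_items.values()), line_items

# A small pilot: 2 cores and 4 GB RAM around the clock, modest storage and traffic.
pilot_total, _ = estimate_monthly_bill(2 * 730, 4 * 730, 100, 50, 5_000_000)

# The same workload rolled out to the whole organization: roughly 10x the usage.
rollout_total, rollout_items = estimate_monthly_bill(20 * 730, 40 * 730, 1_000, 500, 50_000_000)

print(f"Pilot month:   ${pilot_total:,.2f}")
print(f"Rollout month: ${rollout_total:,.2f}")
for name, cost in rollout_items.items():
    print(f"  {name:<8} ${cost:,.2f}")

The invented numbers are not the point. The point is that a bill with five or more metered dimensions is difficult to forecast, and that a modest pilot can look very different from an organization-wide rollout of the same workload.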

User accounts, concurrent user sessions, static IP addresses, data backups, geographic distribution or redundancy, encryption certificates and services, service monitoring, and even usage reporting are some examples of “add-ons” which providers will upsell for additional fees.

It is also common for public cloud service providers to tout a list of high-profile clients. It would be a mistake to believe the provider offers the same level of service, support, and security to all of their customers. Amazon, Google, and Microsoft offer their largest customers dedicated facilities with dedicated staff who follow the customer’s approved operational and security procedures. Most customers do not have that kind of purchasing power.  Although the service provider’s marketing may tout these sorts of high-profile clients, those customers may well be paying for a Private Cloud.

“Private Cloud” is typically the current marketing term for situations where a customer organization outsources hardware, software, and operations to a third party and contracts for the solution as an “Operational Expense” rather than making any upfront “Capital Expenditures” to procure assets.

* Op-Ex vs Cap-Ex is often utilized as an accounting gimmick to help a company present favorable financial statements to Wall Street.  There are many ways an organization can abuse this and I’ve seen some doozies. 

Two key attractions for service providers considering a public cloud offering are the Monthly Recurring Charge (MRC) and auto-renewing contracts.  The longer a subscriber stays with the service, the more profitable they become for the provider. Service providers can forecast lower future costs due to several factors:

  • Technology products (particularly hard drives, CPUs, memory, and networking gear) continue to get cheaper and faster.
  • The service provider may be able to use open source software for many of the infrastructure services which an Enterprise Organization might have purchased from IBM, Microsoft, or Oracle.  The customer organization could achieve these same savings internally, but is often uncomfortable and unfamiliar with the technologies and unwilling to invest in the workforce development needed to make the transition.
  • The service provider may also be able to utilize volume discounts to procure software licenses at a lower cost than their individual customers could.  For small customer organizations this often holds true.  For larger enterprise organizations this is usually a false selling point, as the enterprise should have an internal purchasing department to monitor vendor pricing and negotiate as needed.  Unfortunately many large organizations can be something of a dysfunctional family, and there may not be a good relationship between IT, customer business units, and corporate procurement.  Some executives will see outsourcing opportunities as the “easy way out” rather than solving internal organizational issues.
  • Off-shore labor pools are continuing to grow both in size and in capability.  Additionally, the current economic circumstances have been holding down first-world labor rates.
  • Service Providers can and do resell and outsource with other Service Providers.  In the mobile phone industry there are numerous Mobile Virtual Network Operators (MVNOs) who contract for bulk-rate services from traditional carriers and then market those services for resale under their own branding and pricing plans.  Many cloud service providers have adopted similar business models.

All of these cost factors contribute to the service provider’s ability to develop a compelling business case to its investors.

The subprime market imploded with disastrous consequences when several market conditions changed. New construction saturated many markets and slowed or reversed price trends. Many customers found they couldn’t afford the products and left the market (often through foreclosures, which furthered the oversupply). Many other customers recognized the price increases built into their contracts (variable-rate mortgages) and returned to more traditional products (by refinancing to conventional loans). And many sub-prime lenders were found to have engaged in questionable business practices (occasionally fraudulent, often just plain stupid) which eventually forced them out of the business while leaving their customers and investors to clean up the mess.

Like the housing market, public cloud computing is on course to create an oversupply. Many of these cloud providers are signing customers up for contracts and pricing models which will be invalidated in a short time (as processing, storage, and bandwidth continue to get faster and cheaper). And few, if any, of these providers understand the risk environment within which they operate.

Public cloud computing is sure to have a long future for “inherently public” services such as media distribution, entertainment, education, marketing, and social networking.

For personal and organizational computing of “inherently private” data the long-term value is questionable, and should be questioned.

Current public cloud services offer many customers a cost advantage for CPU processing. They also offer some customers a price advantage for data storage, but few organizations have needs for so-called “big data”.  The primary advantage of public cloud services for many organizations is distributed access to shared storage via cheap bandwidth.

Competing on price is always a race to the bottom.  And that is a race very few ever truly win.

Public cloud service providers face significant business risks from price competition and oversupply.  We saw what happened to the IT industry in the early 2000s, and these were two key factors.

Another factor is declining customer demand.  The capabilities of mobile computing and of low-cost on-site systems continue to grow rapidly.  At today’s pricing, it may be cheaper to host an application in the cloud than to provide enough bandwidth at the corporate office(s) for mobile workers.  That is changing rapidly.

A T1 connection at 1.5 Mbps used to cost a business several thousand dollars per month.  Now most can get 15 Mbps to 100 Mbps for $79 per month.  As last-mile fiber connectivity continues to be deployed, we’ll see many business locations have access to 1 Gbps connections for less than $100 per month.
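
As a back-of-the-envelope comparison (the dollar figures below are illustrative examples consistent with the ranges above, not actual carrier quotes), the drop in cost per megabit is striking:

# Illustrative cost-per-megabit comparison; prices are rough examples,
# not actual carrier quotes.
t1_cost, t1_mbps = 2000.0, 1.5          # a T1: on the order of $2,000/month for 1.5 Mbps
modern_cost, modern_mbps = 79.0, 50.0   # mid-range business broadband: $79/month for 50 Mbps

t1_per_mbps = t1_cost / t1_mbps              # ~$1,333 per Mbps per month
modern_per_mbps = modern_cost / modern_mbps  # ~$1.58 per Mbps per month

print(f"T1:     ${t1_per_mbps:,.0f} per Mbps per month")
print(f"Modern: ${modern_per_mbps:,.2f} per Mbps per month")
print(f"Roughly {t1_per_mbps / modern_per_mbps:,.0f}x cheaper per megabit")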

All of those factors are trumped by one monster of a business risk facing public cloud service providers and customers today: how should they manage the security of inherently private data?

Many organizations have little to no idea how to approach data classification, risk assessment, and risk mitigation.  Even the largest organizations in the critical infrastructure industries are struggling with the pace of change, so it’s no surprise that everyone else is behind on this topic.  Additionally, the legal and regulatory systems around the world are still learning how to respond to these topics.

Outsourcing the processing, storage, and/or protection of inherently private data does not relieve an organization of its responsibilities to customers, auditors, regulators, investors, or other parties who may have a valid interest.

Standards, regulations, and customer expectations are evolving.  What seems reasonable and prudent to an operations manager in a mid-sized organization might appear negligent to an auditor, regulator, or jury.  What seems ok and safe today could have disastrous consequences down the road.

Unless your organization is well versed in data classification and protection, and has the ability to verify a service provider’s operational practices, I strongly recommend approaching Public Cloud services with extreme caution.

If your organization is not inherently part of the public web services “eco-system”, it would be prudent to restrict your interactions with Public Cloud computing to “inherently public” services such as media distribution, entertainment, education, marketing, and social networking.  At least until the world understands it a bit better.

The costs of processing and storing private data will continue to get cheaper.  If you’re not able to handle your private data needs in house, there are still plenty of colocation and hosting services to consider.  But before you start outsourcing, do some thoughtful housekeeping.  Really, if your organization has private data which does not provide enough value to justify in-house processing, storage, and protection… please ask yourselves why you even have this data in the first place.

