Blog Archives

Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway

Every now and then, a quote comes along that has an unforgettable effect on you. The title of this note is one such quote.

I remember sitting in the lecture room at NCST in 2005, listening to Prof. Satam talk about computer networks. Like many other students, we just gaped at the slides on display one after the other, and then, suddenly, in the middle of a Saturday afternoon, this hit me. During the slides and the ensuing discussion I was kinda PPMA (Physically Present but Mentally Absent), with all the numbers and bytes thrown around at random sailing higher over my head than Malcolm Marshall’s bouncers. It was only later, when I read the textbook “Computer Networks” by Andrew S. Tanenbaum, that the profoundness of the quote dawned on me. The wisdom propagated by the author has had a deep impact on my psyche.

It spelt out a truth that we must always carry when we devise solutions for all kinds of problems: common sense is the vital ingredient that every solution must have. As technology professionals, we have a perceptible enthusiasm for incorporating newer technology, along with that sizzle which makes a solution sound state-of-the-art and cutting edge. Often we get carried away chasing that technological delivery to add to our kitty of achievements, when simpler solutions can be found with some plain, common-sense thinking.

That truth has stuck with me to this day, and every time I am required to devise a solution, the first thing that hits me is this quote. It forces me to adopt common sense as a platform and not get swept away by technological exuberance.

A brief explanation for folks from non-techie domains:

Suppose you have a large data set that needs to be transferred from one location to another. The technology guy will rattle off statistics: you will need an 8 Mbps leased line, and the transfer will take some days to finish, ASSUMING the line does not drop, and so on. But with a little common sense, a lot more can be achieved with far less.
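To put “some days” in perspective, here is a minimal back-of-the-envelope sketch in Python. The 1 TB data set size is a purely illustrative assumption, and the line is assumed to run flat out with no drops or protocol overhead:

# Back-of-the-envelope: bulk transfer time over a leased line.
# Assumed figures: a hypothetical 1 TB data set over a dedicated 8 Mbps link.
data_bytes = 1 * 10**12            # 1 TB (decimal terabyte)
line_bps = 8 * 10**6               # 8 Mbps, assumed fully utilised

transfer_seconds = data_bytes * 8 / line_bps
print(f"Transfer time: {transfer_seconds / 86_400:.1f} days")   # about 11.6 days

Shipping the same data physically, as the excerpt below shows, changes the picture entirely.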

This is an excerpt from Tanenbaum’s book; picture yourself there as you read it.

…. A simple calculation will make this point clear. An industry standard Ultrium tape can hold 200 gigabytes. A box 60 x 60 x 60 cm can hold about 1000 of these tapes, for a total capacity of 200 terabytes, or 1600 terabits (1.6 petabits). A box of tapes can be delivered anywhere in the United States in 24 hours by Federal Express and other companies. The effective bandwidth of this transmission is 1600 terabits/86,400 sec, or 19 Gbps. If the destination is only an hour away by road, the bandwidth is increased to over 400 Gbps. No computer network can even approach this.

For a bank with many gigabytes of data to be backed up daily on a second machine (so the bank can continue to function even in the face of a major flood or earthquake), it is likely that no other transmission technology can even begin to approach magnetic tape for performance. Of course, networks are getting faster, but tape densities are increasing, too.

If we now look at cost, we get a similar picture. The cost of an Ultrium tape is around $40 when bought in bulk. A tape can be reused at least ten times, so the tape cost is maybe $4000 per box per usage. Add to this another $1000 for shipping (probably much less), and we have a cost of roughly $5000 to ship 200TB. This amounts to shipping a gigabyte for under 3 cents. No network can beat that.
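As a quick sanity check, here is a minimal Python sketch that reproduces both the bandwidth and the cost figures from the excerpt; all numbers are taken straight from the text above.

# Reproducing the excerpt's arithmetic.
tapes_per_box = 1000
tape_capacity_bytes = 200 * 10**9                     # 200 GB per Ultrium tape
box_bits = tapes_per_box * tape_capacity_bytes * 8    # 1.6 petabits per box

print(f"Delivered in 24 h: {box_bits / 86_400 / 10**9:.0f} Gbps")    # ~19 Gbps
print(f"One hour by road:  {box_bits / 3_600 / 10**9:.0f} Gbps")     # ~444 Gbps

tape_cost_per_use = 40 / 10                           # $40 tape, reused ten times
total_cost = tapes_per_box * tape_cost_per_use + 1000 # plus ~$1000 shipping
cents_per_gb = total_cost / (tapes_per_box * 200) * 100
print(f"Cost per gigabyte: {cents_per_gb:.1f} cents")                # ~2.5 cents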

The moral of the story is:

Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.

Cloud Computing: The Next Classic Disruptive Technology

An interesting article on Cloud Computing and its current prospects.

Cloud Computing: The Next Classic Disruptive Technology

Global technology companies like Microsoft and SAP are investing heavily in developing subscription-based models for their products that can be provisioned from a cloud. Companies that are in both the hardware and software business, like HP and IBM, are expected to offer Cloud Computing-based services, as per the expert. He also recommends that services firms like Accenture, Capgemini, TCS and Infosys provide services using this new model.
The typical way applications are designed and implemented is to purchase computer hardware and install the applications on servers in a data center. Critical applications are hosted on dedicated server farms, and the in-house teams who built the applications have full ownership of their systems and are responsible for uptime, upgrades, load balancing, BCP, etc. The capacity is designed for peak load (which occurs a few times a day at best), so the average utilization of the servers is extremely low (on average 8 percent to 10 percent of server capacity is used). All this adds up to a high upfront investment and ongoing operational expenses. Security and DR/Business Continuity plans have to be in place and maintained at all times. These costs are very high for mission-critical applications that require a dedicated backup site, such as those found in the DR plans of many large financial institutions.
Technology Evolution: The last decade has seen a tremendous evolution on the technology front with hardware commoditization, the maturity of the Internet as a platform, virtualization, open source, Rich Internet Applications (RIA), Software as a Service (SaaS), and improved security products and processes. These technology trends are revolutionizing IT departments by separating computing power from hardware. Virtualization allows servers to be split into multiple “virtual machines” where each virtual instance can run its own applications, so IT departments need not add new machines each time a new application is needed or to cater to peak loads. RIA supports desktop-like functionality within a browser, and SaaS allows access to applications on demand on a subscription basis. These trends, along with the increased focus on Green IT and overall cost-reduction policies, have led IT departments to look at a utility-based model with the ability to tap into computing power on the Web and pay for the computing power consumed.
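As a small illustration of that consolidation argument, here is a rough sketch. The 10 percent average utilization follows the figure quoted earlier; the 60 percent target and the 100-server estate are illustrative assumptions:

# Rough consolidation estimate: how many lightly loaded workloads can
# virtualization pack onto one physical host? Figures are illustrative.
import math

avg_utilization = 0.10      # ~10% average utilization per dedicated server
target_utilization = 0.60   # leave headroom for peak load on the shared host
workloads = 100             # hypothetical server estate

per_host = int(target_utilization / avg_utilization)    # 6 workloads per host
hosts_needed = math.ceil(workloads / per_host)           # 17 hosts instead of 100
print(f"{per_host} workloads per host -> {hosts_needed} hosts instead of {workloads}")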
What is Cloud Computing? The term Cloud Computing probably comes (at least partly) from the use of a cloud image to represent the Internet or some large networked environment. Simply put, computing is done in a remote location and applications tap into that computing power over an Internet connection from any type of connected device. Cloud Computing is a potentially cost-efficient model for provisioning processes, applications and services (computation services, storage services, networking services and such), delivered on demand regardless of where the user is or the type of device they’re using. In this emerging computing model, users access their applications from anywhere through any connected device and pay by usage. The applications reside in scalable remote data centers, and computational resources can be dynamically provisioned and shared to achieve economies of scale. Users are abstracted from the complexity of the underlying infrastructure.
Is Cloud Computing the ultimate form of globalization, presenting new opportunities for services firms? Cloud Computing represents virtualized computing power: provisioning processes, applications and services regardless of where the user is and where the computing power is delivered from, with the Internet as the delivery platform. Cloud Computing will help Small to Medium-sized Enterprises (SMEs) the world over, particularly those in developing countries, access world-class applications and services without large investments in IT infrastructure. It also enables application developers in developing countries to build applications, distribute them on the Cloud Computing platform at minimal cost and compete with the best, an option not available until this point. Cloud Computing will clearly change the economics of the business, and services firms will need to adapt to the platform shift.

Firms specializing in developing traditional software or designing business applications will come under increasing pressure. These firms sell their applications charging an upfront license fee and an annual maintenance fee for upgrades and support. The biggest challenge for such firms will be to become cloud suppliers, and firms like Microsoft and SAP are investing heavily in developing subscription-based models for their products that can be provisioned from a cloud.
Service providers that rely on large implementations of ERP and other enterprise systems will need to adapt to provide solutions in a “pay as you go” model. Firms need to choose whether to be a cloud service provider. Firms such as HP and IBM, which already sell both hardware and IT services, will try to do both. Services firms like Accenture, Capgemini, TCS and Infosys should adapt to provision services using this new model.
The main clientele for the large services firms are Fortune 500 businesses such as Wall Street banks, credit card companies, insurers and such. The shift Cloud Computing creates is the retargeting of IT-supported business-process services toward the mid-market. Mid-market firms typically find it very expensive to invest in high-availability, high-security enterprise systems. This is an opportunity for Cloud Computing service providers to deliver a truly disruptive technology, since it brings a whole new group of customers into the market.
Firms in developing countries are an untapped customer base for service providers using the Cloud Computing “pay as you go” model. This brings new clients in new geographies that typically could not afford the upfront investment to build a full-scale IT infrastructure, as the pay-as-you-go model reduces capex.
Who are the key vendors in this space?

The list below is a sample of vendors in the space and their core offerings. Backed by venture capital funding, the cloud computing space is expected to see innovative products that make it secure and enterprise-ready.

Cloud Providers / Platforms
Amazon Infrastructure Cloud: Provides resizable compute capacity (EC2), storage (S3), core database services, a queue service, etc., delivered as web services.
Google App Engine: Allows running of Web applications on Google’s infrastructure.
IBM Blue Cloud: Blue Cloud is meant to run large-scale applications with massive databases over the Internet.
Microsoft Azure: A cloud services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services.
Rackspace: Offers computing service through its subsidiary Mosso. Priced higher than Amazon but much cheaper than a dedicated server.
VMware vCloud: VMware has partnered with a number of hosting and cloud computing vendors to enable delivery on a common VMware platform. This gives users a choice of where they deploy applications and allows easy transitions between providers, as well as on- and off-premise use.
Force.com: Cloud Computing for the Enterprise, with 13 software applications built on the Force.com platform.
Sun Microsystems: Sun is planning a comprehensive cloud platform including the MySQL database and Q-layer (technology that simplifies cloud computing development and deployment).
EMC: EMC will offer storage and sharing services and has made 2 acquisitions in this space (Mozy and Pi).
Cloud Applications / Products / Solutions
Google Docs: A web-based word processor, spreadsheet, etc., where documents can be created, edited, shared and opened, and also edited by multiple users at the same time
Morph: Provider of on-demand application delivery platforms, managed services and end-user applications using the Amazon Web Services platform
Zoho: A comprehensive suite of 18 web-based programs for small businesses
Zuora/PayPal: Zuora’s hosted billing platform uses PayPal’s SDK and APIs to tie into the PayPal billing engine.
ADP: Hosted HR/Payroll solution
Amex/Concur: Amex owns an equity stake in Concur, which offers expense management as a hosted model
Salesforce.com: Built on the Force.com platform, Salesforce CRM claims to be the fastest, most flexible CRM service on the market
NetSuite: NetSuite OneWorld is a cloud computing solution which enables multi-national and multi-subsidiary companies to manage their global business operations in real-time.

“Private Cloud” vs. “Public Cloud”: It is important to note that it is unlikely that large enterprises will provision all their applications from “public” clouds. Whilst there will be a large number of public clouds that enterprises can leverage, most likely a number of “private or specialty” clouds will be created to cater to the needs of a certain sector or industry or by a cloud supplier for their customers or for an organization. Private clouds offer greater control and security over applications and data.
For example, a services firm, in collaboration with a cloud infrastructure provider, can create a “private” cloud with an HR outsourcing platform. This platform should be made compliant with all the relevant regulations and incorporate the requisite security and data privacy features. The platform should support “multitenancy”, the ability to support multiple end clients without compromising on security, privacy, quality of service and such. The services provider is responsible for the implementation, security and ongoing support and maintenance. The infrastructure provider ensures the availability of the computer systems, including load balancing, backups and archiving, BCP and disaster recovery plans, and invoices the service vendor by usage (storage, number of instances of the application, and such). The customers are invoiced on a pay-as-you-use model (for example, by the number of employees in the HR system). Customers can now forecast and budget their HR expenses based on their growth plans without the variability of IT infrastructure and overhead costs (in essence, a capex has turned into an opex). The service providers can leverage this platform to market their HR solution to other clients, including the SME sector.
Service providers should look at a key aspect of cloud data centers called “multi-tenancy”: computing tasks for different individuals or companies are all handled on the same set of computers, so more of the available computing power is in use at any given time. Rather than having a dedicated set of computer hardware and software per client per application, they should look at hosting solutions on the same set of systems while guaranteeing the same level of security, capacity, load balancing, BCP, etc. Salesforce.com and NetSuite are clear leaders in this space, whilst SAP, Oracle and Microsoft are positioning their products to be cloud-ready. Firms specializing in BPO should look at platforms that are multi-tenant ready, as clients are going to demand cost benefits in future contracts. As an example, Wall Street Systems offers on-demand processing capacity and pay-as-you-go pricing for financial services firms and is a post-trade processing utility for the capital markets.
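As a minimal sketch of what pay-as-you-use invoicing on such a shared, multi-tenant platform could look like, consider the following; the tenant names, per-employee rate and headcounts are purely hypothetical, not taken from the article:

# Hypothetical usage-based invoicing for a multi-tenant HR platform.
# All tenants share the same infrastructure; each bill scales with usage
# (here, the number of employees in the HR system), turning capex into opex.
RATE_PER_EMPLOYEE = 3.50     # assumed monthly rate in dollars

tenants = {                  # illustrative tenants and employee counts
    "acme_corp": 1200,
    "globex_sme": 85,
    "initech": 640,
}

for tenant, employees in tenants.items():
    invoice = employees * RATE_PER_EMPLOYEE
    print(f"{tenant}: {employees} employees -> ${invoice:,.2f} this month")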
Early Successes and Challenges: Most cloud providers like Amazon, at this time, cater to startups and non-critical applications. However, a number of vendors are positioning their products to be cloud-ready, including IBM’s DB2 and such. As listed previously, a large number of niche firms, backed by venture capital funding, are developing innovative solutions to make cloud computing enterprise-ready.

Some examples of early movers in the space:
Nasdaq has used a cloud computing provider to offer a historical stock market information service called Market Replay.
A senior engineer at The Washington Post used Amazon Web Services to turn 17,000 pages of information from a non-searchable PDF into a searchable database in about 26 hours, using 1,407 hours of virtual machine time at a final expense of $144.62.
Engineers at Kenworth, a midsized truck maker, rented time on IBM computers to run simulated tests and remove design flaws to improve the gas efficiency of their trucks (Exa Corporation sold metered access to a cluster of IBM computers with enough speed for their design simulation). The internal computers at Kenworth weren’t powerful enough to closely estimate the air flow conditions around a truck travelling at a certain speed.
Some enterprise customers are planning to evolve their datacenters into “private” clouds. BMI, a UK-based airline, has plans to move significant parts of its infrastructure onto a VMware-based private cloud delivered by a third-party, U.K.-based service provider.
An enterprise planning to foray into this space needs to be cognizant of the key challenges:

Migrating existing enterprise applications and integration across multiple applications

Migrating from an enterprise in-house application to a cloud requires a lot of effort.
Currently there is integration across applications within an enterprise: HR application to payroll, trading systems and compliance applications, ERP and accounting systems, and such. Interoperability across clouds will be a challenge.
Risk: Legal, regulatory, and business

U.S. publicly traded companies have to be SOX compliant, and depending upon the industry a company is in, there may be industry-specific regulations like HIPAA in health care.
Amazon and Google services have had outages recently, increasing business risk.
It remains to be seen how governments will regulate the platform for security and other data-related issues.
Customers should audit cloud providers and suppliers as they would any other vendor for all the legal and regulatory requirements. Certain European nations mandate that information must be kept within the borders of the nation. All the cloud providers and suppliers understand the regulations but will need to be audited for compliance.
Difficulty of managing cloud applications

All of the cloud providers offer tools to manage systems running in their environments and there are startups that provide even more sophisticated tools to manage some cloud environments. Currently there are no system management tools to manage a mixed environment that incorporates existing data centers as well as a cloud environment.
Too early to establish the cost advantage for cloud computing

Since the industry is still in its infancy, it is difficult to establish clearly the cost advantage of using cloud computing platforms. However, it is widely believed that over time this platform will lead to substantial savings.
For the service providers, pricing using the new model will be challenging initially.
Security of confidential data stored in a cloud
Security models and standards are yet to emerge.
Recommendation: Most businesses should not relinquish control of their critical data and all of their computing resources to a cloud provider, but it is important that they start planning, beginning with the non-critical applications and processes that are mature from an outsourcing perspective (e.g., HR, F&A, etc.). The IT and BPO service providers should design solutions based on a “private” cloud computing model that incorporates subscription- or utility-based pricing. We certainly do not expect in-house enterprise implementations and IT departments to disappear in the near future; service providers will need to cater to both a pay-as-you-use model and the traditional model that we are currently most familiar with. Finally, for the service providers, a new customer base from developing countries and the SME segment is a great incentive to invest in the Cloud Computing platform.
URL for this article:
http://www.globalservicesmedia.com/IT-Outsourcing/Infrastructure-Management/Cloud-Computing:-The-Next-Classic-Disruptive-Technology/22/6/0/general200903166148
