
Value of Market Data in the Cloud--Comparing NASDAQ and NYSE Strategies

In April 2011, during my keynote presentation at the Financial Technology Forums' Wall Street Breakfast Briefing, I cited Xignite as a building block and participant in a Cloud web services ecosystem, to illustrate how the Network Effect of web services and data in the Cloud will become more important than any other single consideration or success factor in the Cloud.
During that presentation to Wall Street, one of the participants asked me if I had heard of the NYSE Community Cloud. I had not.
Beyond the "low latency / high frequency" use case, the real driver of where applications run and how they are designed is the data. Data in the Cloud exerts a force, like matter, on everything within its field. Big Data in the Cloud, such as the giant data sets maintained by Xignite and NASDAQ on Amazon AWS, draws still more web services and still more data to the AWS Cloud. The "closer" a service is to the data, the better the application performs, the better it utilizes the network, and the more effectively the application can scale.

Metcalfe's Law
According to Metcalfe's Law, the value of a network is proportional to the square of the number of connected users (roughly n²). In this sense, "closing" a network severely limits the number of connections. How does this impact the economics of the Cloud?
I concur with Reed that, while a compelling and ready starting point, Metcalfe's Law applied to the Internet does not begin to express the dramatic force of scale. For a lively discussion, please see:
Metcalfe's Law is Wrong - IEEE Spectrum
It's All In Your Head (by Robert M. Metcalfe)
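The contrast between the two models can be made concrete with a toy calculation. The participant counts below are hypothetical, chosen only to illustrate how sharply a "closed" community limits value under either curve:

```python
# Toy comparison of network-value models for an open vs. a "closed" cloud.
# Metcalfe's Law: value grows as n^2 (pairwise connections).
# Reed's Law (one common formulation): value grows as 2^n - n - 1,
# the number of non-trivial subgroups a group-forming network supports.

def metcalfe_value(n: int) -> int:
    """Value proportional to the number of possible pairwise connections, ~n^2."""
    return n * n

def reed_value(n: int) -> int:
    """Value proportional to the number of non-trivial subgroups: 2^n - n - 1."""
    return 2 ** n - n - 1

open_cloud = 50       # hypothetical participant count for an open Public Cloud
community_cloud = 10  # hypothetical participant count for a closed Community Cloud

for label, n in [("open", open_cloud), ("community", community_cloud)]:
    print(f"{label:9s} n={n:3d}  Metcalfe={metcalfe_value(n):6d}  Reed={reed_value(n)}")
```

Under Metcalfe the open cloud is 25x more valuable at 5x the participants; under Reed the gap is astronomically larger, which is the "dramatic force of scale" Reed argues Metcalfe understates.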

Network Effect
[Wikipedia Definition]
In economics and business, a network effect (also called a network externality or demand-side economies of scale) is the effect that one user of a good or service has on the value of that product to other people. When a network effect is present, the value of a product or service depends on the number of others using it.

From Robert M. Metcalfe's article "It's All In Your Head," Forbes, 05.05.07

Comparison of NASDAQ and NYSE Market Data Cloud Strategy

NASDAQ:
  • Leverage existing services available in the Public Cloud, accelerating time to market and minimizing costs while maximizing customer value by building on Xignite's proven, successful platform.
  • Build NASDAQ Data On Demand on the Xignite platform in the Amazon AWS Cloud.
  • NASDAQ data + Xignite platform + Xignite catalog of market data + an ecosystem of value-multiplying web services = much greater than the sum of its parts.

NYSE:
  • Build a Community Cloud on VMware / EMC technology, ensuring vendor accountability and presuming a high degree of control.
  • Build a data platform?
  • Benefit from other web services running in the Community Cloud?
  • Maintain tight control, focus on a particular industry, and provide very low latency for those firms that can leverage it or prefer it.
[While technically I think the NYSE Community Cloud approach will succeed, I think the real value lost is what is lost through exclusion. The Network Effect of the "closed" Community Cloud on the value of NYSE data and applications will be muted. The universe of web services and complementary data in a Community Cloud (as modeled by Metcalfe or Reed) will not thrive as it would in a Public Cloud.]

So in the case of this financial services "community cloud," the logic of colocating in this Cloud might be:
1) Low latency. If locating in this Cloud enables firms to provision low latency applications focused on the NYSE, then this might be a reason to locate there.
2) Big Data. Applications for "backtesting" algorithmic trading strategies use large amounts of historical data to "test" the algorithms. In contrast, NASDAQ chose to partner with Xignite and build its Data On Demand service in the Amazon AWS Cloud, on top of Xignite's well-engineered web service API and market data platform.
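The backtesting case is really an argument about data movement. A back-of-the-envelope calculation shows why compute colocates with a large historical data set rather than pulling it out of the Cloud (the sizes and bandwidths below are illustrative assumptions, not measurements):

```python
# Why backtesting apps colocate with the data: moving a big historical
# data set dominates the job unless compute sits next to it.
# All numbers below are hypothetical, for illustration only.

dataset_gb = 2_000   # assumed size of a historical tick-data set, in GB
wan_gbps = 1.0       # assumed bandwidth out of the cloud to an external site
lan_gbps = 10.0      # assumed bandwidth to compute inside the same datacenter

def transfer_hours(size_gb: float, link_gbps: float) -> float:
    """Hours to move size_gb over a link of link_gbps (8 bits per byte)."""
    return (size_gb * 8) / link_gbps / 3600

print(f"Pull the data out over the WAN: {transfer_hours(dataset_gb, wan_gbps):.1f} h")
print(f"Read it in-cloud over the LAN:  {transfer_hours(dataset_gb, lan_gbps):.1f} h")
```

Every backtest run pays this transfer cost again unless the data is cached locally, which is why the data's location effectively decides the application's location.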

What's the difference? One key difference is time to market. The Euronext solution is still in beta and has limited participation. By leveraging the Xignite platform, NASDAQ has built a profitable new line of business on top of its exchange. Moreover, by building on a successful platform, NASDAQ is able to maximize the value of its data, minimize the cost of providing the service, and provide the core data that will become the foundation for a rich ecosystem of web services, both those already available and those that will be written to leverage this data. By building on an open platform, NASDAQ will vastly multiply the value of its data, because the real value of data in a network is the degree to which it can be shared.

In this case we see that the real value of the Cloud isn't just server TCO, but a much more powerful Network Effect on the information and services in the Public Cloud. For this reason, I don't discourage firms from building Private or Community Clouds, but I can't help feeling it's important to underscore the Network Effect of Public Cloud data and web services, and how it could dramatically alter the economics and outcomes for certain firms.

I think a lot of firms recognize the value of these large data sets for use cases such as backtesting algorithmic trading systems. And this names a key force in Cloud economics: powerful applications can use large data sets only if the data and the applications are "close" enough that the latency of moving the data doesn't hobble the application.
The question I have is regarding the value of the data in a specific environment.
1) If the size of a data set is like "mass" in physics and exerts a certain "force," then data will tend to "attract" algorithms and web services in the way that mass exerts a gravitational force on matter.
2) If a big data set is located in a particular data center, then I think it's likely web services that consume and possibly add value to that data will also be located in that data center.
3) The decision as to where to locate data should be a function of maximizing the value of the data.
4) The greater the availability of a dataset to those who could potentially add value to the data, the more likely participants in the network will identify and build value added services which consume the data.
5) As a result, I think that the more a Cloud encourages open, low-cost, vibrant participation, the more compelling the business case for content providers to locate data in that ecosystem. The Network Effect therefore quickly creates a highly skewed reward system favoring the Cloud with the most data and the most participants.
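Point 5 is worth making numeric. If value grows superlinearly with participation (Metcalfe-style), the largest ecosystem captures a far larger share of total value than its share of participants; the percentages below are hypothetical, chosen only to show the skew:

```python
# Sketch of the "highly skewed reward system": with superlinear (n^2) value,
# the ecosystem with the most participants captures a disproportionate share.
# Participant shares are hypothetical, for illustration only.

clouds = {"public": 80, "community": 15, "private": 5}  # % of participants

# Metcalfe-style value: proportional to the square of participation.
values = {name: n * n for name, n in clouds.items()}
total = sum(values.values())

for name, v in values.items():
    share = 100 * v / total
    print(f"{name:9s} {clouds[name]:3d}% of participants -> {share:5.1f}% of value")
```

Even a modest participation lead compounds into near winner-take-most economics, which is exactly the pull that draws content providers toward the largest, most open Cloud.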

NYSE Builds a Specialty Cloud for Financial Markets 
Fountainhead: Real-World Financial Services Cloud



More Stories By Brian McCallion

Brian McCallion of Bronze Drum works with executives to develop Cloud strategy and Big Data proofs of concept, and trains enterprise teams to rethink process and operations. Focus areas include: Enterprise Cloud Strategy and Project Management; Cloud Data Governance and Compliance; Infrastructure Automation.