The Enterprise Public Cloud Wars Are Over. Let’s Talk About What’s Next


Last week there were some very interesting developments in the public cloud space. First, Microsoft and Amazon beat Wall Street expectations, reporting monster numbers for their public cloud offerings.

Also, HP announced that it will be winding down its Helion public cloud efforts to focus on private and hybrid cloud offerings. This news comes after two years of sustained effort and large investments in the platform.

These new developments have made many question whether the enterprise public cloud race is essentially over. While the space remains highly competitive among a handful of platforms, it is already possible to identify the winners in the enterprise public cloud space.

And the Winners Are….

Amazon and Microsoft by a mile, with Google, Salesforce and IBM as relevant runners-up. Amazon’s AWS and Microsoft’s Azure have managed to develop the most complete offerings in the enterprise public cloud space. Today, AWS and Azure not only provide a larger number of capabilities than their competitors but also more complete offerings in enterprise-ready areas such as security and compliance.

Playing the Innovator’s Dilemma

If we apply Clayton M. Christensen’s innovator’s dilemma thesis to the enterprise public cloud space, we need to assume that, at some point, a new generation of startups will challenge the well-established enterprise public cloud platforms. However, looking at the near future, it’s hard to imagine what factors could disrupt the lead established by the incumbents. In principle, a new entrant in the enterprise public cloud space could try to establish traction by disrupting some of the following areas:

  • Technology Capabilities: The simplest way to disrupt the enterprise public cloud space would be with a new offering that provides a richer feature set than the existing incumbents.
  • Pricing: Price is always an interesting factor to facilitate adoption for a new entrant in the enterprise public cloud space.
  • Developer Community: Developers have proven that they can drive the adoption of cloud technologies in the enterprise.
  • Adoption within Enterprise Software Startups: A cloud platform that gets adopted by new enterprise software startups can become increasingly relevant to the customers of those startups.
  • Traction in Specific Geographies: Conceivably, a new enterprise cloud platform can gain traction within a specific geographic location.
  • New Distribution Channels: If a new public cloud platform can tap into distribution channels untapped by the incumbents in the space, it can become relevant in the enterprise.

However, after analyzing the current state of the enterprise public cloud platform market, it’s hard to imagine how any of those factors could be relevant enough for a new entrant to cause disruption in the space. Let’s explore that analysis.

Technology Innovation

From the enterprise capability standpoint, Azure and AWS are far ahead of every other platform in the space. Ranging from basic infrastructure to sophisticated capabilities such as machine learning or integration, both Azure and AWS provide native services for every imaginable capability that can be relevant in the enterprise. Salesforce, IBM and Google are also rapidly growing the feature sets of their public cloud offerings. In that sense, it’s hard to conceive of a new enterprise public cloud platform competing with the incumbents in the short term.


Pricing

The enterprise public cloud offering is a race to zero. Amazon, Microsoft and Google keep lowering the prices of their offerings in order to deny any other vendor a competitive advantage. In that sense, it will be incredibly difficult for a new enterprise public cloud offering to disrupt the space with a more attractive pricing model.

Developer Community

AWS, Azure, Heroku and Google Cloud are some examples of enterprise public cloud platforms that enjoy growing developer communities. By leveraging and embracing open source technologies, incumbents have been able to attract millions of developers actively building applications on their platforms. As a result, new platforms in the space will have to build similar developer communities to even be competitive with the incumbents.

Enterprise Software Startups

Another element that seems to be a non-factor. Platforms like AWS, Azure and Google Cloud have not only become dominant offerings for large enterprises but have also captured the hearts of new enterprise software startups, which keep relying on those platforms to power their offerings. Those startups represent a new and vibrant distribution channel for the incumbent platforms, which will make it extremely hard for a new offering in the space to leverage that channel.

Geographical Presence

Platforms like AWS and Azure have developed an incredibly impressive global footprint with locations that offer viable options in almost every country in the world. Additionally, the services provided by those platforms are available in a large number of languages. This factor makes it almost impossible for new entrants to develop a relevant presence in specific geographies.

New Distribution Channels

The dominant enterprise public cloud platforms have played a masterful game developing a large number of distribution channels, from partner networks to white-label offerings. The network effects of these channels will make it extremely hard for new offerings to disrupt the enterprise public cloud space by developing new distribution channels.

Let’s Focus on Private and Hybrid Clouds

With the competition in the enterprise public cloud space almost over, attention has shifted to private and hybrid cloud platforms. Currently, that space remains highly competitive, with platforms like Pivotal’s Cloud Foundry, HP Helion, Apprenda and even public cloud incumbents like Azure and AWS trying to create comprehensive offerings for enterprises. That will be the subject of a future post.


Posted on November 6, 2015



IBM Watson and Cognitive Data as the Future of Information Data Systems


IBM Watson is slowly becoming an important piece of IBM’s vision for the future of computing. Yesterday, Big Blue announced that it is launching another business unit centered on Watson solutions. The investment in this new unit is estimated to be around $1B but, more importantly, it reinforces IBM’s commitment to Watson and cognitive data as the future of enterprise data solutions.

A lot has happened since Watson made news winning the television quiz show Jeopardy! by beating legends Ken Jennings and Brad Rutter. At the time, Watson was a sophisticated natural language processing machine but didn’t have a lot to offer in other areas of cognitive computing.

Since the Jeopardy! days, Watson has added a significant number of services in areas such as data insights, vision processing, image recognition, natural language processing, text analytics and other important areas of cognitive science. More importantly, IBM is making Watson available through a series of APIs via the Watson Developer Cloud, which allow developers to leverage Watson in third-party applications.

IBM’s efforts around Watson are, undoubtedly, the most important steps toward establishing cognitive data as a mainstream trend in the technology arena. While big data technologies have certainly disrupted the information management space, data processing applications remain mostly ignorant when it comes to understanding and reasoning through the data they store. This is where cognitive data becomes important, by helping expert systems enhance, understand and reason through structured and unstructured data sets in order to make intelligent decisions.

5 Reasons Why Cognitive Data is the Future of Enterprise Data

Data is Becoming Contextual in Nature

Modern data is becoming more contextual every day. While data sets can be considered static in nature, they have different interpretations depending on contextual aspects such as time, location, environmental conditions, etc. Cognitive computing is a necessary step to make information systems more context-aware by augmenting static data sources with dynamic contextual data and reasoning and learning from it.

Big Data is Just a Lot of Dumb Data

Today, big data systems are becoming an important element of software systems by storing large amounts of static data. Despite the advances in data storage and processing, data systems remain essentially unintelligent when it comes to understanding, optimizing, augmenting and reasoning through the data they store. In that sense, organizations are constantly building new systems to make data “more intelligent”. Cognitive data presents a powerful alternative to traditional data systems by providing a layer of intelligence to modern information systems.

Data Scientists are not for all Scenarios

Data scientists are the most common answer when it comes to gathering insights about specific data sets. However, data scientists are fundamentally inefficient in areas such as real-time vision analysis, image recognition, speech analysis and other fundamental aspects of cognitive systems. Cognitive data and platforms like IBM Watson will help expand the capabilities of traditional data science to provide more sophisticated intelligence over traditional data sources.

Video, Images, Text and Speech are Becoming Increasingly Important

Complementing the previous point, data signals such as video, text, images and speech are fundamentally difficult for traditional data systems to process. Platforms like IBM Watson and other cognitive data solutions excel at understanding and processing these types of data points, making them an ideal extension of traditional data systems.

Actions are as important as Data Insights

In modern data systems, actions related to the data are typically hard-coded as a set of rules within an application. However, automatically taking actions based on data insights is becoming an increasingly important aspect of modern applications. Cognitive data is a fundamental step toward enabling intelligent, data-driven decision making in software applications.

5 Cognitive Data Scenarios Relevant in Today’s Enterprise


Healthcare

Cognitive science is starting to revolutionize healthcare. The intelligent processing of unstructured healthcare data such as images, videos and speech is leading the charge in modern healthcare applications, ranging from treatment recommendations to disease pattern analysis. Not surprisingly, healthcare remains the number one vertical for IBM Watson applications.

Public Safety

Cognitive data can help systems better reason through real-world data points in the form of video, images, sound and text commonly encountered in public safety scenarios. Using cognitive data systems, public safety operators can improve their decision-making process by interacting with systems that help them reason through the contextual data in their environments.


Financial Services

From fraud detection to financial product recommendations, cognitive data is becoming increasingly relevant in financial systems. By reasoning through large amounts of semi-structured and unstructured data, cognitive data systems can help improve financial decisions in areas such as trading and fraud analysis.


Marketing

Cognitive data is a key element of the future of recommendation systems and other customer engagement processes. Rapidly reasoning through the text of an email or the tone of a phone call will help organizations recommend better products to their customers while also enhancing their understanding of their marketing data.


Defense

This is an obvious one. Cognitive data will be essential to improving defense operations by better reasoning through the millions of data signals collected by soldiers and equipment in the field. Additionally, cognitive science will help build more intelligent defense equipment, such as the drones and robots that are becoming an integral part of modern warfare.


Posted on October 14, 2015



Technologies That Will Redefine the Enterprise: Virtual and Augmented Reality


This is the second article in a series evaluating the technologies that will lead the next wave of innovation in enterprise software. The list is based on a recent article I published on ReadWrite about trends that are destined to change the future of enterprise software. The article includes 14 trends that we consider relevant given their maturity in the industry:

  • enterprise hardware
  • the industrial Internet of Things
  • applications powered by the blockchain
  • proactive analytics
  • 3D printing
  • enterprise marketplaces
  • domain-specific data science
  • augmented reality in industrial settings
  • mainstream machine learning
  • drone platforms
  • next-generation cybersecurity
  • platforms for microservices
  • the Docker ecosystem
  • new application-development platforms for the enterprise

Today we would like to cover two revolutionary movements that are typically not associated with enterprise software: virtual and augmented reality.

Virtual and Augmented Reality in Enterprise Settings

Virtual and augmented reality are poised to dominate the next wave of innovation in user experience technologies. While virtual and augmented reality technologies are typically associated with consumer-centric solutions such as gaming, they can have a profound impact on the next generation of enterprise software solutions. In that sense, it is not surprising that platforms like Windows Holographic or Facebook’s Oculus are already being piloted in several vertical solutions.

From the capability standpoint, modern virtual and augmented reality solutions offer new possibilities to create, interact with and visualize objects and data in brand new user experiences.

Virtual vs. Augmented Reality

Virtual reality puts you entirely in a computer-generated world; you’re cut off from reality inside a virtual reality device. Facebook’s Oculus and the HTC Vive, which uses technology from game company Valve, are some of the main players in the space.

Augmented reality superimposes digital information over the real world. You usually experience it in the form of glasses that let most of the real world in, but then beam some images into your eyeballs so they appear to be floating in space. Leaders here include Microsoft’s HoloLens, and the mysterious startup Magic Leap, which received a $542 million investment led by Google last year.

5 Industries Likely to Rapidly Embrace Virtual and Augmented Reality Technologies


Education

Augmented and virtual reality technologies can completely disrupt the classroom experience in educational institutions. We can envision students experiencing a specific lesson in a virtual world in which they can conduct exercises and see and experiment with the results. Additionally, these types of technologies open new possibilities to interact with data and collaborate with professors and other students.


Construction

Construction is one of those industries in which there is a large gap between the design and the final result. While architects and engineers use computer systems such as AutoCAD to create construction plans, the materialization of those designs takes place at the construction site and involves dozens of companies that communicate by exchanging paper. Virtual and augmented reality open new horizons for bringing the creations of architects and engineers closer to the physical world. Additionally, they offer new ways for companies to collaborate at a job site while seeing the direct impact of their work.

Public Safety

Police, firefighters and other public safety institutions operate in high-risk environments and depend on real-time access to critical information. Augmented reality technologies offer the foundation for public safety operatives to interact with information in the context of a real-world environment while providing new ways to collaborate with other individuals involved in a public safety operation.


Training and Simulation

Sports, medicine, the military and other industries rely on continuous training that simulates real-world conditions. In those environments, the creation of training models is a long and expensive exercise, and one in which mistakes can have critical consequences. Virtual and augmented reality technologies will drastically simplify the authoring of training exercises that simulate real-world conditions and allow students to practice and evaluate different situations without major risk. Additionally, these types of platforms provide new ways to evaluate the performance of industry professionals and identify mechanisms to improve their results.


Retail

Virtual and augmented reality platforms have the opportunity to change the in-store experience and provide new digital models for consumers to interact with retailers. Today, there is a marked difference between the digital e-commerce and in-store shopping experiences. Virtual and augmented reality technologies can help bridge that gap by creating digital environments that resemble in-store experiences, in which users can try or interact with different items while also having access to complementary information that can improve their shopping experience.


Posted on September 3, 2015



Beyond .NET and J2EE: The Emergence of a Third Application Development Platform in the Enterprise


A few days ago, I published an article on ReadWrite about trends that are destined to change the future of enterprise software. The article includes 14 trends that we consider relevant given their maturity in the industry:

  • enterprise hardware
  • the industrial Internet of Things
  • applications powered by the blockchain
  • proactive analytics
  • 3D printing
  • enterprise marketplaces
  • domain-specific data science
  • augmented reality in industrial settings
  • mainstream machine learning
  • drone platforms
  • next-generation cybersecurity
  • platforms for microservices
  • the Docker ecosystem
  • new application-development platforms for the enterprise

The feedback from the article has been awesome, including from a few people who didn’t miss the opportunity to remind me of other important trends I missed, or to question some of the ones included ;) In that sense, I thought it would be a good idea to expand on some of the trends highlighted in the article and provide more context about why I think they will be foundational to the next generation of enterprise software solutions.

Let’s start with a trend that I think is long overdue in enterprise software:

The Need for a Third Application Development Platform for the Enterprise

For the last fifteen years, the enterprise IT space has relied on two main application development platforms: .NET and J2EE. While other platforms like Ruby on Rails or Python have certainly gained some adoption in the enterprise, their market share remains relatively small compared to that of .NET and J2EE. After almost two decades of developing solutions almost exclusively on two platforms, there are a number of factors conspiring to facilitate the emergence of a third application development platform in the enterprise.

Despite the numerous innovations in the .NET and J2EE platforms, their dominance in the enterprise IT space can be partly attributed to their commercial channels. For the last two decades, the combination of Microsoft and J2EE vendors like IBM, Oracle and Tibco accounted for a large percentage of enterprise IT deals. However, many of the factors that established the dominance of .NET and J2EE have either disappeared or changed and, at this point, I believe enterprise IT can benefit from the emergence of a third enterprise-ready application development platform.

Better Application Models for the Cloud

Today, many enterprise IT applications are being developed on cloud platforms such as Azure, Bluemix or AWS. On those infrastructures, the level of support for new application development platforms like NodeJS, Python or Ruby is as good as, if not sometimes better than, that for J2EE and .NET. This level of support removes some of the concerns about enterprise-ready tooling that have traditionally blocked open source application development platforms from entering the enterprise.

Optimized for Mobile Applications

Mobile application development is becoming a relevant item on any CIO’s agenda. In the mobile space, platforms like NodeJS have become the choice for enabling the backend APIs used by mobile applications. In that sense, many organizations building mobile applications or using mobile platforms are already leveraging platforms like NodeJS instead of traditional J2EE or .NET stacks.

The Emergence of Enterprise Open Source

Open source technologies are becoming more prevalent in the enterprise. Movements like big data platforms and mobile application development stacks are vastly dominated by open source solutions. Typically, open source server stacks provide great support for platforms like NodeJS, Python or Ruby in the form of SDKs, samples, etc. Consequently, as more organizations embrace open source server platforms, they are likely to leverage technologies other than .NET or J2EE for building applications on those platforms.

A New Generation of Developers and Professional Services Agencies

As application development stacks like NodeJS and Python continue gaining momentum with developer communities, more and more developers are likely to favor those stacks over traditional .NET or J2EE platforms. It is not a surprise that many modern software development agencies are actively hiring developers with skills in prominent open source application development platforms like NodeJS, Ruby or Python. Those agencies are actively evangelizing the benefits of those platforms in enterprise IT settings and playing an important role in the adoption of these new application development platforms in the enterprise.

The Enterprise Software Startup Channel

The explosion of innovation in the enterprise software startup scene is forcing big organizations to start embracing technologies from early stage startups in order to stay competitive. Many of the most innovative enterprise software startup platforms leverage application development stacks other than .NET and J2EE. As a result, many enterprise IT organizations are starting to indirectly leverage those platforms as part of broader enterprise software solutions.

Is NodeJS the One?

Without getting into predictions, it is hard to talk about the emergence of a third application development platform without talking about potential candidates. Of the existing platforms in the market, NodeJS seems to have all the ingredients to become relevant in the enterprise.

Today, NodeJS enjoys a vibrant development community and is the platform of choice of many enterprise software startups. Additionally, NodeJS is widely supported by all enterprise cloud and mobile platforms and is being slowly adopted by some of the top professional services agencies in the world.

While getting to the level of dominance that .NET and J2EE enjoy in today’s enterprise IT environment is going to require more than the aforementioned factors, I believe NodeJS has a very strong opportunity to become a third application development platform in the enterprise.


Posted on August 19, 2015



6 Best Practices of Successful Enterprise Data Science Projects


During the last decade, data science projects in the enterprise have developed a reputation for being complex and expensive. However, the last few years have seen an explosion in new machine learning and big data infrastructure technologies that have helped lower the entry point for implementing data science solutions in the enterprise. Despite the technical evolution, enterprise data science projects remain relatively complex compared to traditional areas of investment in enterprise IT.

Similar to other groundbreaking technologies in enterprise IT, implementing successful data science solutions is a combination of strong processes, delivery methodologies and technologies. Our experience implementing dozens of successful enterprise data science and machine learning solutions has allowed us to develop a certain perspective on patterns that we think help to optimize the success of data science projects in the enterprise. The following list provides a short summary of best practices in enterprise data science projects. Some of them might seem trivial, but they can be difficult to enforce in real-world implementations.

Build For the Future: Build on Technologies You Can Innovate Upon

Data science platforms are one of the fastest-growing areas in the technology ecosystem. As a result, new platforms, machine learning algorithms, data visualization technologies, etc. are constantly emerging, bringing new value propositions to enterprise solutions. Additionally, the requirements for enterprise data science solutions are constantly changing based on new market trends.

Building on a technology stack that facilitates innovation, extensibility and scalability is essential to guarantee the success of enterprise data science projects. In that sense, when selecting a data science platform, organizations should not only evaluate its technical capabilities but also complementary factors such as developer community, open source contributions, talent availability etc.

No Model is Right: Implement Various Models for the Same Scenario

One of the most common mistakes in machine learning projects is deciding on a specific prediction or classification algorithm before implementing the solution. Many times, the optimal algorithm is not discovered until several models are tested and evaluated with real data. In that sense, it is a good practice to implement the first iteration of the solution running several machine learning algorithms concurrently and to compare the results over time.
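As an illustration, the comparison pattern above can be sketched with scikit-learn (one of the stacks covered elsewhere on this blog), using the bundled iris dataset as a stand-in for real project data:

```python
# Compare several candidate models on the same task using cross-validation,
# rather than committing to a single algorithm up front.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # stand-in for the project's real data

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "k_nearest_neighbors": KNeighborsClassifier(),
}

# Run every candidate on the same folds and keep the scores around
# so they can be compared again as new data arrives.
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best_model = max(scores, key=scores.get)
```

In a real project the score table would be recomputed over time, since the leading model can change as the data evolves.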

Continuous Data Science: Deliver Results Every Week and the First MVP in a Month

Enterprise data science projects are notorious for taking a long time and being extremely expensive. Also, it is not uncommon for stakeholders to wait months before seeing the first results of a data science solution which, more often than not, need to be improved. To mitigate some of those challenges, we always recommend structuring projects in a way that delivers weekly results to stakeholders.

In addition to delivering weekly results, we always recommend focusing on delivering a minimum viable product (MVP) within the first month of the project. Sometimes this model requires cutting a few corners on the infrastructure side in the early days, but it guarantees constant feedback from the end users, which helps to continuously improve the data science solution.

Test Test Test: Make the Models Testable

Complementing the previous point, it is very important to provide mechanisms to continuously test and validate machine learning algorithms, even when the solution is running in production. Making models testable is an often overlooked aspect of enterprise data science projects, but one that becomes critical to guaranteeing the evolution of the solution.
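A minimal sketch of a testable model, using scikit-learn and its bundled iris dataset; the 0.8 accuracy baseline here is an illustrative assumption that each project would set for itself:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def evaluate_model(random_state=0):
    """Train the candidate model and return its accuracy on held-out data."""
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=random_state
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return model.score(X_test, y_test)

def test_model_meets_baseline():
    # Fail loudly if a retrained model ever drops below the agreed baseline.
    accuracy = evaluate_model()
    assert accuracy >= 0.8, f"model below baseline: {accuracy:.2f}"

test_model_meets_baseline()
```

Because training lives behind a function, the same check can run in continuous integration and against a production model after every retrain.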

Monitor Everything: Implement Operational Monitoring in Your Data Science Solutions

Monitoring the execution of machine learning models, data inputs and outputs, model failures, etc. is essential for the production readiness of an enterprise data science project. In that sense, IT organizations should consider implementing the correct operational monitoring and instrumentation infrastructure as part of any data science project. While conceptually obvious, incorporating these capabilities in a data science solution is far from trivial, as most operational monitoring platforms are still not integrated with machine learning and data science stacks.
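A minimal sketch of that kind of instrumentation, assuming plain Python logging around any object that exposes a scikit-learn-style `predict` method; the `ConstantModel` class is a trivial stand-in for a real trained model:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("model-monitor")

def monitored_predict(model, rows):
    """Run model.predict while recording latency, volume and failures."""
    start = time.perf_counter()
    try:
        predictions = model.predict(rows)
    except Exception:
        logger.exception("prediction failed for batch of %d rows", len(rows))
        raise
    elapsed_ms = (time.perf_counter() - start) * 1000
    logger.info("predicted %d rows in %.1f ms", len(predictions), elapsed_ms)
    return predictions

# Trivial stand-in model for illustration:
class ConstantModel:
    def predict(self, rows):
        return [0 for _ in rows]

preds = monitored_predict(ConstantModel(), [[1, 2], [3, 4]])
```

In production the log records would feed whatever operational monitoring platform the organization already runs, which is exactly the integration gap the paragraph above describes.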

Start Small, Fail Fast and Iterate

Machine learning and data science solutions are new initiatives for most enterprises, and ones that require new skill sets and practices. In that sense, it is important to approach these projects in a highly iterative manner and to allocate room for initial failures. While the limitations of legacy data science technology stacks prevented organizations from applying agile and lean development practices to data science projects, this is no longer the case. Today, most modern data science and machine learning stacks provide enough capabilities to allow organizations to start delivering results extremely fast with a minimal investment.


Posted on August 5, 2015



5 on 5: Demystifying Machine Learning in the Enterprise


Machine learning is becoming one of the most important trends in next-generation enterprise data solutions. The evolution of machine learning platforms, as well as complementary technology movements such as big data, has lowered the entry point for organizations embracing machine learning models to drive more effective business intelligence.

Despite the remarkable technological advances of the last few years, enterprise machine learning remains surrounded by strong myths. We regularly encounter those myths during our work with large enterprises around the world implementing data science and machine learning solutions. This brief article is our attempt to demystify some of the most common misconceptions about enterprise machine learning; it also takes a look at the technology stacks that are helping to democratize machine learning in the enterprise.

5 Myths of Enterprise Machine Learning

Implementing Machine Learning is Expensive

If you were implementing a machine learning solution a few years ago, you were stuck with commercial packages ranging from the high six figures to the low seven figures, which also required a lot of professional services to implement. Consequently, there is a myth that machine learning implementations need to be unreasonably expensive. The last few years have seen the emergence of a new group of platforms that have helped to commoditize the price of machine learning platforms while also lowering the entry point for developers and architects looking to implement these types of solutions. Today, it is possible to get up and running with a machine learning solution in a few weeks without spending anything on software licenses.

It Is Impossible to Build In-House Expertise in Machine Learning

A side effect of the previous myth. Machine learning has traditionally been seen as a professional-services-intensive endeavor. While it is true that an organization can benefit from starting its machine learning journey accompanied by the right experts, it is also true that today’s machine learning platforms provide a low entry point for developers and architects looking to work on next-generation data analytics solutions. In that sense, it is entirely possible for an enterprise to start building machine learning knowledge in house while leveraging an expert firm to help it take the initial steps of that journey.

We Need Data Scientists

Machine learning is typically seen as a discipline practiced by introverted data scientists or statisticians who wear thick glasses and are the only people capable of reasoning through machine learning data and algorithms. This myth couldn’t be further from the truth. Most modern machine learning platforms include dozens of well-understood algorithms that can be enabled with a minimal level of effort.

Machine Learning is About Predictions

People mistakenly equate machine learning with data predictions. While predictive analytics is certainly a popular discipline in the machine learning space, it is far from covering the entire value of machine learning solutions. Classification, clustering and regression algorithms are incredibly useful for helping enterprises extract value from data assets, and they are typically simpler to implement than predictive models.
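For instance, a clustering algorithm such as k-means, sketched here with scikit-learn on toy two-dimensional data, extracts structure from existing data without predicting anything:

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy data: two obvious groups of points standing in for, say, customer segments.
points = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
                   [8.0, 8.0], [8.2, 7.9], [7.8, 8.1]])

# Group the points into two clusters; no labeled outcomes or forecasts involved.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
labels = kmeans.labels_
```

The output is simply a cluster assignment per row, which is useful for segmentation and exploration rather than forecasting.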

We Need a Big Data Infrastructure to Implement Machine Learning

The recent evolution of machine learning platforms was, arguably, catalyzed by the explosion of big data technologies. Consequently, many organizations feel they are not ready to take advantage of machine learning until they have implemented a proper big data infrastructure. While leveraging a big data infrastructure brings certain advantages, modern machine learning platforms work effectively against traditional enterprise relational data stores and data warehouses.
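
To make that concrete, here is a minimal sketch, assuming pandas and scikit-learn are available: an ordinary SQL query feeds a model directly, with SQLite standing in for an enterprise relational store and the table contents invented for the example.

```python
# Feed a model straight from a relational store -- no big data stack needed.
import sqlite3

import pandas as pd
from sklearn.linear_model import LogisticRegression

# SQLite stands in for an enterprise RDBMS; the rows are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (amount REAL, items INTEGER, churned INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(120.0, 3, 0), (15.0, 1, 1), (300.0, 7, 0),
     (22.0, 1, 1), (180.0, 4, 0), (9.0, 1, 1)],
)

# A plain SQL query is the entire "data pipeline".
df = pd.read_sql("SELECT amount, items, churned FROM orders", conn)
model = LogisticRegression().fit(df[["amount", "items"]], df["churned"])

pred = model.predict(pd.DataFrame([[200.0, 5]], columns=["amount", "items"]))
```

The same pattern applies to any database pandas can query; Hadoop or Spark become relevant only when data volumes outgrow a single machine.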

5 Technologies that Simplify Enterprise Machine Learning

Azure Machine Learning

Azure's native cloud-based predictive analytics service makes it possible to quickly create and deploy predictive models as analytics solutions. Azure ML provides a visual environment for creating ML models as well as an API model for accessing the models programmatically. Azure ML also allows developers to use languages like R or Python in their ML models.

AWS Machine Learning

Similar to Azure ML, the AWS Machine Learning service provides a series of tools and algorithms that allow developers to start building and using machine learning solutions without a heavy investment in infrastructure.

Spark MLlib

The incredibly popular Spark platform includes a very simple model for executing machine learning algorithms at MPP scale. Interestingly enough, Spark is now fully supported in both Azure and AWS, which makes it an interesting complement to the native machine learning engines included in those platforms.

Scikit-learn

One of the most powerful ML frameworks in the world, Scikit-learn provides a series of Python-based libraries that include over 50 ML algorithms and has a very vibrant community behind it.

Mahout (

Even though Mahout has seen its popularity eclipsed by the rise of new machine learning platforms, it remains incredibly relevant when it comes to evaluating machine learning solutions in the enterprise. Mahout provides a large gallery of machine learning algorithms optimized to work on Hadoop infrastructures.

Other Platforms

The aforementioned technologies form a core group of platforms that are actively driving machine learning adoption in the enterprise. Like any other fast-growing technology space, we are seeing an increasing number of platforms bringing new and innovative capabilities to enterprise machine learning solutions. Consequently, the previous list is likely to grow in the next few months, but it is a good place to start today.


Posted by on July 29, 2015 in Uncategorized



Microsoft and IBM Earnings: The Battle of New vs. Legacy Businesses


IBM and Microsoft reported earnings last week, and the reports clearly highlighted the current stage of each company's transformation. Arguably the two most important software companies of the last 40 years, Microsoft and IBM used to exhibit similar patterns in their earnings reports a few years ago. However, those days are long gone.

Microsoft and IBM are companies in the middle of aggressive transformations to adapt their business models to a world dominated by technology revolutions in areas such as mobile, cloud, data science, and augmented reality. The earnings reports clearly illustrate the effectiveness of each company's transformation process, contrasting the growth in new strategic areas with the decline of traditional businesses.

Microsoft: Growth in the Right Areas

Microsoft's earnings report can be summarized in a single sentence: "growth in the right areas". The Redmond giant reported remarkable growth in areas such as cloud and devices, which compensates for the decline in traditional areas such as Windows and Office.

[Chart: Microsoft (MSFT) stock]

Cloud revenue was the positive highlight of an earnings report otherwise dominated by the $2.1B loss Microsoft posted on the Nokia acquisition. Commercial cloud ARR grew 88% (or 96% in constant currency) to $8 billion, continuing the strong sequential performance from $5.5 billion two quarters ago and $6.3 billion last quarter. These numbers prove that the investments in the unique combination of Office 365, Azure, and CRM Online are producing strong results.

Devices and search were other areas of improvement. Surface revenue grew 117%, which contrasts with the poor iPad sales numbers disclosed by Apple. Bing also exhibited strong growth, with a market share that now reaches 22%.

In summary, Microsoft's earnings show strong performance in strategic areas such as cloud and devices, along with a strong presence in the enterprise. This news has to be encouraging in anticipation of the Windows 10 launch next quarter.

IBM: Too Much Legacy

IBM's earnings report marked the 13th consecutive quarter of revenue declines. Even though the company repeated its message about the current cloud transformation process, it couldn't avoid missing revenue estimates. Revenue was down 13%, affected by the strong dollar, and net income was down 15%.

[Chart: IBM stock]

Similar to Microsoft, IBM reported strong growth in strategic areas such as cloud and analytics. Growth in those two areas came in at more than 20%, while software revenues overall were down 10% for the quarter at $5.8 billion (down 3% adjusted for foreign currencies), and middleware was down 7% but essentially flat when adjusted for currency.

In summary, IBM is also exhibiting strong growth in strategic initiatives, but not enough to offset the decline of traditional businesses such as hardware and professional services. From that perspective, it seems that IBM has a long road ahead in its transformation process.


Posted by on July 27, 2015 in Uncategorized



