How Red Hat is making money on the public cloud with a hybrid approach


If a company wants to run Linux in the public cloud but isn’t yet ready to go “fully native” there, it tends to run RHEL in both environments, Whitehurst explained. Red Hat CFO Frank Calderoni then underlined this point: “The large customers that are…using the public cloud with us are also growing substantially their business on the private side, too.”


Red Hat Adds QuickStart Cloud Installer to Red Hat Cloud Suite to Help Speed Private Cloud Deployments


Red Hat introduces the QuickStart Cloud Installer, the newest feature of Red Hat Cloud Suite, designed to orchestrate and install all Red Hat Cloud Suite components from a single interface.

A single, intuitive web-based interface enables users to provision a fully functional cloud solution using any combination of the Red Hat Cloud Suite components, including:

  • Red Hat CloudForms 4.1;
  • Red Hat OpenShift Container Platform 3.2 (OpenShift Enterprise 3.2);
  • Red Hat OpenStack Platform 8;
  • Red Hat Satellite 6.2; and
  • Red Hat Virtualization 4.

Source: Red Hat Adds QuickStart Cloud Installer to Red Hat Cloud Suite to Help Speed Private Cloud Deployments

6 Things Amazon Can Teach us About How to Do Cloud Computing Correctly


Good introductory summary of why Amazon AWS is profitable:

  1. AWS adds new server capacity daily
  2. AWS uses redundancy to guard against down-time
  3. AWS builds its own custom servers
  4. AWS designs its own custom hardware and software
  5. Amazon seeks out advantageous locations for its hubs
  6. Amazon guards against over-growth

Source: 6 Things Amazon Can Teach us About How to Do Cloud Computing Correctly, by Brigg Patten
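Item 2 (redundancy) has a simple quantitative basis: if servers fail independently, a request is only lost when every replica is down at once, so uptime compounds quickly. A minimal sketch (the 99% figure is an illustrative assumption, not an AWS number):

```python
def availability(single_uptime, replicas):
    """Probability that at least one of `replicas` independent
    servers is up, given each is up `single_uptime` of the time."""
    failure = 1.0 - single_uptime      # chance a single server is down
    return 1.0 - failure ** replicas   # chance they are not *all* down

# Two redundant 99%-uptime servers already give roughly "four nines":
# availability(0.99, 2) ≈ 0.9999
```

Independence is the big assumption here; correlated failures (same rack, same power feed) are part of why Amazon also spreads capacity across carefully chosen locations (item 5).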

Sitting, Waiting, Wishing


After reading an article about Amazon AWS Route 53 taking on GoDaddy, and after running into a few issues with my WordPress website on Microsoft Azure, I decided to give Amazon AWS a try.

I found creating a Linux VM very easy.  No instructions needed; the UI was very intuitive. I expected creating the WordPress site to be more difficult on AWS.  After all, I had run into a number of challenges with the “automated” WordPress installs on Microsoft Azure (provided by ClearDB and Bitnami).  In short, I spent many hours with tech support from both companies.  Some issues were major, others more trivial, yet still important. For example, Bitnami puts an annoying “dog ear” with their logo in the bottom right corner of your home page.  But I digress…

I was pleased to find that creating a WordPress site on AWS (manually) was straightforward.  The instructions were very well written:


Tutorial: Installing a LAMP Web Server on Amazon Linux

Tutorial: Hosting a WordPress Blog with Amazon Linux

Heading Down Route 53

After getting a default WordPress site up and running (in about 30 minutes), the next step was to transfer my domain name.  I read the AWS Route 53 domain transfer instructions and had some questions.

Transferring a Domain to Amazon Route 53

However, I found a nice, brief (~1 minute) video on YouTube that answered all my questions.  Kudos to Sibercat X.

How to Transfer your Domain to AWS Route 53 From Godaddy.

The Transfer Process (and the missing step)

In a nutshell, you start your domain transfer to Route 53 by logging on to GoDaddy and clicking Manage your domain.  You click Unlock and get an email with your authorization code:

Godaddy-email authentication email

The next step is to plug your authorization code into the AWS Route 53 console and away you go.

Sitting, Waiting, Wishing

However, 12 hours later, I was still waiting.  A check of my AWS Route 53 console revealed the following (step 7 of 14):

AWS Route 53 – Step 7

I decided to do a quick Google search to figure out what was taking so long.  After all, this can’t be a manual process; it has to be automated end-to-end.  I found the following explanation:

Waiting for the current registrar to complete the transfer (step 7 of 14)

Your current registrar is confirming that your domain meets the requirements for being transferred. Requirements vary among TLDs, but the following requirements are typical:

  • You must have registered the domain with the current registrar at least 60 days ago.
  • If the registration for a domain name expired and had to be restored, it must have been restored at least 60 days ago.
  • You must have transferred registration for the domain to the current registrar at least 60 days ago.
  • The domain cannot have any of the following domain name status codes:
    • clientTransferProhibited
    • pendingDelete
    • pendingTransfer
    • redemptionPeriod

The first three bullets are straightforward; I know they don’t apply to me.  However, I found the sub-bullet status codes cryptic.  Instead of searching Google again, I decided to log back in to GoDaddy and look around.  I clicked the Manage my domain button and voila, I found the problem.  There’s an undocumented step in the transfer instructions: GoDaddy makes you go back and accept the transfer a second time (even after accepting the initial transfer request link from their email). Here’s what you’ll see in GoDaddy’s console:

godaddy-accept – Accept

All you need to do is click Accept and away you go; you’re on to Step 8.
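The eligibility checklist quoted from the AWS documentation above is mechanical enough to sanity-check in code before starting a transfer. A minimal sketch (the function name and message strings are my own, not part of any AWS SDK):

```python
# EPP status codes that typically block a registrar transfer.
BLOCKING_STATUS_CODES = {
    "clientTransferProhibited",  # registrar lock still enabled
    "pendingDelete",
    "pendingTransfer",
    "redemptionPeriod",
}

def transfer_blockers(days_since_registration,
                      days_since_restore=None,
                      days_since_last_transfer=None,
                      status_codes=frozenset()):
    """Return the typical reasons a registry would refuse a transfer;
    an empty list means none of the usual requirements are violated."""
    blockers = []
    if days_since_registration < 60:
        blockers.append("registered less than 60 days ago")
    if days_since_restore is not None and days_since_restore < 60:
        blockers.append("restored less than 60 days ago")
    if days_since_last_transfer is not None and days_since_last_transfer < 60:
        blockers.append("transferred less than 60 days ago")
    blockers.extend(sorted(set(status_codes) & BLOCKING_STATUS_CODES))
    return blockers
```

Note that my stuck transfer would have passed every one of these checks; the real blocker was the second, undocumented Accept in GoDaddy’s console, which no status code surfaces.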


Liner notes.  The title of this article came from a song by Jack Johnson.





AWS Enters The Field Dominated By GoDaddy

, Inc. (NASDAQ:AMZN) is expanding into the digital certification space through its cloud arm, Amazon Web Services (AWS). The move by Amazon challenges the positions of GoDaddy Inc (NYSE:GDDY) and Symantec Corporation (NASDAQ:SYMC), which make money selling digital certificates.

“For developers using AWS, the digital certificates would be a great way to boost search engine ranking.”

Amazon Route 53 is a highly available and scalable cloud Domain Name System (DNS) web service. It is designed to give developers and businesses an extremely reliable and cost-effective way to route end users to Internet applications by translating names like www.example.com into the numeric IP addresses like that computers use to connect to each other.
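Conceptually, the name-to-address translation described above is a lookup against a hosted zone’s record set. A toy sketch with a hypothetical zone (real DNS adds recursion, caching, TTLs, and many record types beyond A):

```python
# Hypothetical A records for an illustrative hosted zone
# (addresses drawn from the documentation-reserved 192.0.2.0/24 range).
ZONE_A_RECORDS = {
    "example.com": "192.0.2.44",
    "www.example.com": "192.0.2.44",
    "blog.example.com": "192.0.2.80",
}

def resolve(name):
    """Answer an A-record query the way an authoritative server
    conceptually does: domain name in, IPv4 address out."""
    key = name.rstrip(".").lower()   # DNS names are case-insensitive
    if key not in ZONE_A_RECORDS:
        raise LookupError("NXDOMAIN: " + name)
    return ZONE_A_RECORDS[key]
```

Route 53’s value-add over this toy lookup is serving the answers from a globally distributed, highly available fleet, plus routing policies (latency, weighted, failover) that pick *which* address to return.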

Docker: 70,000% increase in job postings


Amazing stats posted by Docker:


More than a 70,000% increase in job postings around the world. It’s safe to say that companies all over are investing heavily in Docker to transform their application lifecycles. That means it’s a great opportunity for developers and ops pros who already have hands on experience building, shipping and running Dockerized distributed applications. Docker is also an essential tool for any DevOps team and their CI/CD pipeline and will be an incredible asset as you look for jobs in the fast moving DevOps job market as well.

Full story from the Docker blog.

PC Magazine Rates Microsoft Azure…


Good Azure review from Steven J. Vaughan-Nichols:

Microsoft Azure is one of the easiest clouds to get up and running. Once in place, it’s also among the easiest to manage.


  • Pros: Windows compatibility. Linux and container compatibility (yes, you read that right). Good front-end management interface.

  • Cons: Average performance. High cost.

  • Bottom line: Those who have built their businesses around Windows will want to use the Infrastructure-as-a-Service (IaaS) solution Microsoft Azure. Those who rely on Linux should still take a close look at Microsoft Azure, as Microsoft does a decent job there as well.

Full Story: PC Magazine

Microsoft’s comprehensive Availability on Demand (AoD) solution ASR goes live


Migration capabilities are available for a range of workloads including Exchange, SharePoint, and SQL Server:

  • Move applications with near-zero downtime: Move a single application or an entire datacenter to the cloud with minimal impact to production users
  • No-impact migration testing: Replicate production workloads into Azure, execute tests to ensure readiness, then onboard users into the cloud for no-impact migration
  • Replicate data once, for migration or recovery: With ASR and AoD, you replicate application data only once, and can then use that data to perform disaster recovery, migrate workloads, or create DevTest environments in Azure

Azure Site Recovery GA: Move VMware, AWS, Hyper-V and Physical Servers to Azure | Microsoft Azure Blog.

The new lock-in – the cloud, in short, is sticky

Great summary from Matt Asay of InfoWorld about cloud strategy from an ISV's perspective.

“When Microsoft said it wouldn’t rule out putting Windows in the open source domain, people scoffed — but it could be a shrewd business move for the cloud era”…

“It turns out switching costs in the cloud are equal to or greater than what they were in the on-premise era.”…

“Once I build my app on AWS or, more poignantly, dump my corporate data into Salesforce, the likelihood that I’m going to be able to easily switch is less than zero.” …

What if Microsoft really did open-source Windows? by Matt Asay