Where’s cloud computing headed this year? The experts lay down their cards

Edge computing, vertical clouds, a profusion of multicloud options and the end of on-premises backup.

Those are a few of the prognostications being put forward by cloud computing pundits and information technology providers amid the flood of predictions that jam our inbox at this time each year. Here are a few of them, along with my own takes. For more, check out theCUBE on Cloud, SiliconANGLE’s first digital editorial event, on Jan. 21.

Cloud at the edge

As 5G networks near the tipping point this year, many experts expect to see cloud providers racing to support their customers’ ambitions to move more processing power to the point of data ingestion.

“Edge computing will complement existing cloud infrastructure by enabling real-time data processing where the work takes place,” wrote Keith Higgins, vice president of digital transformation at industrial giant Rockwell Automation Inc. “Organizations will worry less about logistical IT considerations and instead focus on rethinking what’s possible in a smart machine.”

In the long term, this prediction is a no-brainer: devices such as vibration sensors and traffic lights are the last bastion of disconnected data. But the journey to the edge is more likely to be a plod than a race. There are plenty of problems still to untangle, among them security, device reliability, connection quality and the re-engineering of existing applications for a vastly more distributed and complex compute plane.

Still, that isn’t stopping the big platform providers from falling all over themselves to stake out positions in customers’ data centers, colocation facilities and inside telecommunications sites. They will be ready when their customers are.
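To make the pattern Higgins describes a little more concrete, here is a minimal Python sketch of filtering at the point of ingestion: raw readings are summarized on the device, and only aggregates and anomalies travel to the cloud. The sensor reader and upload function are hypothetical stand-ins, not any vendor’s actual API.

```python
import random
import statistics
import time

def read_vibration_sample() -> float:
    """Stand-in for a real sensor driver (hypothetical)."""
    return random.gauss(0.5, 0.1)

def ship_to_cloud(record: dict) -> None:
    """Placeholder for an upload to a cloud ingestion endpoint."""
    print("uploading:", record)

def run_edge_loop(window_size: int = 100, alert_peak: float = 0.9) -> None:
    """Process raw readings locally; send the cloud only window
    summaries, plus an immediate alert on an anomalous reading."""
    window: list[float] = []
    while True:
        sample = read_vibration_sample()
        window.append(sample)
        if sample > alert_peak:
            ship_to_cloud({"alert": True, "value": sample, "ts": time.time()})
        if len(window) >= window_size:
            ship_to_cloud({
                "mean": statistics.mean(window),
                "peak": max(window),
                "ts": time.time(),
            })
            window.clear()

if __name__ == "__main__":
    run_edge_loop()
```

The point is the division of labor: the tight loop runs where the data is born, and the cloud sees only what it needs. The hard problems listed above live around this loop, not inside it.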

Vertical clouds

IBM Corp. made the biggest splash in this area with the late-2019 launch of its financial services cloud. The notion of vertical clouds isn’t new (Virtustream Inc. has one for healthcare and Oracle Corp. one for manufacturers), but the resources IBM is throwing at the effort are notable, as is its collaborative approach to building features in concert with customers.

Cynics may argue that the move is a Hail Mary by a company that has failed to crack the top echelons of public cloud providers. But it’s a sensible play given that financial institutions have been understandably reluctant to entrust their workloads to computing environments they don’t fully control. The bigger question is how many industries really need a cloud built just for them.

The end of on-premises backup

I once encountered a smallish manufacturer whose IT chief admitted sheepishly that its backup server was located on the floor underneath the production server it was there to back up. Most organizations don’t do backup very well, and even those that do admit that data volumes are growing so fast that they can’t keep pace.

Cloud backup not only addresses the need to store data in another location but can be configured to work continuously. Research and Markets Ltd. expects the cloud backup market to quadruple, to more than $8 billion by 2027, growing at an annual rate of 23%.
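To make the “continuously” part concrete, here is a minimal sketch using the boto3 AWS SDK; the bucket name and source directory are hypothetical. Instead of a nightly window, changed files are copied offsite as soon as a polling pass notices them. Real products use change journals or block-level replication rather than a polling loop, but the shape is the same.

```python
import os
import time

import boto3  # AWS SDK; any S3-compatible endpoint also works

BUCKET = "example-backup-bucket"  # hypothetical
SOURCE_DIR = "/var/data"          # hypothetical

def continuous_backup(poll_seconds: int = 30) -> None:
    """Copy files offsite whenever their modification time changes,
    rather than once per backup window."""
    s3 = boto3.client("s3")
    last_seen: dict[str, float] = {}
    while True:
        for root, _dirs, files in os.walk(SOURCE_DIR):
            for name in files:
                path = os.path.join(root, name)
                mtime = os.path.getmtime(path)
                if last_seen.get(path) != mtime:
                    key = os.path.relpath(path, SOURCE_DIR)
                    s3.upload_file(path, BUCKET, key)
                    last_seen[path] = mtime
        time.sleep(poll_seconds)
```

Bucket versioning on the cloud side (not shown) would retain prior copies of each object, the job tape rotation used to do.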

“Companies will increasingly use the cloud to enable geographically separated offsite replication or failover for disaster protection and extend failover clustering not only across cloud availability zones but across different cloud vendors,” predicted Cassius Rhue, vice president of customer experience at Sios Technology Corp. Perhaps, but I think most will just be happy for now to get rid of tape drives.

Rethinking multicloud

The perception that enterprises want to shift computing workloads effortlessly between multiple cloud providers has been called the most overrated assumption in the industry. Yes, the desire to use multiple clouds is real, but not the desire to shuffle workloads between them. Rather, enterprises will want to match the best platform to each workload.

David Linthicum makes a compelling argument against multicloud in InfoWorld: “You’ll have to take the least common denominator approach to building applications and connecting data storage. This means that you won’t be optimized for any public cloud platform, or that you won’t run well anywhere.”
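Here is a toy sketch of what that least-common-denominator interface looks like in practice; all the names are hypothetical. The abstraction admits only the operations every provider shares, and everything provider-specific falls outside it.

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Least-common-denominator interface: only operations that
    every provider supports identically make the cut."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Toy implementation standing in for an S3, GCS or Azure adapter."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]

# Whatever the shared interface cannot express (lifecycle rules,
# storage tiers, region placement, provider-specific access controls)
# is exactly what Linthicum says you give up.
```

Every adapter can implement put and get; none of them can surface what makes its platform worth choosing.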

Patrick Hubbard, head geek at SolarWinds Worldwide LLC, makes the point that “businesses pursuing a multicloud strategy must either wield a high-performing, DevOps-focused team of IT professionals or have the budget to outsource multicloud engineering and monitoring to someone else. They must also have a well-researched case for why they believe multicloud will meet their business needs in the first place.” I suspect one or the other of those factors will torpedo the multicloud idea at many companies.

Object storage rules

With unstructured data expected to make up 80% of all information generated in 2025, the case for object storage has never been more compelling. Object storage is inexpensive, massively scalable, secure and easy to access. It’s the primary reason Data Bridge Market Research Pvt. Ltd. expects the cloud storage market to grow 24% annually to more than $270 billion by 2027.

Paul Speciale, chief product officer at storage software provider Scality Inc., predicts that “object storage will become a de facto storage model for data lakes.” Amazon Web Services Inc.’s S3 storage application programming interface has become “the standard API for object storage,” he said. “Large semistructured and unstructured data sets are a natural fit.” Makes sense.
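That standardization is easy to see in code. The sketch below uses the boto3 AWS SDK; the endpoint, credentials and bucket are placeholders. The same client code can target AWS itself or any S3-compatible store simply by changing endpoint_url.

```python
import boto3

# Pointing endpoint_url at an S3-compatible system is the only change
# needed to move between providers; all values here are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example.com",
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

s3.put_object(Bucket="data-lake", Key="events/2021/01/01.json",
              Body=b'{"event": "page_view"}')
obj = s3.get_object(Bucket="data-lake", Key="events/2021/01/01.json")
print(obj["Body"].read())
```

Swap the endpoint for an on-premises S3-compatible system and the code doesn’t change, which is roughly what “de facto standard” means in practice.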
