Cloud Computing

Hoisting big data to the cloud

By Eric Knorr | Cloud Computing

The consensus about big data has been that it’s too expensive to move. But what if you could use remote backup copies already sitting in the cloud for analytics, app dev, and other purposes?

At first glance, big data analytics seems to be the perfect sort of workload for the public cloud.

Particularly for batch jobs, the hyperscalable infrastructure of the cloud is ideal: pay for the powerful server clusters you need while crunching data, and stop paying when you’re done. No wonder that after EC2 and S3, one of the first major services AWS added was its Hadoop-based Elastic MapReduce service, followed by Redshift data warehousing a couple of years later.
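
To make that pay-only-while-crunching pattern concrete, here’s a minimal sketch of launching a transient Elastic MapReduce cluster with the boto3 library: it runs one batch step and shuts itself down when the step completes. The bucket name, job script, release label, and instance sizing below are illustrative assumptions, not recommendations.

# Sketch of an ephemeral EMR batch job; assumes valid AWS credentials are configured.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="nightly-batch-analytics",
    ReleaseLabel="emr-5.36.0",                 # placeholder release label
    Applications=[{"Name": "Hadoop"}, {"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 4,
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate when the last step finishes
    },
    Steps=[{
        "Name": "crunch-clickstream",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://example-bucket/jobs/crunch.py"],  # hypothetical job script
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Started transient cluster:", response["JobFlowId"])

Once the step completes, the cluster tears itself down and the meter stops running; the input and output data live on in S3.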


But there’s a problem: Big data doesn’t like to be moved. The bandwidth to do so costs money, and as analytics veer ever closer to real time, the barrier to keeping cloud data and on-premises data in sync grows higher.

Here’s where the idea of copy data management and virtualization comes in. More companies are looking to the public cloud for backup and DR. So instead of that data just sitting in the cloud waiting for a disaster so it can be restored on premises, why not use virtual copies of it for big data analytics or dev and test in the cloud?

Cloud backup and DR has primarily been a small-business proposition, while large enterprises that want to maintain high availability have created dedicated backup datacenter sites where data is replicated frequently at high cost. In neither case has the data been used for anything except restoration in the event of calamity.

Although still relatively small, Actifio is the best-known company pitching the idea of using a single, continuously updated copy of enterprise data and creating virtual copies for DR, backup, and analytics, as well as for development and test in a cloud environment. Founded in 2009, Actifio secured a $100 million round of funding in March 2014, led by Tiger Global Management. Actifio has partnered with IBM, Sungard, and others to provide a platform where a single “golden copy” of the data can be virtualized and leveraged in multiple ways.

Virtual data management addresses a key enterprise pain point. Not only is the volume of enterprise data growing at a ridiculously rapid pace, but data warehousing, Hadoop analytics, and accelerated application development are together demanding copies of that data and putting an ever greater burden on storage infrastructure. If you can have a single copy created for backup/DR purposes, and create virtual rather than physical copies for analysis and development and test, you can reduce the spend on storage infrastructure — whether it resides on premises or in the cloud.
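
To see why virtual copies put so much less strain on storage than physical ones, here’s a rough Python sketch of the copy-on-write idea behind copy data virtualization. It’s a toy model of the general technique, not Actifio’s implementation: each virtual copy stores only the blocks it changes and reads everything else from the golden copy.

# Toy copy-on-write model: one physical "golden copy," many cheap virtual copies.
class GoldenCopy:
    """The single physical copy of the data, keyed by block ID."""
    def __init__(self, blocks):
        self.blocks = dict(blocks)

class VirtualCopy:
    """A copy-on-write view that stores only the blocks it overwrites."""
    def __init__(self, golden):
        self.golden = golden
        self.delta = {}                          # changed blocks only

    def read(self, block_id):
        return self.delta.get(block_id, self.golden.blocks.get(block_id))

    def write(self, block_id, data):
        self.delta[block_id] = data              # the golden copy is never modified

# A dev/test environment sees what looks like a full copy of production data,
# but consumes extra storage only for the blocks it actually changes.
golden = GoldenCopy({0: "orders", 1: "customers", 2: "invoices"})
dev = VirtualCopy(golden)
dev.write(1, "customers-with-test-accounts")
assert dev.read(1) == "customers-with-test-accounts"
assert dev.read(0) == "orders"                   # served straight from the golden copy

Ten such virtual copies of a 100TB golden copy cost little more than the 100TB plus their deltas, rather than a petabyte of duplicated blocks.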

It seems only a matter of time before AWS, Google, and Microsoft get into the copy data management game as well. Yes, particularly with data subject to regulation, there will be governance issues to worry about. But copy data management in the cloud has tremendous potential, because big data analytics lends itself to public cloud infrastructure and because dev and test is already one of the top uses of the public cloud.

At the same time, although it’s early days, streaming analytics and continuous capture of data from the Internet of things are beginning to take shape. And the consensus is that the cloud provides the best platform for such a widely distributed architecture.

An interesting question in all this: If you’ve already copied your data into the cloud, at what point do you no longer feel the need to keep primary storage on premises? To that degree, copy data management is yet another milestone on the enterprise’s long road to the public cloud.

This article, “Hoisting big data to the cloud,” originally appeared at InfoWorld.com. Read more of Eric Knorr’s Cloud Computing blog and track the latest developments in cloud computing at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.

Cisco boosts cloud software, lines up ISVs to write Internet of Everything services

By | Cloud Computing

Cisco added security, management, and support for more hypervisors to its Intercloud Fabric, and signed 35 software developers to create services

Cisco this week enhanced its cloud software and lined up a roster of ISVs to create services for the company’s Internet of Everything initiative.

Cisco added security, management, and support for more hypervisors to its Intercloud Fabric software, an application that connects private, public, and hybrid clouds for workload mobility. Cisco also enlisted 35 software developers, including Citrix, F5, Cloudera, Hortonworks, and Chef, to build services for the Intercloud and offer them through an Intercloud Marketplace.


Areas ISVs will target include development platforms for production applications, containers and community-based open source programs; big data and analytics; and IoE cloud services, such as network control, performance, security, data virtualization, energy management, and business services like collaboration and consistent portals from Cisco’s Services Exchange Platform.


Cisco has invested upwards of $2 billion in the Intercloud, an underpinning of its Internet of Everything connected-device strategy. Cisco believes 50 billion devices will be connected by 2020, creating an Internet of human-to-human, human-to-machine, and machine-to-machine interaction.

Cisco says it has 100 customers and 30 partners for its Intercloud Fabric software. Seven partners (Cirrity, iLand, Peak 10, Presidio, QTS, Quest, and Sungard Availability Services) announced new hybrid cloud services built on Intercloud Fabric this week at the Cisco Live conference. And Cisco says customers such as Macmillan Publishing and The Salvation Army are using the software to instill a single operational model across production, development and test, and quality assurance environments.

The latest release of Intercloud Fabric includes security enhancements such as Cisco’s Virtual Security Gateway (VSG) zone-based firewall. VSG is designed to secure traffic between virtual machines without redirecting that traffic to an edge firewall for lookup. Within Intercloud Fabric, that means customers who use VSG with Cisco’s Nexus 1000v virtual switch in the data center can extend the same firewall policies to public clouds such as Microsoft Azure.

Management enhancements to Intercloud Fabric include extension of VM onboarding to Amazon’s virtual private cloud. This allows businesses to extend Intercloud Fabric management to target VMs already in the Amazon public cloud.


Additional hypervisor support now extends to OpenStack KVM and Microsoft Hyper-V. This is in addition to existing support for VMware vSphere.

In addition to Citrix, F5, Cloudera, Chef and Hortonworks, ISVs writing to the Intercloud include ActiveState, Apprenda, Basho, Cliqr, Cloud Enabled, Cloudberry Lab, Cloudify, Cloudlink, Couchbase, ctera, Datadog, Davra Networks, desktopsites Inc, Druva, Egnyte, Elasticbox, Informatica, MapR, MongoDB, Moonwalk, Nirmata, Panzura, Pega, Platfora, ScaleArc, SkyTree, Stoamigo, Talisen and Zenoss.

This story, “Cisco boosts cloud software, lines up ISVs to write Internet of Everything services” was originally published by Network World.

IBM puts software and cloud at the center of storage

By Stephen Lawson | Cloud Computing

The IBM Spectrum Storage strategy embraces multiple vendors and clouds

The future of storage may not be in storage itself, but in the intelligence to manage it.

Major storage vendors and startups alike are now pushing software-defined systems spanning anything from a set of arrays to a whole enterprise. On Tuesday, IBM placed a big bet on this trend, announcing the first product in a portfolio called IBM Spectrum Storage and saying it will invest $1 billion in storage software over the next five years.

The strategy will see IBM offer its traditional storage systems in software form, so customers can choose to buy them as an appliance, as software, or as a service. The first Spectrum Storage product out of the gate is IBM Spectrum Accelerate, software that’s based on the company’s own XIV high-end storage appliance.

IBM envisions Spectrum Storage as a layer of software on top of arrays and other systems, including platforms from third-party vendors. It will span in-house data centers and cloud resources including IBM’s SoftLayer cloud service, moving bits around all that infrastructure to the best location for performance and cost, the company says.

Spectrum Accelerate, like the XIV platform on which it’s based, is designed for disk-based storage but can take advantage of flash as high-speed cache. Users can install the software on any Intel-based storage platform, giving systems they already bought the management intelligence and interface of XIV. The software also can run on IBM Power-based systems.

Among other things, Spectrum Accelerate lets enterprises pool their storage resources and add capacity in minutes, according to IBM. Pooling can cut down on unused capacity trapped in silos, saving space and hardware investments. Administrators can run Accelerate from a graphical user interface that runs in browsers on desktops and on iOS and Android mobile devices. The management software can also be integrated with IBM Spectrum Control. Spectrum Accelerate is scheduled to ship next month.

IBM Spectrum Storage is also heading for the clouds. With cloud gateway software that’s coming out later this year, users will be able to migrate data to SoftLayer and other cloud services as tiers within their overall storage environment, said Jamie Thomas, general manager for storage and software-defined systems at IBM. This should help organizations deal with geographic and regulatory requirements as well as the changing needs of business.

In addition, users will be able to create a “cloud of clouds” in which one cloud can serve as a bulwark against possible service outages and data loss on another. The gateway will work first with SoftLayer and third-party cloud storage services based on IBM technology, but as customers demand it, IBM will be able to bring other clouds into that fold, Thomas said.

IBM is smart to point its storage strategy toward software, because hardware is no longer what distinguishes storage platforms, IDC analyst Ashish Nadkarni said. Though it’s made moves in that direction before, the new plan and a reorganization show the company really believes it now, he said. Its very visible commitment to the concept through IBM Spectrum Storage may push another big storage player, EMC, to place a bigger bet on software-defined storage, too, Nadkarni said.

This article, “IBM puts software and cloud at the center of storage,” originally appeared at InfoWorld.com. Article written by Stephen Lawson. For the latest business technology news, follow InfoWorld.com on Twitter.

Cloud Services

Don’t Be Scared, It’s Just The Cloud

By David Linthicum | Cloud Computing

There is no question that the use of cloud-based resources affects IT organizations. But how much should your IT organization change to best leverage cloud computing?

I hear that question a lot, and it’s often grounded not so much in process concerns but fear of job loss or devaluation of individuals’ current skills or roles. Such fears are most acute among those who have resisted the cloud for years; they see the writing on the wall, and panic sets in.


The reality is that IT orgs have always changed around the use of technology. This need to adapt is hardly unique to cloud usage, so I’m always taken aback when such change comes as a surprise.

But there is a big difference in how the cloud affects IT compared to previous technology changes: The use of public cloud resources means a shift to resources that the IT org does not control. That change is more profound than individual jobs changing or disappearing — it’s giving up ownership of the actual technology systems, yet still being responsible for them from a business viewpoint.

Despite the control concerns, the cloud’s allure is too strong to resist. Don’t forget the positive changes it brings to IT. Provisioning, testing, and deployment are easier, for example. Databases can be stood up in a day, rather than the weeks or months of older methods. Thousands of server instances can be provisioned in seconds, and any amount of storage is just a few clicks away.
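
To give a sense of how little ceremony that provisioning involves, here’s a minimal boto3 sketch. The AMI ID, instance sizing, and database settings are placeholders, and it assumes valid AWS credentials; the point is that a fleet of servers and a managed database come up with a couple of API calls instead of a procurement cycle.

# Sketch of cloud provisioning: servers and a database in a few API calls.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
rds = boto3.client("rds", region_name="us-east-1")

# Launch ten identical server instances in one call (placeholder AMI and type).
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.medium",
    MinCount=10,
    MaxCount=10,
)

# Stand up a managed PostgreSQL instance; no hardware to rack, no weeks of lead time.
rds.create_db_instance(
    DBInstanceIdentifier="analytics-db",
    Engine="postgres",
    DBInstanceClass="db.t3.medium",
    AllocatedStorage=100,                    # gigabytes
    MasterUsername="dbadmin",
    MasterUserPassword="change-me",          # placeholder; use a secrets manager in practice
)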

How will IT need to change due to the cloud? For the most part, cloud computing won’t chainsaw through existing IT orgs. Smart people will be needed to design and build these cloud-based systems and to figure out the synergy between cloud-based resources and existing legacy systems. Now’s the time to ask yourself what kind of structure and people you’ll need to support the use of the cloud.

The changes are actually easy to predict. Security and governance become more important, as do management and monitoring. Development skills will shift somewhat toward cloud-based platforms and devops approaches. IT pros currently managing storage and compute services will have to serve double duty, with new cloud-based resources to manage.

You should not be concerned if things change. You should be concerned if they don’t — that means you’re in a cocoon the world will pass by.

This article, “Don’t be scared; it’s just the cloud,” originally appeared at InfoWorld.com. Read more of David Linthicum’s Cloud Computing blog and track the latest developments in cloud computing at InfoWorld.com. For the latest business technology news, follow InfoWorld.com on Twitter.