CIOs like big data, just not in the cloud

Although cloud providers like Amazon are linking big data with the public cloud, enterprises rightfully don't see it that way

Taken from David Linthicum | InfoWorld, May 3, 2013

Amazon.com CTO Werner Vogels recently made the case for big data computing in the cloud. But what else would you expect him to say?

The points made by Vogels are compelling, including a prediction that demand for big data analysis is spurring interest in real-time analytics. Enterprises thus need capacity; to Vogels, this means they need the public cloud -- Amazon's public cloud in particular. Vogels also said we can expect infrastructure like Hadoop to become invisible behind a cloud-provided analytics layer such as (of course) Amazon's Redshift.


Vogels is half right today, and he could be completely correct in five years.

The reality is that big data is, well, big. Most enterprises have some sort of big data project under way, and they see much the same benefits as Vogels does, such as the move to real-time analytics, including predictive analytics that CIOs believe will add a huge amount of value to what enterprise IT can do.

Public cloud computing platforms are indeed compelling. Consider the instant scalability from using auto- and self-provisioning services and from using built-in big data services such as Hadoop. However, in reality, most of what can be called big data today is still very much in the enterprise data center. It may remain there for some time.

The reasons are understandable: The use of local data storage systems means that integration with operational data stores won't be as much of an issue as it would over an Internet connection. In many instances, using public clouds as the place to store huge amounts of enterprise data seems like a good idea, until you have to ship off the USB drives to the cloud provider and hope they load correctly.

Also, while I think security and compliance are typically solvable problems in public cloud computing, they are easier to work with if the data is local. Moreover, performance is better with local data because you're not dealing with the latency of sending requests and returning data sets over the open Internet. Finally, hardware and software are cheap these days, and the ROI of putting these systems on public cloud providers versus internal servers is not as persuasive as you might imagine.

Should you discount public clouds as an option for building big data systems? No, but for the first generation of big data systems, most enterprises shouldn't choose the cloud. That calculus will change over time.

From Maps to Search to Google+, Google unloads improvements across the board

Google execs cram dozens of announcements into its opening keynote at Google I/O


Taken from Ted Samson | InfoWorld, May 15, 2013





In an extensive keynote this morning at Google I/O, company bigwigs unveiled a dizzying array of new products and features spanning the company's technology portfolio, including Google Play, Android developer tools, Google Compute Engine, Google Maps, Google Search, Google+, a new music service, and more.

Though there weren't any major announcements -- say, a new version of Android -- plenty of juicy tidbits were introduced to tantalize developers and end-users alike.

Google puffs up its cloud
On the cloud front, Google announced the general availability of Google Compute Engine, the company's environment for running virtual machines. Compute Engine, available immediately via cloud.google.com, now includes such features as shared-core instances, aimed at low-intensity workloads; advanced routing, to help users create gateways and VPN servers for apps that span local networks and the Google cloud; and large persistent disks that support up to 10TB per volume.
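
For readers who want a sense of what driving Compute Engine programmatically looks like, here is a minimal sketch using the google-api-python-client library. It is not taken from Google's announcement; the project ID, zone, and instance name are hypothetical, and the shared-core f1-micro machine type and the persistent-disk size are chosen only to illustrate the features mentioned above.

    from googleapiclient import discovery

    # Build a Compute Engine API client using Application Default Credentials.
    compute = discovery.build("compute", "v1")

    project, zone = "my-project", "us-central1-a"   # hypothetical project and zone

    config = {
        "name": "shared-core-worker",
        # f1-micro is one of the shared-core machine types aimed at low-intensity workloads.
        "machineType": f"zones/{zone}/machineTypes/f1-micro",
        "disks": [{
            "boot": True,
            "autoDelete": True,
            "initializeParams": {
                "sourceImage": "projects/debian-cloud/global/images/family/debian-12",
                # Persistent disks scale to multi-terabyte volumes; 500 GB is just an example.
                "diskSizeGb": "500",
            },
        }],
        "networkInterfaces": [{"network": "global/networks/default"}],
    }

    # Kick off an asynchronous instance-creation operation.
    operation = compute.instances().insert(project=project, zone=zone, body=config).execute()
    print(operation["name"])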

Google also unveiled App Engine 1.8.0, which includes a limited preview of the PHP runtime. The addition of PHP will enable developers to run open source apps like WordPress. It also offers deep integration with other parts of Cloud Platform, including Google Cloud SQL and Cloud Storage, according to Google Senior VP Urs Holzle.

The new version of App Engine also enables users to more easily build modular applications. Users can partition apps into components with separate scaling, deployments, versioning, and performance settings.

Additionally, Google unveiled Google Cloud Datastore, a fully managed and schemaless solution for storing nonrelational data. The service is based on App Engine High Replication Datastore, and it features automatic scalability and high availability, along with capabilities like ACID transactions, SQL-like queries, and indexes.
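
As a rough illustration of the programming model the article describes (entities, ACID transactions, and indexed, SQL-like queries), here is a small sketch using the present-day google-cloud-datastore Python client rather than the 2013-era App Engine SDK; the project ID, entity kind, and property names are hypothetical.

    from google.cloud import datastore

    client = datastore.Client(project="my-project")  # hypothetical project ID

    # Write an entity inside an ACID transaction.
    with client.transaction():
        key = client.key("Task", "sample-task")
        task = datastore.Entity(key=key)
        task.update({"description": "Load April logs", "done": False})
        client.put(task)

    # SQL-like filtering; Datastore maintains the indexes backing this query automatically.
    query = client.query(kind="Task")
    query.add_filter("done", "=", False)
    for entity in query.fetch(limit=10):
        print(entity.key.name, entity["description"])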

Speak, and ye shall search
Google revealed a forthcoming "conversational assistant" for Google Search, capable of answering spoken questions -- "Will it be sunny in Santa Cruz this weekend?" for example. Once Google delivers an answer, you can continue the conversation with follow-up questions like, "How far is it from here?" or "How about Monterey?" according to Google Senior VP Amit Singhal.

Google also announced updates to Google Now, an Android app that delivers personalized information on the fly by generating "cards" with information based on a user's location and activities. It can present information on such topics as traffic, weather, sports, flights, appointments -- and with the update, public transit commute times, movies, TV shows and video games. Google Now also includes reminder functionality, which can be triggered by a time or a place. For example, you could set up a reminder to call your manager when you arrive at the airport. If you're about to miss the last train home, Google Now can remind you that you'd better leave.

Plotting better maps
Google also unveiled a new version of Google Maps that is more comprehensive and accurate and delivers enhanced imagery and navigation. The appearance of maps has been tweaked to highlight only the roads and landmarks that are important to the user (such as just the roads you need to worry about when driving to a destination). What's more, Google has designed the new version of Maps to tailor itself to users. "When you set your Home and Work locations, star favorite places, write reviews and share with friends, Google Maps will build even more useful maps with recommendations for places you might enjoy," said the Google Maps team.

Among other improvements to Maps, search results appear directly on maps with brief descriptions and other information, such as recommendations from peers. Info cards provide information like business hours, ratings, and reviews.

Google+ gets a makeover
Google announced some changes to its social networking program, the most interesting being the creation of a stand-alone version of Google Hangouts. The free app, which runs on Android and iOS, combines text, photos, and live video. Features include the ability to review conversation history with a swipe and the ability to inject images into conversations.

Speaking of images, Google has added photo-editing and enhancing features to Google+, including an instant upload feature, with which you can have every photo you take saved to the cloud (up to 15GB worth) for free. There's an Auto Highlight feature, which automatically sifts through images to de-emphasize duplicates, blurry images, and poor exposures while highlighting pictures of key people and landmarks. A new Auto Enhance feature automatically improves brightness, contrast, saturation, structure, noise, and focus.

Finally, a feature dubbed Auto Awesome creates new images based on a set of photos in a library. For example: If you were to upload a few family portraits, Google+ would sift through them for everyone's best smile and stitch them together into a single shot.

This story, "From Maps to Search to Google+, Google unloads improvements across the board," was originally published at InfoWorld.com. Get the first word on what the important tech news really means with the InfoWorld Tech Watch blog. For the latest developments in business technology news, follow InfoWorld.com on Twitter.
 



Amazon's Vogels: Big Data Belongs In The Cloud

Amazon CTO Werner Vogels predicts real-time analysis and "invisible" Hadoop capacity on demand are the future of big data computing.

Taken from Doug Henschen | InformationWeek, April 22, 2013, 09:06 AM


Amazon CTO Werner Vogels kicked off the annual Amazon Web Services Summit series in New York last week with a vintage cloud-will-be-king presentation that made a strong case for big data computing in the cloud. And Vogels offered a few predictions for what will drive cloud-based data analytics.



One, he predicted demand for big data analysis will spur interest in real-time analysis, and that companies will need to be able to tap unlimited capacity as needed.

Second, he said we can expect that infrastructure like Hadoop (delivered by Amazon as the Elastic MapReduce, or EMR, service) will in the future "become invisible" behind analytic layers built on top of Hadoop. He described today's big data analysis tools as "rather crude."
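
To make the "Hadoop as invisible infrastructure" idea concrete, here is a hedged sketch of launching a transient EMR cluster with the modern boto3 SDK (not the 2013-era API). The cluster name, instance types, release label, and the S3 path to the Hive script are all hypothetical.

    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    # Launch a transient Hadoop cluster, run one Hive step, and let it terminate when done.
    cluster = emr.run_job_flow(
        Name="nightly-analytics",                     # hypothetical job name
        ReleaseLabel="emr-6.10.0",
        Applications=[{"Name": "Hadoop"}, {"Name": "Hive"}],
        Instances={
            "InstanceGroups": [
                {"Name": "master", "InstanceRole": "MASTER",
                 "InstanceType": "m5.xlarge", "InstanceCount": 1},
                {"Name": "core", "InstanceRole": "CORE",
                 "InstanceType": "m5.xlarge", "InstanceCount": 4},
            ],
            "KeepJobFlowAliveWhenNoSteps": False,     # no idle servers to own or manage
        },
        Steps=[{
            "Name": "aggregate-clicks",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["hive-script", "--run-hive-script",
                         "--args", "-f", "s3://my-bucket/aggregate.hql"],
            },
        }],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    print("Cluster:", cluster["JobFlowId"])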

Third, he said that this layer of big data analytics will include big-data-powered industry-specific applications.





This slick new layer of next-era tools doesn't exist yet, but to prove the industry-focused point, Vogels introduced executives from Bristol-Myers Squibb, GE and big data analytics startup Mortar Data (among other companies) to detail AWS-powered big data applications.

-- Bristol-Myers Squibb IT executive Russell Towell described how the drug giant is using AWS to run computer simulations to optimize large-scale drug trials before actually conducting them with patients. The company uses AWS security provisions including private connections to Amazon data centers, Amazon Virtual Private Cloud services and encryption of all data, Towell said. Bristol-Myers Squibb researchers can spin up scores and even hundreds of Linux server instances within five minutes and preconfigured Oracle Database instances within 12 minutes, he said.

Workloads that would have taken 60 hours to provision and complete on-premises using the company's old approach (and requiring huge investments in server capacity) now take 1.2 hours on AWS with service fees of $336, he said. As a result, the company can quickly do "thousands rather than hundreds" of simulations in the same amount of time, Towell said.

The big payoff for each simulation is money saved on live clinical trials. Having completed simulations, the company can reduce the number of patients required for a trial while being certain of valid results. Trial costs that averaged $750,000 have been cut to $250,000, according to Towell.
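
The article doesn't say how Bristol-Myers Squibb scripts its provisioning, but the pattern of spinning up scores of Linux instances on demand looks roughly like the boto3 sketch below; the AMI ID, subnet, instance type, and counts are placeholders, not details from the company.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Request a batch of Linux instances for a simulation run.
    response = ec2.run_instances(
        ImageId="ami-12345678",       # hypothetical Amazon Linux AMI
        InstanceType="c5.xlarge",
        MinCount=50,
        MaxCount=100,                 # AWS launches as many as capacity allows, up to MaxCount
        SubnetId="subnet-abcdef01",   # e.g. a subnet inside a Virtual Private Cloud
    )

    instance_ids = [i["InstanceId"] for i in response["Instances"]]
    print(f"Launched {len(instance_ids)} instances")

    # Block until the fleet is running, then hand the hosts to the simulation scheduler.
    ec2.get_waiter("instance_running").wait(InstanceIds=instance_ids)

    # In a real workflow this would run after the simulations complete, so the meter stops.
    ec2.terminate_instances(InstanceIds=instance_ids)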

-- General Electric executive Joe Salvo, manager of the manufacturer's Business Integration Technologies Laboratory, touted a collaborative platform that GE built on AWS. It's intended to help manufacturers and suppliers bring together expertise, materials data, and modeling and simulation capabilities to speed part and component development by as much as five times. GE calls it a CEED -- crowd-driven ecosystem for evolutionary design.

"It's a flexible, elastic environment on [Amazon] EC2 that supports both rapid prototyping, simulation and, ultimately, building real parts that go into complex products and systems," Salvo said. "The teams come together quickly, they exchange their data and models [securely] ... and it holds the promise of transforming the whole manufacturing paradigm."

-- Mortar Data CEO K Young cited elastic capacity as the key to the 2011 startup's ability to grow quickly and provide Hadoop-as-a-service capacity without having to buy and set up servers. Mortar has raised $1.8 million in capital, and in 2012 the company spent some $500,000 on AWS services, using some 1,000 servers on demand. Provisioning that much capacity in a conventional on-premises data center would have cost $7 million and taken eight months to bring online, Young said.

"We're able to serve new customers without delay and without upfront costs, and we can start bringing in new revenue, and we're able to do it using about a quarter of what we would have had to raise otherwise," Young said.

Services To Come

Vogels' point about companies needing to tap into capacity on demand is an obvious selling point for AWS. His predictions about real-time analysis and the inevitability of an analytics layer on top of invisible Hadoop infrastructure could well be a tease for coming AWS service announcements.

For example, Amazon has yet to join the SQL-on-Hadoop trend that is driving multiple projects and initiatives aimed at delivering faster and more extensive SQL querying capabilities on top of Hadoop than are currently supported by Hive. Lead Hadoop distributor Cloudera, for example, is promoting project Impala, while competitors EMC (Pivotal HD), Hortonworks (Stinger), IBM (Big SQL), MapR (Apache Drill) and Teradata (Teradata SQL-H) each have their own SQL-on-Hadoop initiatives in the works.

As for an analytics layer, Amazon has so far partnered with BI and analytics vendors including Actuate, Birst, GoodData, Karmasphere, Pentaho and others. It would be interesting (and not terribly surprising) to see Amazon acquire or invest in BI and analytics technologies for Hadoop and other platforms. In the database arena, Amazon took a large equity stake in ParAccel, for example, to gain licensing rights to the high-scale, massively parallel-processing database now behind the Amazon Redshift data warehousing service. This could be the model for an analytics play.

Amazon did make two notable database-related announcements at the AWS Summit, one aimed at incumbent-database customers and one aimed at moving them to Amazon's big data services. In the first case, Vogels announced that encrypted data storage and network data flow are now available for Amazon Relational Database Service (RDS) for Oracle Database and will soon be available for Amazon RDS for Microsoft SQL Server. Amazon still has to allay corporate concerns about putting data in the cloud, so this announcement is aimed at companies using incumbent platforms.

As for those looking toward new platforms, Vogels announced that Amazon's DynamoDB NoSQL database has gained an important new analytical capability through a feature called Local Secondary Indexes.

"This allows you to perform queries on any attribute in your data model, so now you have all the power of querying that you're used to with relational databases available to you on DynamoDB," Vogels said.

The announcements fit a pattern for Amazon in which it offers familiar tools (like Oracle Database and Microsoft SQL Server) while also pioneering and promoting new platforms (like DynamoDB, Hadoop and Redshift). As always, the cloud is the place to do it all.

 



“Why Cloud?” vs “What’s Next For Cloud?”

If 2012 was the year that cloud really came into itself as a fully fledged concept, how has 2013 shaped up?

Taken from Gathering Clouds | Cloud Computing Journal, May 15, 2013, 05:45 AM EDT





We all talk about cloud differently, but is there a way we should be speaking about this tech?

Cloud computing is now a widely reported, if not accepted, IT movement that, depending on who you talk to, has changed or is changing the way businesses utilize infrastructure.

The question remains, however: if 2012 was the year that cloud really came into itself as a fully fledged concept, how has 2013 shaped up? As we reach the near midpoint of the year, it's hard not to reflect on whether the conversation around cloud computing has evolved to the point where we are no longer talking about the benefits of the service itself, but about whether what the service enables a business to do is much different from what had previously been possible.

Looking at the headlines on GigaOM, TechCrunch, VentureBeat, and other outlets, the reporting seems to be focused more on what cloud companies are doing from an M&A standpoint than on the technology or its value to businesses generally. Sure, there are explorations of OpenStack, but those reports seem to be focused on whether anyone actually cares about or uses the platform.



Gathering Clouds constant favorite David Linthicum delves into the challenges and thinking that govern cloud strategy, as do one-off thought leaders here and there. But where is the conversation around cloud actually focused?

For the most part, we see the conversation around cloud as evolving past how the technology works. Many of the major thought leaders in the space talk about cloud in terms of what people do through it – not what cloud actually does.

This, though, skips a step: understanding how to get to those positive outcomes. Believe us – cloud should be talked about in terms of what it does for a business and what a business can do through cloud. But without the correct framing of what the technology offers as a platform for innovation, it's hard to make those jumps.

Cloud is still developing, but as it becomes more the norm for how IT functions, there is more reason (not less) to explore how to think about the technology from an implementation standpoint. And we’ll keep providing those perspectives.



Small businesses eye cloud for efficiency, cost savings

Taken from Roger Yu | USA Today, 4:47 p.m. EST, November 6, 2012

Drawn by the promise of simplicity and low cost, entrepreneurs are increasingly tapping into the cloud to conduct business in ways that make pen and paper and desktop software obsolete.


Entertaining toddler princesses is a walk in the park for Sharon Chase. Accounting for the money she earns from it is a hairier proposition.

Chase found it frustrating to juggle the multiple bank and PayPal accounts used to pay vendors — not to mention the other software she employed to stay on top of Princess Sharon Events, her birthday entertainment service company in Cohasset, Mass.

"Everybody said QuickBooks is so simple, and I'd get lost," she says. "I get intimidated. ... And I started trying to make a spreadsheet, and I'd sweat and throw it away."

Then Chase stumbled onto Outright.com, unwittingly joining waves of other small-business owners who have turned to "cloud computing." That's the tech industry's way of referring to applications online that allow users to input, edit and manipulate data stored on servers located elsewhere, often hosted by the application developers.

Outright.com, Chase says, is less intimidating and simplifies the accounting by allowing her to juggle multiple accounts, generate reports on spending patterns and itemize even obscure expense categories, such as her PayPal fees and the iTunes songs she buys for party entertainment. "It's a kindergarten-in-the-rug-kind of experience," she says.


Cloud computing started mostly as backup storage. But because software resides in the cloud and not on an isolated computer at a desk, developers can integrate multiple applications in one, simultaneously sync data across numerous devices and update information real-time for mobile device users.

Cheap and easy cloud-based applications can be used for a range of tasks and services, from bookkeeping to conference calls to managing complex projects with far-flung colleagues.

A survey of information technology professionals by Spiceworks this year found that 62% are using some type of cloud application, up from 48% at the beginning of the year and 28% a year ago. "If you're starting a business, the world is your oyster and you can do things in super cool ways with the cloud," says Jay Hallberg, co-founder of Spiceworks, a social network for IT professionals.

Cloud options also free entrepreneurs from having to staff a large IT department, by passing the maintenance burden to application developers. "Sometimes we release multiple (versions of our application) a day," says Mike McDerment, co-founder and CEO of FreshBooks, an accounting application online for small-business owners.

SIMPLICITY

Traditional desktop software often sought to be all things to all users, pleasing few. But many cloud-computing developers, in marketing their products, tap into the anxiety of small-business owners by selling simplicity and focusing on underserved niche areas.

John Bracken founded Speek.com, a Washington, D.C.-area start-up that enables cloud-based teleconference calls, to ease the cumbersome task of rounding up attendees and dialing on speakerphones. An average call takes five minutes to coordinate, he says. "No one knows who's coming. You beep and you say 'who's that?' " he says.

With Speek.com, users tap on a personal, dedicated Speek.com link that works as a call invitation. And Speek's hub in the cloud links attendees to the call. About 90% of usage is small business, Bracken says.

With a niche focus, developers can also breed innovations that weren't possible with software that came in boxes, such as integrating the functions of other software by using open developer tools, says Rene Lacerte, CEO of Bill.com, which specializes in billing.

On Bill.com, users can tap into banks' online banking tools and other broader accounting suites, such as Intacct and Quicken. "In the last 10 years, the cloud changed the paradigm about how software is built," Lacerte says.

COLLABORATION AND MOBILITY

The cloud also makes it easier for users to collaborate on projects and enables multiple users to get access to the same application at the same time without losing their data or mixing it up. Callers on Speek.com, for example, can upload files to its servers, and opt to have them synced with Dropbox accounts.

Michael Hsu, founder of Deep Sky Accounting in Irvine, Calif., says his operation has been mostly turned over to cloud-based applications to eliminate confusion and inefficiency of "too many hands in the jar."

One of Hsu's clients, an interactive marketing agency, previously used a spreadsheet to track invoices and checks, occasionally leading to outdated information and checks that were issued without authorization. Using cloud-based applications — Intacct, Bill.com, and GetHarvest.com for time tracking — everyone works from updates that can be seen online by all instantly, he says.

"It's about the whole concept of having one ledger. You have one thing in common, and you're just editing it," he says. "Back in the old days, you send it back and forth with clients. Now all of us can work together."

Project management tools like Do.com, which is owned by Salesforce, allow you to share and organize tasks, track contracts, send files and share feedback among collaborators. With Chatter.com, another Salesforce site, colleagues can create a cloud-based private social network for collaborative projects that require posting updates and files.

Freed from the fear of mismatched information and outdated data, users of cloud-based applications collaborate twice as much as those using similar software on the desktop, says Dan Wernikoff, general manager of Intuit's financial management solutions division, which has released cloud versions of its products, including QuickBooks.

A cloud version of QuickBooks has been around since 2000, but users were not comfortable storing data online. It struggled to get new customers. In the first eight years, it landed only 100,000 customers. But with the advent of smartphones and online data-storage products easing such fears, Intuit has seen about 300,000 more customers in the past four years, Wernikoff says. About 40% of first-time customers are trying online products. "Next year, it'll be the bulk of our new users," he says.

The emergence of mobile technology also partially explains the rush to cloud computing by entrepreneurs. Not tied to the office and typically dealing with a workforce that's widely spread out throughout the country, business owners are opting for the any-screen-anywhere strategy.

"Consumers want immediate access to information wherever they want," says Kevin Garton, chief marketing officer of The Neat Company, which recently released cloud and mobile versions of its document-filing system — NeatCloud and NeatMobile. "Cloud acts as the central database and as the synchronization device for desktop and mobile devices."

With so much data stored, cloud companies can also dive deeply into customers' usage patterns for information that may prove useful. For example, FreshBooks offers a feature that compares your business' performance with competitors in several benchmarked areas, including how long it takes you to get paid and the average invoice size.

QuickBooks has a feature that can comb through your transaction data from the customer list and filter information on those who haven't visited in 12 months. Business owners can use the information to, say, offer a special discount.

NOT ALWAYS PERFECT

Cloud computing comes with several caveats. The ability to tap into the cloud is only as good as your Internet connection and the capability of your host's servers. "Cloud may not be enough. There is a speed problem working with large files," says Vineet Jain, CEO of Egnyte, whose firm develops software that allows companies to integrate the cloud with local area network servers in the office.

Security remains a chief concern. Many security issues from the cloud's early years have dissipated, and much of cloud-stored data is encrypted or in read-only mode for non-owners. But nearly three-quarters of respondents in the Spiceworks survey cited a lack of control and security issues as their biggest concerns.

"It takes a long time for everybody to get comfortable with the idea," says Hallberg of Spiceworks. "It's not in your physical control. Despite what vendors say, (converting to the cloud) is not just drag and drop."

Why Google's cloud is a Pandora's box

Taken from Andre Mouton | Minyanville via USA Today, 10:18 a.m. EST, February 11, 2013


Google (GOOG) looks at the cloud and sees the future. It envisions the growth of massive data centers, where consumers and businesses will store their information, run their programs, and develop new software. In this world, everything will be done at the center, and the PC will have dwindled down to little more than an emaciated screen. We will trust our digital lives to servers half a world away, and be rewarded with a network that knows us so intimately, it can cater to our every need.

Others look at the cloud, and at Google, and what they see is a problem. There's now a lengthy Wikipedia page dedicated to Google-angst. Last year, a modest change to the company's privacy policy sparked an immediate controversy. The Internet is no more unsafe or insecure than it ever was, but we have become more connected -- more vulnerable. We're living more of our lives on public networks, observed by advertisers, employers, and sometimes, complete strangers. Control of personal data is becoming a point of contention, and it's an issue in which Google is generally seen as being on 'the wrong side.'

If the company's vision is ever going to become a reality -- if the cloud is not going to simply complement our PCs and post-PCs, but to a large extent, replace them -- then people will need to feel a lot more trust than they do today. Trust that Google won't abuse its right to do whatever it wants with data stored on its servers, and trust that their information is safe from law enforcers and government seizure. Trust that Internet service providers will provide a reliable, affordable Internet connection, and that this connection won't be throttled, or degraded because of some regulatory conflict like the one over network neutrality.

They'll also need to believe that the cloud is everything it's billed as, and so far, that's not the case. We're told that it's less expensive to move to the cloud, but this is only true in the sense that it's cheaper to rent than to buy. You can purchase a low-end Chromebook for several hundred dollars; but it's essentially a Web browser with a body, so to get any kind of functionality out of it you'll also need to buy a data plan. Mobile connections are pricey -- an iPhone is typically less expensive than the data plan that comes with it -- and if you fly often, or otherwise leave the network, you're stuck paying high rates in whatever local hotspot you find yourself in. Unless you plan to sit at home, staying connected 24/7 is an expensive proposition.

Hidden costs are everywhere in the cloud, and sometimes they aren't monetary. Consumers of 'free' online services, including Facebook (FB) and other social media sites, usually discover that there's no hanging up on salespeople when you're eating at their dinner table. Incremental waits and little inconveniences add up. Enterprise customers find that the cloud makes it easy to outsource IT support, but unless your firm chooses to pay a premium rate, it will be sharing servers and bandwidth, and waiting in line for both. Large file loads can take weeks to back up.

Meanwhile, a lot of cloud content is either loaded with advertisements, or unprofitable. Google Docs is given away for free, so its value to customers can only be guessed at. Google Apps for Business is only available to paying customers, but it's unclear whether Google is getting any kind of margin on its sales. Meanwhile, Salesforce.com (CRM) is a venerable cloud institution, able to charge higher rates than its competitors -- and it's still operating in the red. Can we even trust that there's an efficiency here, and not just the sort of growth-at-a-loss that drove so much of the dot-com boom?

There are too many unknowns. In some ways, the cloud is the exact opposite of the computers that Google expects it to replace. When we buy our own machines, the costs are up front. There's no uncertainty about what they will do, how they will treat us, or whether they'll run a side-business selling our personal information. They're secure, private, and predictable -- and a huge inconvenience the moment we have to move data around, or share anything, two things the cloud allows us to do with ease.

These two approaches should be complementary. There's a synergy here that Microsoft (MSFT) is trying to exploit with Office 365, a product that, while more expensive than Google's Web apps, actually drives revenue. Apple (AAPL) embraced connectivity early on with software like the App Store, iTunes, and iChat. Both of these companies believe that the cloud enhances their products, instead of making them obsolete. Android has been successful in smartphones because it's not like Chrome OS, and it didn't try to force a network dependence that would only have run up its customers' phone bills.

Google will be fine even if the cloud fails to live up to its hype. Advertising will continue to drive the company's revenue, as it always has. But for investors who have come to believe that this is the Next Big Thing, the fall to Earth may be a little more painful.

Why companies using the cloud are so happy



RightScale-sponsored survey shows that problems scaring many IT organizations fade away as they actually deploy

Taken from David Linthicum | InfoWorld


As reported by my friend and Forbes writer Joe McKendrick, "A new survey finds that roughly one out of four organizations are heavily into cloud computing, and they are providing lessons from which everyone else can benefit." The lessons come from having two or three years of real experience, enough time to see the real benefits and issues.


Keep in mind the study is sponsored by RightScale, a cloud vendor, and it was done in a way to discover the positive, not the negative. It's as if Dunkin' Donuts sponsored a study on breakfast foods. You wouldn't expect to find results related to obesity or diabetes.




Still, the results are interesting and in line with what I see in the marketplace. One finding was that fairly new, bigger enterprises are the leading adopters of cloud. Although the survey of 625 companies found the cloud is "commonplace," 8 percent of respondents wanted nothing to do with cloud.


RightScale used the survey to categorize how many companies fall in its list of cloud "maturity" stages: 17 percent are cloud watchers (no implementations or pilots), 26 percent are cloud beginners (active studiers and perhaps a few pilots), 23 percent are cloud explorers (with multiple pilots and perhaps some deeper cloud deployments), and 26 percent are cloud-focused (with multiple deployments).


Other findings:


- 18 percent of advanced cloud users (cloud-focused) see security and compliance as a challenge, versus 38 percent of the greenhorns.


- 80 percent of advanced-level respondents are seeing faster time to market for applications, versus 25 percent of beginners.


- 87 percent of advanced respondents report they were gaining faster access to infrastructure, compared to 30 percent of beginners.


- Experienced cloud companies don't necessarily have fewer outages, but they're shorter in duration. Because of greater exposure to cloud, 57 percent of the veterans had an outage in 2012, compared to 32 percent of the novices. But the length of an outage at an experienced site was 4.6 hours, compared to 5.8 hours at the beginner companies.


- About 65 percent of the experienced companies reported higher system availability, compared to 20 percent of the novices.


As I'd expect, the more comfortable a company gets with cloud computing, the less daunting the issues identified early on seem. Many of the issues that cause IT organizations to push back on cloud computing -- compliance, security, ownership, and resiliency -- are solvable problems, even though they require a bit of proactive planning.


As more companies gain experience, cloud computing becomes what it really should be: a new platform that has some clear advantages and issues to address. That should feel normal to most of us.


This article, "Why companies using the cloud are so happy," originally appeared at InfoWorld.com. Read more of David Linthicum's Cloud Computing blog and track the latest developments in cloud computing at InfoWorld.com.

12 reasons why public clouds are better than private clouds

Summary: Public clouds have the edge over their internal counterparts in security, reliability, and elasticity, according to the author of a new book on enterprise architecture.

By Joe McKendrick for Service Oriented, ZDNet

To see many of the advantages of cloud computing without its risks, many enterprises are turning to private clouds, which are service layers contained within their firewalls that look and feel like public clouds. But these private clouds may actually be less secure and reliable than the public services.

That's the view of Jason Bloomberg, who said private clouds often add up to more trouble than they're worth. In his latest book, The Agile Architecture Revolution: How Cloud Computing, REST-Based SOA, and Mobile Computing Are Changing Enterprise IT (http://www.amazon.com/The-Agile-Architecture-Revolution-REST-Based/dp/1118409779) , Jason outlined the reasons why public cloud may ultimately be a better choice for enterprises.

You may not agree with Jason's premise about on-premises — in fact, I expect violent disagreement. And this is more of an either/or argument, rather than raising the possibility of blended strategies, such as employing public clouds as test beds, but keeping applications in production within private clouds.

That said, here are Jason's arguments for public cloud and against private cloud:

  1. Private clouds tend to use older technology than public clouds: You may have spent hundreds of thousands of dollars on new hardware and software, but try getting your organization to agree to that every year.

  2. Public clouds shift capital expenses to operational expenses: It's pay as you go, versus building an entire datacenter, no matter how virtualized it may be.

  3. Public clouds have better utilization rates: With private cloud, your organization still has to build and maintain all kinds of servers to meet spikes in demand across various divisions or functions. Public cloud offers the same spare capacity on a pay-as-you-need-it basis.

  4. Public clouds keep infrastructure costs low for new projects: With private clouds, you still need to scare up sometimes scarce on-site resources for unplanned projects that may pop up.

  5. Public clouds offer greater elasticity: "You'll never consume all the capacity of a public cloud, but your private cloud is another matter entirely."

  6. Public clouds get enterprises out of the "datacenter business": Establishing a private cloud probably gets you deeper into the datacenter business than traditional on-premises servers do.

  7. Public clouds have greater economies of scale: No private cloud can compete with the likes of Google and Amazon on price. And the public providers are constantly buying boatloads of the latest security technology.

  8. Public clouds are hardened through continual hacking attempts: Thousands of hackers have been probing the major public clouds for years, and the providers have hardened their defenses in response.

  9. Public clouds attract the best security people available: They seek out the top security experts, will pay them top dollar, and treat them as the most important part of their businesses, which they are. Do traditional enterprises treat security teams this way?

  10. Private clouds suffer from "perimeter complacency": "If it's on the internal network, it must be secure!" 'nuff said...

  11. Private cloud staff competence is an unknown: Your organization may have a lot of talented and knowledgeable people, but is data security the main line of your business?

  12. Private cloud penetration testing is insufficient: Even if you test your applications and networks on a regular basis (which many organizations don't), those tests only tell you whether things are secure at that exact moment.


 

10 reasons to outsource your cloud

By Jack Wallen

Takeaway: If you’ve been thinking about bringing your cloud in-house, make sure you’ve considered the potential downsides of that approach — as well as some of the benefits outsourcing might offer.

I recently offered 10 reasons why you should keep your cloud in-house. Many companies feel strongly about keeping that service within the walls of the office. But if you’re on the fence, it’s wise to look at both sides of the issue before making such a game-changing decision.

It’s the cloud. What was once considered an unknown, intangible technology is now a solid service that organizations have come to rely upon. But should you outsource this business-critical service? This time around, let’s examine 10 reasons why you should.

1: Cost

Many businesses simply don’t have the cash on hand for a one-time payment large enough to bring the cloud in-house. This cost would include hardware, software (if not going open source), a larger data pipe — it all adds up. For those organizations, the idea of using a host means a much lower up-front cost. You’ll be paying a monthly fee for the life of your cloud. But if you can’t pony up the thousands of dollars up front, this might be the wisest option that will get you the most cloud for your buck.

2: Security

Do you have the level of security that Google, Amazon, or any of the other, bigger, cloud providers do? Yes, it does mean that the safety of your company data could be in the hands of a third-party solution — but isn’t that the case anyway? As it stands, you’re already relying on companies like Microsoft to keep your data safe. In most cases, your data is probably more secure when trusted to the likes of big companies like Google and Amazon.

3: Reliability

If you can’t provide failover for your cloud, your best bet is to outsource it. The last thing you need is to have employees (or worse, owners) out and about and unable to reach the cloud data. If this is even remotely a concern, the outsourcing of your cloud is probably the best decision. With most cloud services, you can rest assured that they have put plenty of cash and effort into making sure uptime is as high as possible. If you can’t offer that same level of reliability, outsource.

4: Staff

Do you have staff with the training to not only maintain, but build a cloud environment? It’s not as easy as sharing a directory on a server. A cloud topology is a specialized area, and if you don’t have staff that can build and manage a cloud, your best bet is to outsource. Sure, you might have one or two IT staff members with the skill to set this up. But do you want to take them off their normal projects and have them on permanent “cloud duty”?

5: Applications

Many cloud services do not just offer sync and storage. Some, such as Google and Zoho, also offer plenty of applications and services that could greatly benefit your company. If you house your cloud, you most likely won’t benefit from such services and applications. Zoho alone offers add-ons to the standard cloud that you won’t find anywhere else.

6: Access

If you host your cloud, you’re dependent upon the pipe going into your business as well as the ability of your staff to keep the service running while keeping your network secure. Most likely you’re already working with a VPN — will that cloud interfere with it? Will the cloud fubar the shared directories and drives on your network? These are all questions that become moot when the cloud is outsourced.

7: Infrastructure

Do you have a data pipe that can handle the incoming traffic as well as a rack to hold your servers? Do you have a cooling infrastructure to keep your cloud running smoothly? Do you have backup generators and fire protection? Probably not. If you’re serious about your cloud, these are things you’ll need. The best route to a cooled, protected cloud rack? Outsource.

8: Mobile

Have you ever thought about what it would take to relocate or co-locate your business? If you outsource your cloud, this task is made easier simply because, even during the move, your employees will have continued access to the data in the cloud. If you use a service like Google Apps or Zoho, employees can even continue working while your company is in the process of relocating. Even though you may not be considering a move at the moment, you never know what the future holds.

9: Power

Does your building have the necessary power to run a data center? Is the power clean? Does it run reliably? If not, you should seriously consider outsourcing your cloud. The last thing you need is to deal with a power outage on a cloud. Not only are you cutting the umbilical to data that many users access, you’re also running the risk of losing that data. On top of that, can you afford to pay the extra cost to run the necessary power for these servers (or the extra monthly charges to your electricity bill)? They might seem like silly questions now… but it all adds up in the end.

10: Support

Do you have the staff to support your cloud or would you rather enjoy the security of knowing that if something goes wrong, you have a dedicated support staff for your cloud? By not having to funnel your current staff to yet another project, you don’t have to worry about either hiring more staff or maxing out your current staff so they can’t complete their regular workload.

Pros and cons

Ultimately, the decision must come down to what is best for the company. There are many factors involved, but if you go through this list and still feel confident that you can keep your cloud in-house, I wholeheartedly would support that decision. But if this list gives you pause, make sure you carefully evaluate the pros and cons. Make a wise decision on your cloud and it will pay you back over and over.

 

What makes up a true hybrid cloud infrastructure?

By Thoran Rodrigues, TechRepublic

Takeaway: Extending the benefits of the public cloud into the private infrastructure is an appealing option for companies who still have reservations about the public cloud.

When talking about cloud computing, we will usually find two perspectives: those who believe in the concept of “private cloud computing”, and those who don’t. The people who don’t believe in the private cloud think that only on the public cloud can the benefits of cloud computing (unlimited scalability, pay-per-use, and so on) be fully realized. On the other side, those who believe in the private cloud feel that the potential risks of the public cloud outweigh its benefits, but that their own internal infrastructure can benefit from being more cloud-like.

The problem lies in the fact that both sides are right in their own way. There are, in fact, some benefits that the public cloud offers that cannot be replicated in a private environment. Scalability, for instance, is limited by the total available hardware, and will, for most companies, be smaller than if they were relying on external providers. It is also true, however, that for some applications and in some situations there are complex risks associated with the public cloud. There are still some questions regarding data protection and privacy that are somewhat unclear, and that makes some companies hesitant when looking at the cloud.

Enter the hybrid cloud

The hybrid cloud has arisen as middle ground between these two points of view, trying to combine the benefits of public cloud offerings while attempting to avoid the risks that companies see associated with it. The idea behind a “hybrid” cloud is exactly what the name implies: a mix between in-house and public infrastructure. From the NIST definition of cloud computing:

The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).

There are a few extremely important details in this definition. First, the “clouds” that are combined to make the hybrid deployment remain unique and distinct entities. This is important because it allows companies, for instance, to store their sensitive data on a private cloud, without ever exposing it to the outside, while still employing external resources to run applications that rely on this data.

Second, we have the fact that the hybrid cloud is necessarily composed of multiple cloud infrastructures. It isn’t enough to simply connect a server in your data center to some cloud resources to claim you have a hybrid cloud deployment. The private side of the hybrid cloud must operate in a cloud-like fashion, otherwise it isn’t a real hybrid cloud.

Finally, the private and public side of the cloud infrastructures needs to be linked in a way that allows the private infrastructure to take advantage, when necessary, of the resources of the public cloud to run tasks or store data. This is perhaps the most important aspect of any hybrid deployment: if your private infrastructure can’t take advantage of the capabilities of the public cloud, then you don’t really have a hybrid deployment. In fact, it doesn’t even make sense to go hybrid unless it is exactly to extend the benefits of the public cloud into your private infrastructure.
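
To show the scheduling logic that linkage implies, here is a deliberately simplified, provider-agnostic Python sketch of cloud bursting. Every class, name, and threshold in it is hypothetical; the point is only that sensitive work stays on the private side while overflow work spills to public, pay-per-use capacity.

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        cores: int
        sensitive: bool = False   # e.g. regulated data that must stay on the private side

    class PrivatePool:
        """Stand-in for the on-premises half of a hybrid cloud: fixed capacity."""
        def __init__(self, total_cores: int):
            self.free_cores = total_cores

        def try_run(self, job: Job) -> bool:
            if self.free_cores >= job.cores:
                self.free_cores -= job.cores
                print(f"[private] running {job.name}")
                return True
            return False

    class PublicCloud:
        """Stand-in for elastic, pay-per-use public capacity."""
        def run(self, job: Job) -> None:
            print(f"[public]  bursting {job.name}")

    def dispatch(job: Job, private: PrivatePool, public: PublicCloud) -> None:
        # Prefer local capacity; burst only when it is exhausted and the data may leave.
        if private.try_run(job):
            return
        if job.sensitive:
            print(f"[private] queueing {job.name} until capacity frees up")
        else:
            public.run(job)

    if __name__ == "__main__":
        private, public = PrivatePool(total_cores=16), PublicCloud()
        for job in (Job("payroll", 8, sensitive=True), Job("render", 12), Job("report", 4)):
            dispatch(job, private, public)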

Meeting in the middle

The hybrid cloud, then, is about reaching a middle ground: about companies recognizing the benefits and the potential of the cloud, and looking to exploit it; and about cloud providers recognizing that, especially for large companies, a move to the cloud has to take into account existing infrastructure and other restrictions that may apply.

It’s also interesting to see the shift in the discourse and mentality displayed by major cloud providers in this respect. Most of them now have launched or are about to launch “virtual private cloud” services, which are the basis for the secure connection of private and public cloud environments, in order to further entice enterprise customers.

NEWS ANALYSIS: Cloud applications gain ground in corporate IT

Brian McKenna, ComputerWeekly.com

Cloud applications are starting to gain critical mass in enterprise IT, according to cloud application providers.

While this can be dismissed as the special pleading of the self-serving, senior executives at Informatica and Birst couch their arguments with caution in a brace of recent Computer Weekly interviews. And, on the user side of the house, the Corporate IT Forum is registering a shift in focus from “should IT move to the cloud?” to “how?” and “in what areas?”



Head of research at the blue chip user organisation, Ollie Ross says: “The Corporate IT Forum's 2013 Cloud Computing Reality Checker has clearly shown that, for large enterprises, the challenge is now how to use cloud, not if they should use it.”

Juan Carlos Soto, senior vice-president and general manager of cloud integration at data integration supplier Informatica, says “hybrid IT” is the critical concept in assessing how, and to what extent, corporate IT is moving to the cloud.

“Companies starting from scratch will be cloud-based from the first. Some companies will never move to the cloud. And the rest will be a blend of cloud and on-premise,” said Soto.

He said the early attitude of corporate IT was often of ambivalence or neglect. It was more lines of business who were early adopters. But now, at least in the US and western Europe, “IT has gone from a purchase approver to driving the transition. ‘Cloud first’ is a key policy for new projects in 2013 and 2014. The exception is not to include cloud,” said Soto.


He cites the US government’s cloud first policy, announced in late 2011. Soto himself was selected for the TechAmerica Foundation’s Commission on the Leadership Opportunity in US Deployment of the Cloud (CLOUD²), which produced the Cloud First, Cloud Fast: Recommendations for Innovation, Leadership and Job Creation report, submitted to the Obama administration in July 2011.

From that experience he learned, he said: “How far reaching the transformation from cloud can be, in terms of making the US more competitive globally.

"And while security has to be paramount, we need to make sure we are not limiting innovation artificially by not leveraging cloud. It was eye-opening beyond the technology issues.”

Tail breeze getting stronger

Brad Peters, CEO of cloud-born business intelligence supplier Birst, said of cloud analytics adoption, “there is no huge headwind, but the tailwind is getting stronger”.

Peters ran the analytics division at Siebel before Oracle acquired the company in 2006. His view is that the legacy, client-server, on-premise model makes business applications expensive and hard to consume. And so, in relation to operational enterprise applications such as ERP, CRM and HCM, the general market shift has been to cloud deployment, “because these applications are engineered for delivery”.

His company’s approach has been to offer an entire business intelligence infrastructure, including back-end integration, as opposed to a data-discovery software suite, in the cloud.

However, he said: “We are about automating the process of business intelligence, therefore it does not matter where we host it, so it can be on premise, as an appliance, too.

“Although operational applications are going to the cloud, 98% of all enterprise data is still on-premise. So we will be in a hybrid world for a long time”.

But he does see a tipping point heaving into view.

“A year ago, we were having to make the case for cloud. Loss of control was outweighing factors like scalable infrastructure, reliability, disaster recovery. We never saw anyone asking for cloud only," said Peters.

"But now the tail breeze is getting stronger. If you look at the top end of customers for Salesforce, Netsuite, Marketo, and so on, they are really big companies. When you went to Dreamforce two years ago, you would be hard pressed to find a big company. Now they are there.”

Keeping control

Peters also maintained that data security in the cloud is a red herring. “The cloud is like a bank. Do you put your money under your pillow or in a bank?” Most data breaches are internal, he pointed out.

The issue is more about loss of control for IT. “That is real and valid. So, we add APIs and layers to give similar levels of control to IT, even when we host their data.

“The argument we make to IT is ‘don’t spend your time on the infrastructure, the plumbing. Spend more time on the modelling and manipulation of data’," said Peters.

"That is where IT is strategic. That is the sexy stuff – the analyst role, where you are interfacing with the business. And the business doesn’t have people who can understand data the way IT does.

“You don’t have to be a data scientist, doing predictive modelling. But data modelling in general, translating what the business wants into models – IT professionals get that.”

 

How Cloud Computing Changes Enterprise IT Economics

By Bernard Golden, Computerworld


CIO - After a recent speaking engagement, in which I focused on creating a hybrid cloud computing strategy, an attendee approached me with a question. "How," he asked, "can I show that our storage is less expensive than AWS?"

When I asked him to elaborate, he outlined this challenge. His group installed a significant amount of storage a year ago, based on an estimate that the installation would support the growth of the company for the next five years.

Now, less than a year later, the new storage is nearly full. He explained that business users want to offer new capabilities to customers; one example is providing customers the capability to view their invoices for the past year online. Unfortunately, these new capabilities consume far more storage than planned for in the upgrade.

This isn't unusual. The increased expectations of business users, based on watching what competitors are delivering online (not to mention the amazing revolution in applications commonly referred to as the consumerization of IT), are consuming far more computing and storage resources than anticipated. This makes the traditionally challenging area of capacity management even more difficult.

Traditional IT Department No Longer Tenable

Put bluntly, the traditional assumptions of IT (stable workloads and predictable growth) are no longer tenable, having been undone by increased business process expectations and the accelerating rush to digital applications as the primary method of customer interaction.

These trends dislocate established IT economics and present IT groups with a financial challenge, one that threatens to topple their position as monopoly supplier of computing to the larger enterprise.

You can see this by examining the dynamics of this situation.

First, the fact that a "five-year storage purchase" is nearly exhausted after a year calls into question the basic competency of IT. IT can respond by saying that business users have increased demand beyond what was foreseeable, but that won't really hold water. In any case, nobody cares about the why; they just know that what was billed as a five-year solution became a 10-month solution.

Second, users will be frustrated that they need to secure sufficient resources to address their business requirements. Even if they're willing to pay the attributed cost, they may find that IT cannot deliver resources to address their needs.

Users frustrated by the unavailability of the resources required to fulfill their needs have the following options:

1. Forego obtaining sufficient resources and forfeit the business opportunity those resources would enable.

2. Lobby senior IT management to get their request put at the head of the list. This lets them go forward with the business initiative, but it also fosters disrespect for the existing process and injects an unhealthy atmosphere of currying favor into resource distribution.

3. Escalate resource requests to senior corporate management to force IT to direct resources to the successful petitioner. This disempowers IT and corrodes respect for it.

Third, IT is now placed in the unenviable position of having to ration a limited resource among competing demands from user organizations. In other words, IT is now seen as a roadblock, which sets the stage for resentment and frustration.

In the past, this dynamic would lead to an urgent request to the company CFO, asking for more capital to address the resource shortage. Depending upon the CFO, the request might be granted, or it might not, thereby extending the business bottleneck and increasing the ongoing tension between users and IT.

Admit It: Cloud Services May Be Less Expensive

Fundamental to this dynamic is the bedrock assumption that IT is the monopoly supplier of infrastructure to users. That assumption, of course, is no longer accurate, thanks to the rise of Amazon Web Services and other cloud providers. Today, users can avoid the entire resource-rationing contest and go direct to a provider with effectively unlimited resources.

IT is now confronting a world in which its long-established role as sole supplier is no longer plausible or even appropriate. Faced with this new world of increasing demand and alternative sources of supply, how should IT respond?

Unfortunately, all too many echo the approach of my questioner: An assumption that IT must be less expensive than other options and a fevered search for a tool to "prove" the assumption. Note that he didn't ask, "How can I understand my cost versus a cloud provider?" Rather, he asked, "How can I show that our storage is less expensive than AWS?"

I responded by observing that most IT organizations don't really understand their true, fully loaded costs. I recommended he look into a tool such as Apptio to ascertain the total and then compare his organization's costs with the costs of other alternatives. He took no notice of the recommendation and repeated his request for something to "prove" that running its own storage would be cheaper for his organization than using AWS.

This response is unfortunate, and it's going to lead to significant heartache for IT groups in the future. A reflexive insistence on retaining the position of monopoly infrastructure supplier, and a refusal to look to outside suppliers to provide sufficient extra capacity, consigns IT to an ongoing power struggle with users.

It amazes me how many IT organizations persist in assuming (absent any concrete, credible evidence) that they are the low-cost supplier of computing resources. The downside of this approach is that it will render IT irrelevant as users bypass it to directly provision their own cloud-based resources.
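
What would concrete, credible evidence even look like? At minimum, a fully loaded cost per unit of capacity compared against a provider's list price. The sketch below is a back-of-the-envelope model, not a benchmark; every number in it is a hypothetical placeholder, and the cloud figure is illustrative rather than a quoted price.

    # Compare fully loaded on-premises storage cost with a cloud list price, per GB-month.
    # All figures are hypothetical placeholders; the point is to count *everything*
    # (hardware amortization, power, space, support, admin time), not just the purchase price.

    def on_prem_cost_per_gb_month(hardware_usd: float, useful_life_months: int,
                                  capacity_gb: int, monthly_overhead_usd: float) -> float:
        amortized = hardware_usd / useful_life_months
        return (amortized + monthly_overhead_usd) / capacity_gb

    on_prem = on_prem_cost_per_gb_month(
        hardware_usd=300_000,        # hypothetical array purchase
        useful_life_months=60,       # the "five-year" planning horizon
        capacity_gb=200_000,
        monthly_overhead_usd=4_000,  # power, space, support contract, admin time
    )

    cloud_list_price = 0.023         # illustrative per-GB-month object storage price, not a quote

    print(f"on-prem : ${on_prem:.4f} per GB-month")
    print(f"cloud   : ${cloud_list_price:.4f} per GB-month")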

A far better approach is to recognize IT's role: Enabling computing to be performed on behalf of the larger company, with selection, infrastructure and application management, along with cost-effectiveness, the domain of IT. Computing is the objective and infrastructure is the mechanism that supports the objective. In the past, the only way to fulfill that objective was to own and operate infrastructure. Today, owning and operating is no longer a prerequisite to obtain computing services.

Savvy IT organizations will recognize the real question: What's the most cost-effective vehicle to obtain computing resources, no matter where they reside? The answer will dictate the ownership and management of the infrastructure.