Salesforce Wave: Cloud-based analytics that 'should feel like Angry Birds'
Salesforce execs thrown to the lions in a press Q&A session. SVP of analytics says beta testers laughed when they used Wave
Salesforce Wave, the new cloud-based analytics service from CRM firm Salesforce.com, should feel like popular mobile app Angry Birds.
That's one of the more surprising findings to come out of a Q&A session at Dreamforce in San Francisco today. The top questions and most interesting answers from the session are below.
The answers were provided by Keith Bigelow, SVP and GM of the Salesforce analytics cloud.
Q: Why should customers choose Wave and not Tableau?
Keith Bigelow: Our whole brand is about customer success. We found in our beta programme for Wave that customers came to us from legacy BI vendors, platforms which were built in the late '80s. Some of these platforms might be German, and you might know who I'm talking about here [SAP]. Some even came from Tableau, and they came to us and said they weren't achieving their goals. It's all about helping them achieve something they can't achieve right now. So it was customers who weren't achieving their goals who reached out to us, and our customers were begging us to enter this market.
What sort of company should use Wave?
Our installed base, customers who have already chosen Salesforce, are great examples of the type of firm that will choose this. They've already realised the benefit of cloud, and the divisions within that company will be sales, service and marketing, although we've done finance and other groups in the pilot. They want to change the time-to-value equation for analytics applications, especially when it comes to behavioural CRM data: multi-party, multi-type data. If you want to know all the behaviours of your customers in order to sell to them better, those are the scenarios that resonate right now with our platform.
How do you differentiate from the competition?
The platform is the huge differentiator. How we store data is also key. We store the data in key-value pairs, then it's compressed and encoded, which allows us to store masses of data. That capacity, combined with taking a search-based, Google-like approach as opposed to the traditional relational approach from the late '80s, is a major differentiator. If I tried to sell a product like a notebook, which might have five to 15 attributes, alongside a Dell laptop, which might have thousands, and you store those records in a relational database, then you have 800 columns. You have five to 15 entries in one row and 800 in the next, so you won't be able to ask questions of all of those attributes. You have lots of waste in the catalogue and you're loading structures in the database which are largely empty and inefficient.
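To make the sparse-attribute point concrete, here is a minimal illustrative sketch in Python, not Salesforce's actual storage engine, of the same two catalogue records held as key-value pairs versus as rows in a wide relational table. The field and column names are invented for illustration.

```python
# Illustrative sketch (not Salesforce's actual engine): the same two catalogue
# records stored as sparse key-value pairs versus rows in a wide relational table.

# Key-value style: each record keeps only the attributes it actually has.
notebook = {"sku": "NB-001", "pages": 200, "cover": "hardback"}
laptop = {"sku": "DL-042", "cpu": "i7", "ram_gb": 16, "screen_in": 14, "weight_kg": 1.4}

# Relational style: every row must carry a column for every attribute that any
# product in the catalogue might have, so most cells sit empty (NULL).
columns = ["sku", "pages", "cover", "cpu", "ram_gb", "screen_in", "weight_kg"]
rows = [
    ["NB-001", 200, "hardback", None, None, None, None],  # notebook: 3 of 7 filled
    ["DL-042", None, None, "i7", 16, 14, 1.4],            # laptop: 5 of 7 filled
]
```

The key-value records grow only with the attributes each product actually uses, while the relational rows waste space on columns most products never fill, which is the inefficiency Bigelow describes.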
How do you handle non-native [ie non-Salesforce] data? We have 2,400 AppExchange customers; that data is in the Salesforce cloud and is natively available to any analytics. We also have six data integration partners, like Informatica. They're optimised so that rather than targeting a legacy data warehouse, they can point the ETL [Extract, Transform, Load] workflow straight into the analytics cloud. You can mash that data together easily. That's not just Salesforce data or data from our other partner apps, but all of their connectors as well, so for example it's useful for financial services, where Informatica has brilliant adaptors to that data.
How close can customers get to real-time analysis?
That depends on what data you're referring to. Some people want data at a high frequency, others want data sets at a lower frequency, even daily loads. It's up to the customer to choose the frequency.
How did the pilot customers react to the product?
This is the first product I've created where users giggled - they laughed when they used it. We changed the interface to optimise for them based on their feedback. It should feel more like Angry Birds than getting angry at IT because you can't get the report you want.
Do you see this competing against Splunk and others?
We're highly focused on our installed base. We've seen some discussion from other vendors saying we're late to market. We have a massive installed base clamouring for this, and we're highly focused on taking it to them first. We don't have a competitive attack programme aimed at other vendors. The customer will make the decision.
Some customers' Salesforce data isn't very clean - will this be a catalyst for them to clean up that mess, and what kinds of tools are you offering to help?
This is about data governance and data quality. We have a group called Data.com, which is in the business of both cleansing and augmenting data, and removing duplicates. If data is coming from outside our landscape, so if it's not something we can easily run through our system, we have partners like Informatica who are great at that: they can cleanse and augment, then bring the data into our system. That's the non-native data coming from outside Salesforce.
Lots of data being generated just looks normal, but firms are usually more interested in the exceptions, so what are the plans for those scenarios?
We can bring in any of those types of data, and once you bring it into the Salesforce analytics cloud it cleaves the data off in chunks, so we keep the speed up along with the scale and performance. What the data is actually used for is very specific, case by case.
But there is an exception-based approach where you bring in just the deltas that are important. Or you might want all the granular data with all the detail. Often it's 99.94 per cent green and the rest red, but you want all the details so you can trend over it. So it comes down to the use cases, and we have products for both situations.
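As a rough illustration of the two loading styles Bigelow contrasts, here is a minimal Python sketch; the function and field names are hypothetical and not part of any Salesforce API.

```python
# Illustrative sketch (hypothetical helpers, not a Salesforce API): two ways
# a customer might feed an analytics store, as described above.

def load_exceptions(rows):
    """Exception-based load: push only the deltas that matter (e.g. the small
    fraction of records flagged red), keeping the dataset small."""
    return [r for r in rows if r["status"] == "red"]

def load_full_grain(rows):
    """Full-detail load: push every row so analysts can trend over the
    complete history, at the cost of a much larger dataset."""
    return list(rows)

readings = [
    {"sensor": "s1", "status": "green", "value": 21.4},
    {"sensor": "s2", "status": "red", "value": 98.7},
]
print(load_exceptions(readings))   # only the red record
print(len(load_full_grain(readings)))  # every record
```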
Do you use the Oracle database for all types of analytics?
If you already use Salesforce to produce reports and dashboards, then all of that is in Oracle. If you say ‘I'm going to deploy the analytics cloud and I'll analyse sales data', then we remove that data from Oracle, as the relational structure is insufficient for the performance we want. We move it into the analytics cloud - it's actually in the same data centre, but not in Oracle. We manage the transition ourselves, to put it into our compressed engine so we can quickly load those volumes of data. That does not count against your Service Cloud storage, so there's no harm or impact to the customer in their storage requirements. The database we use for that is a Salesforce NoSQL store - we own the stack.
With analytics cloud, are there limits to the speed and performance you can achieve with data?
The honest answer is there are always limits. In terms of how people bring data in - sensor data, for example - we have an API for customers and integration partners to load to. That allows you to stream far larger datasets into the system. As you approach several billion rows, that will affect the cost, but from a scale perspective we don't limit customers and say stop at 100 billion rows; we say bring it on in.
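For readers curious what such a bulk-load path might look like in practice, here is a minimal sketch of batching rows into an ingest API. The endpoint, token and payload shape are assumptions made for illustration, not the actual Wave loading API.

```python
# Minimal sketch of streaming a large dataset in batches to an analytics
# ingest endpoint. The URL, token and payload shape below are hypothetical.
import requests

ENDPOINT = "https://analytics.example.com/api/datasets/sensor_readings/rows"  # assumed
TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder credential
BATCH_SIZE = 10_000

def stream_rows(rows):
    """Send rows in fixed-size batches so a very large load never has to
    fit into a single request."""
    for start in range(0, len(rows), BATCH_SIZE):
        batch = rows[start:start + BATCH_SIZE]
        resp = requests.post(
            ENDPOINT,
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"rows": batch},
        )
        resp.raise_for_status()  # stop if the service rejects a batch
```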