Tableau Shifts from Visual Analytics to 'Complete Data Play'


AUSTIN — Texas may be known for its skilled horse and cattle handlers, but it is data wrangling that visitors to the state capital are talking about this week. More than 13,000 data enthusiasts from all over the world have descended here for Tableau’s annual user conference.

This is a passionate, loyal crowd that has what Constellation Research analyst Holger Mueller called “an almost cult-like dedication to the vendor and its products.”

It is a group that keeps growing. Last week, in a call with investors, Tableau Chairman and Co-Founder Christian Chabot reported the company’s highest-ever quarterly revenue and an expanded customer base of more than 50,000 accounts worldwide.

While figures like that may make investors happy, the crowd here this week is more interested in Tableau’s new tools and how they can use them to get a better handle on their data.

Tableau’s Strategic Shift

People who aim to make information visually meaningful are unlikely to be disappointed. Ditto for end users who want to make sense of the volume, variety and velocity of data coming at them.

CMSWire caught up with Tableau’s Chief Product Officer Francois Ajenstat before he stepped onto the keynote stage today for a preview of Tableau’s product roadmap.

Ajenstat wouldn’t single out any specific new product, noting “I love all of my children equally.” 

But he did cite an important strategic shift: “We are broadening Tableau from a visual analytics product to a complete data platform.”

In other words, Tableau customers will be able to come to one vendor to connect, explore, prepare, manage, analyze, share and govern their data instead of using multiple vendors. While Tableau has no intention of stepping on its partners’ toes, “there will be some overlap,” Ajenstat admitted.

Tableau’s Product Roadmap

Here are some of the innovations Tableau executives, including new CEO Adam Selipsky, will talk about today.

HyPer Delivers Analysis Without Limits: Yesterday’s data engines weren’t built for today’s world, where the ability to analyze data barreling in from the Internet of Things (IoT) and streaming sources is quickly becoming a must. In March, Tableau acquired HyPer, an in-memory relational DBMS built for mixed online transaction processing (OLTP) and online analytical processing (OLAP) workloads. As Tableau’s new data engine, HyPer allows interactive analysis of data sets large and small and enables fast data ingestion for near real-time analysis.

This is a game changer, according to Ajenstat, because users will be able to glean insights from billions, rather than millions, of records — and do it just as quickly. “Everybody can go big, but how fast can you get a response is the important question,” Ajenstat told CMSWire.

Tableau Gets into Data Prep: Until now, some of Tableau’s most demanding users have gone to vendors like Alteryx, Datawatch and Paxata for data preparation. Soon that may not be necessary. This morning Tableau announced the addition of self-service data preparation to its suite of products. Its code name? Project Maestro. If you’re a Tableau customer, click on the link to see it. Ajenstat was careful not to go toe-to-toe with the aforementioned providers, suggesting that their customers might benefit from a hybrid approach.

Talk to Your Data: Tableau has championed the idea of democratizing data by delivering free, easy-to-use products for building data visualizations (vizzes), like Tableau Public and Vizable. Today the company is looking at data democratization in a whole new way. “We are using Natural Language Processing (NLP) to bring new ways to interact with data through human language such as voice and text,” said Ajenstat.

Self-Service Analytics: We’re not always at our best when we’re looking for discoveries in data. We browse data sets, check them out or simply take them, and we don’t always put them back or document our steps, creating the potential for data breaches and other problems along the way. Better governance needs to be built in. Tableau seems to be doing exactly that with new functionality in Tableau Server that certifies data sources, conducts impact analysis on sources and workbooks, and lets users promote content and write workflows with simple drag-and-drop gestures.

Machine Learning Recommendations: Data workers look to discover answers and make recommendations from every possible angle, yet somehow there is almost always something they miss. Tableau intends to keep this to a minimum via machine learning algorithms that will surface recommendations for workbooks and data sources that are trusted, highly used and contextually relevant to individual workflows.

So Much More

Those are just some of the highlights from what is being discussed on stage. You can watch it for yourself here.