At a glance
Legacy analytics can hamstring cloud success
As enterprises migrate their ERP, CRM, and HCM data to the cloud with Snowflake, they need an agile business intelligence solution that serves all of their users' requirements with a minimum of complexity and cost.
Unfortunately, for many companies, delivering enterprise data to business users still requires a major commitment of resources: weeks of ETL coding, testing, and iteration between domain experts and data engineers. During this time no value is delivered, while big assumptions are made about what users really need.
And when all this work is done, actually putting the data in front of decision makers who can use it means even more discussion, iteration, and training. Despite more than 20 years of claims around end-user self-service, business users at the end of legacy data analysis pipelines still get predetermined views of the data, delivered to them only with enormous effort. They cannot drill down into detailed data views, and they cannot freely explore new data pathways as business conditions change.
There must be a better, faster way to curate enterprise data and deliver it, with speed and simplicity, to every decision maker who can use it.
The first breakthrough in analytics in over 20 years
Many analytics use cases work fine against aggregated data, and this has been the usual model for decades. However, some use cases, such as supply chain, healthcare, and insurance analytics, benefit greatly from the ability to drill down seamlessly to transaction-level detail for root-cause analysis.
This means having the detailed transactional data available to the analytics system, which in the past has meant costly custom programming and limited use cases.
In addition, for any type of business reporting, running summary and detail reports against different data sets or different systems almost always leads to inconsistency: a summary table refreshed nightly, for example, will not match detail pulled from the live transactional system.
The Incorta Direct Data Mapping™ engine delivers very fast query performance without the need to pre-aggregate, reshape, or transform the data in any way. Summaries and rollups are computed on the fly against full-fidelity business data.
This means that Incorta’s analytics engine provides both aggregates and very fast drill-down capabilities against a single data set, eliminating inconsistencies. This makes exploration, discovery, and drill-downs a breeze, and allows users to find individual transactions and immediately take corrective action.
Incorta allows users to ask any question and explore in any direction immediately, even against billions of records, without waiting for new models or requesting new data sets.
For analytics use cases that can't be served by aggregated data, Incorta accelerates any BI tool, with no copies, transformations, cubes, or optimizations of any kind.
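To make the single-data-set idea concrete, here is a minimal sketch in Python (using pandas, with hypothetical column names; this is an illustration of the principle, not Incorta's engine). A summary is computed on the fly from full-fidelity transaction rows, and the drill-down filters those same rows, so the two views cannot disagree:

```python
import pandas as pd

# Full-fidelity transaction-level data (hypothetical columns, illustrative only).
transactions = pd.DataFrame({
    "region":   ["EMEA", "EMEA", "APAC", "APAC", "APAC"],
    "order_id": [1001, 1002, 1003, 1004, 1005],
    "amount":   [250.0, 4800.0, 125.0, 310.0, 95.0],
})

# Summary computed on the fly from the detail rows -- no pre-aggregated table.
summary = transactions.groupby("region", as_index=False)["amount"].sum()
print(summary)

# Drill-down: the same detail rows answer the follow-up question,
# so summary and detail cannot drift apart.
emea_detail = transactions[transactions["region"] == "EMEA"]
print(emea_detail)
```

Because both answers come from the same detail rows, the aggregate and the drill-down stay consistent by construction; Incorta applies the same principle at the scale of billions of records.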
A typical deployment
One of Incorta's key customers uses Snowflake as a single cloud repository of all historical and current business data, including periodic snapshots. The data is cleaned, harmonized, and enriched but kept in third normal form (3NF) for maximum fidelity and flexibility. The Snowflake database takes the place of a data lake, providing massive storage capacity plus the flexibility to handle JSON and other semi-structured data.
The Incorta Snowflake Connector provides access to all Snowflake data, transferring the data needed for analysis into a disk-based cache every 15 minutes. Direct Data Mapping analyzes the data upon landing, and the data is compressed and loaded into the in-memory analytics engine.
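The connector's internals are not public, but the 15-minute refresh pattern can be approximated with the standard snowflake-connector-python package; the connection parameters, table name, and watermark column below are illustrative assumptions, not the customer's actual configuration:

```python
import snowflake.connector

# Placeholder connection details -- substitute a real account, credentials,
# warehouse, and database before running.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="ANALYTICS_WH",
    database="ERP",
    schema="PUBLIC",
)

def refresh_cache(last_load_ts: str):
    """Pull only rows changed since the previous 15-minute cycle
    (hypothetical GL_TRANSACTIONS table with a LAST_UPDATED watermark)."""
    cur = conn.cursor()
    cur.execute(
        "SELECT * FROM GL_TRANSACTIONS WHERE LAST_UPDATED > %s",
        (last_load_ts,),
    )
    df = cur.fetch_pandas_all()            # requires the pandas/pyarrow extras
    df.to_parquet("gl_transactions.parquet")  # disk-based cache layer
    return df
```

A scheduler would call a routine like this every 15 minutes; in the actual deployment, the Incorta connector handles the scheduling, incremental detection, compression, and load into the in-memory engine.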
Incorta serves this customer's 3,000-user community with true self-service data discovery and analytics, along with pre-configured dashboards and reports for data consumers. Business analysts use their own third-party tools against Incorta's query engine, enjoying both vastly accelerated queries and the ability to drill into transactional detail at will. The customer's IT group has removed itself from the drudgery of producing simple reports and dashboards, and has been able to handle increased user loads while keeping a lid on project costs.
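Assuming the third-party tools reach Incorta through a PostgreSQL-compatible SQL interface (an assumption for this sketch; the host, port, schema, and table names are placeholders rather than documented defaults), an analyst's summary-plus-drill-down workflow could look like this:

```python
import psycopg2

# Placeholder connection to an Incorta SQL endpoint exposed over the
# PostgreSQL wire protocol; values below are hypothetical.
conn = psycopg2.connect(
    host="incorta.example.com",
    port=5436,
    dbname="sales_schema",
    user="analyst",
    password="...",
)

with conn.cursor() as cur:
    # The same engine serves the aggregate...
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    print(cur.fetchall())

    # ...and the transaction-level drill-down, from one data set.
    cur.execute("SELECT * FROM orders WHERE region = %s LIMIT 100", ("EMEA",))
    print(cur.fetchall())
```

Any BI tool that speaks the same protocol could issue equivalent queries, which is how analysts keep using their tools of choice while the heavy lifting happens in Incorta.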