The Data Management Survey 20 is Coming

BARC analyst Timm Grosser

Timm Grosser, BARC’s Senior Analyst for Data Management, looks forward to the publication of BARC’s second annual Data Management Survey in September.

BARC’s second annual Data Management Survey will be published this year. On Monday, the field phase (which lasted several months) ended and it’s now over to me to work through the data and write up the findings.

The survey collects users’ opinions on various data management tools and compares the tools based on aggregated user ratings. It is up to me to make the ratings comparable and to comment on them from an analyst’s perspective. I think that is what distinguishes the survey: a combined view of genuine user feedback and analyst assessment. And this year we have the views of around 700 participants to draw upon.

Those of you who are familiar with The BI Survey will quickly find your way around the methodology and evaluation methods.

Peer groups

To ensure that the tools are comparable, we have created comparison groups, which we call peer groups. In these groups, the tools are compared and ranked according to 12 KPIs. Currently, we use the following peer groups:

  • Data warehouse technologies
    Data warehouse technologies prepare, store and provide data for data warehousing purposes.
  • Analytical database products
    Analytical database products prepare, store and provide data for analytical purposes.
  • Data warehousing automation products
    Data warehousing automation products cover data-driven or requirements-driven data warehouse design and implementation. They mainly focus on the simplification and automation of data integration and data modeling tasks.
  • Data management products
    Data management products are tools that help to connect, transport, transform, prepare and enrich, monitor and protect data.
  • ETL products
    ETL products connect, extract, transform and load data from various source systems to a target system for analytical purposes.
  • Global vendors
    Global vendors have a sales and marketing reach through subsidiaries and/or partners which gives them a truly global presence. They are present worldwide and their products are used all around the world.
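BARC’s actual scoring methodology is more elaborate than this, but the core idea of ranking products within a peer group by aggregated user ratings can be sketched roughly as follows (all product names and scores below are invented for illustration, and simple averaging is assumed):

```python
from statistics import mean

# Hypothetical per-user ratings (e.g., 1-10) for one KPI within one peer group.
ratings = {
    "Product A": [8, 9, 7, 8],
    "Product B": [6, 7, 7, 6],
    "Product C": [9, 8, 9, 10],
}

# Aggregate each product's ratings, then rank the peer group, best first.
aggregated = {product: mean(scores) for product, scores in ratings.items()}
ranking = sorted(aggregated, key=aggregated.get, reverse=True)
print(ranking)  # → ['Product C', 'Product A', 'Product B']
```

In the published survey, this kind of ranking is repeated per KPI and per peer group, so a product’s position can differ from one KPI to the next.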

Products included

Survey responses were received for numerous tools. To be included in the published results, however, a product ultimately needs at least 30 answers so that we can present empirically robust results. That leaves us with 12 products, shown below along with their corresponding peer groups.

Data Management Survey 20 products

The KPIs

Within the peer groups, we analyze buying reasons, challenges with the products and the KPIs. We have revised the KPIs since our inaugural edition last year. This year, we have agreed on the following 12 KPIs:

  • Developer efficiency
    Based on how users rate their tool in terms of developer productivity, e.g., for testing, deployment, reusability, ease of coding and use of metadata.
  • Time to market
    Based on how users rate their tool in terms of adaptability (agility to adapt to new requirements).
  • Innovation power
    Based on how users rate their tool in terms of innovative strength (amount of innovative functionality in the tool, market trend adoption time and rate).
  • Price to value
    Based on how users rate their tool in terms of price-to-value ratio.
  • Performance
    Based on how users rate their tool in terms of performance (query performance, load performance, processing performance).
  • Platform reliability
    Based on how users rate their tool in terms of platform reliability (i.e., stability, functional reliability, monitoring capabilities).
  • Support quality
    Based on how users rate their tool in terms of support quality (e.g., availability, geographic coverage, support channels, effectiveness and efficiency, reaction time).
  • Openness
    Based on how users rate their tool in terms of openness and integration options (connectivity to data sources and interfaces for integration with other applications).
  • Breadth of supported use cases
    Based on how users rate their tool in terms of the range of use cases it supports.
  • Functionality
    Based on how users rate their tool in terms of functionality (i.e., capabilities and functional scope).
  • Product satisfaction
    Based on the proportion of users that say they are satisfied with their product.
  • Recommendation
    Based on the proportion of users that say they would recommend the product to others.

In the meantime…

I am currently coordinating briefing dates with the relevant vendors. I need the briefings to get a deeper understanding of the user ratings per KPI. Sometimes it seems there is a gap between expectation and implementation. In addition, there will be a short description of each vendor and tool included in the survey, so the briefings ensure this information is up to date.

The survey is due to be published in September this year, and until then there is still a lot of analysis and writing work for me to get through. But I am looking forward to digging into the products and survey results in detail.

To keep you in the loop until we release the full results, I plan to write a short summary for each briefing, like I have done for Dremio and Silwood Technologies in recent weeks.