DEEP DIVE

Migrating BI Tools, Part One - Assessing the Reasons and Risks

Evan Rusackas

In the ever-evolving world of analytics tooling, and more specifically Business Intelligence (BI), staying ahead often means embracing change. With the rapid shift in the BI landscape, fueled by significant acquisitions, the rise of open-source platforms, and the need for modernization, many organizations are considering the vital but challenging process of BI tool migration.

Obviously, transitioning from one BI tool to another is no simple task. It involves more than just moving data; it's about navigating the intricacies of chart property mapping, understanding differences in dashboard design, and ensuring as seamless a shift as possible for end users. Not to mention the economic considerations and the desire to break free from vendor lock-in.

In this two-part blog post (part two available here), we will deep-dive into the compelling reasons for migration, the complexities it entails, and how to strategize your move. This is not a surface-level guide, but a comprehensive, strategic blueprint for those ready to tackle a BI tool migration head-on. We'll focus on a phased approach, highlighting the importance of prioritizing key assets, managing user transition, and leveraging migration in negotiation. So, whether you are on the brink of migration or already in the process, this guide will provide actionable insights to aid your transition journey.

When and Why to Migrate BI Tools

The following are some of the most likely catalysts for migrating BI tools:

  1. Functionality and Features: If your current BI tool doesn't have the required features or functionality, such as forecasting, particular data visualizations, or integrations with your data stack, the team might consider switching to another tool that provides these features.
  2. Cost: BI tools can be expensive. If a more cost-effective tool that meets the organization's requirements is available, the team (or much of the team) might opt to migrate to it. This could also include the cost of renewing licenses, costs associated with scaling, or unexpected expenses due to a lack of necessary features.
  3. Ease of Use: Some BI tools are more user-friendly than others. If the current tool requires significant technical expertise or slows down a common workflow, the company might choose to switch to a more intuitive, user-friendly, or efficient tool, reducing the time and resources spent on training and on frequent workflows.
  4. Integration with Existing Systems: If the current BI tool isn't integrating well with the company's existing systems (such as databases, data lakes/warehouses, data catalogs, reverse ETL processes, etc.), it could become essential to migrate. This is especially true if the rest of your data ecosystem and workflows are evolving alongside your BI tooling. A tool that allows for a smoother integration can help streamline operations or even entirely unblock data teams.
  5. Scalability: As a business grows, its data needs tend to evolve and expand. If a BI tool cannot scale or adapt to increasing data volumes or user numbers, a more scalable solution may be needed.
  6. Performance: If the current tool isn't able to deliver the required speed, performance, caching, or data processing capabilities, it might be necessary to move to a more powerful tool.
  7. Vendor Support and Community: If a vendor's customer support is lacking, or if the tool lacks a vibrant community for help and shared knowledge, these could be reasons to consider a switch.
  8. Security: If the current tool does not provide robust data security features, a company might switch to a tool that has stronger security mechanisms, particularly if the company is dealing with sensitive data.
  9. Regulatory Compliance: Some businesses operate in heavily regulated industries, where data handling and processing need to comply with specific laws and standards. If the current BI tool doesn't support these compliance needs, the team might have to switch to a tool that does.
  10. Technology Updates or Migration: With the rapid evolution of technology, sometimes older tools become obsolete or are no longer supported. Companies might need to migrate to a newer tool to stay current and gain access to advanced capabilities.
  11. Workforce Preference: As the competitive landscape and internal user base evolve, different people in your organization may prefer certain tools over others, as they may be more targeted to their level of sophistication or more suited towards a certain vertical.

The Wake of Acquisitions in the BI Market

The BI product landscape has been in a state of flux in recent years, with a number of major acquisitions and changes in ownership. In 2019, Salesforce acquired Tableau, and Google acquired Looker. These acquisitions have been interpreted by many as a means for larger companies to gain a foothold in the BI market and compete with Microsoft Power BI.

In other market moves within recent memory, Chartio was shut down in March of 2022 following its acquisition by Atlassian, forcing its users off the platform. Periscope Data merged with Sisense, leaving the future of both products somewhat uncertain. Most recently, Mode Analytics was acquired by ThoughtSpot in June of 2023. Each of these vendors was once seen as an innovative startup, but they have struggled to compete with the larger, more established players in the market, and their user bases have been forced to make pressing decisions about the future of their business intelligence solutions.

Overall, the recent changes in the BI landscape have created an environment where businesses are more likely to consider migration due to a perceived slower pace of innovation, a lower quality of support post-acquisition, or, in the worst case, a fast sunset. Open-source BI products like Apache Superset sidestep these concerns, and their additional benefits of affordability, customizability, and community support make them an increasingly attractive option for businesses.

The Danger of Vendor Lock-in

Vendor lock-in represents a substantial risk for customers as it ties their operations tightly to the future of a specific platform. This situation exposes them to a range of uncertainties, including sudden changes in pricing, termination of services, declining quality, or alterations in platform strategy that may not align with their objectives. Moreover, the use of proprietary modeling schemas and tooling, such as LookML or Tableau Extracts, can intensify data portability issues. Consequently, this can lead to considerable economic costs, as well as labor costs due to the significant rework needed to migrate or adapt to other systems. Open-source software offers a compelling alternative, enabling easier migration between commercially hosted or self-hosted applications. By embracing open standards and transparent tooling, customers can ensure data portability, minimize the risks and costs of lock-in, and retain more control and flexibility over their projects for the long haul.

To make matters worse, many vendors also tend to sell companies on their entire monolith of tooling (“vertical sprawl”) to lock them in further. Open-source software instead offers a collection of discrete tools that play to their strengths, offering “best in breed” or better-fitted solutions, and allowing you to build a healthier, more diverse, and more portable ecosystem of tools.

Democratizing Access… For the Long Run

Make no mistake, the future of BI (and indeed the entirety of the Modern Data Stack) is increasingly moving towards open source, with a number of key factors driving its long-term success:

  1. Democratization of Access: Open source software is typically free to use and modify, allowing more individuals and organizations to access and analyze data without requiring large budgets. This democratization can enable many small and medium businesses to leverage BI tools that they may not have had access to otherwise.
  2. Distributed/Democratic Governance: The best open-source projects have clearly defined, distributed, evolving, and diverse governance. This is the main value provided by organizations such as the Apache Software Foundation (ASF). It ensures that any organization or individual can influence the direction of the project through a healthy, meritocratic process. Arguably, this model is more predictable and representative of everyone's interests when compared to more capitalistic, vendor-driven models.
  3. Collaborative Improvement and Innovation: Open source projects often benefit from the contributions of a global community of developers and users. This collaborative process can often lead to more innovative solutions and quicker bug fixes than a traditional, closed-source development model. It also means that the larger an open-source project grows, the more assured its long-term support becomes.
  4. Transparency and Trust: Open-source BI software allows for complete transparency as users can inspect the code to ensure there are no hidden processes that could compromise data. This transparency can result in a higher level of trust in the software, which is particularly important when it comes to data analysis and the decisions made based on this analysis.
  5. Interoperability and Customization: With open-source BI software, businesses have the ability to tailor the software to their specific needs, instead of being constrained by a predefined set of features. Moreover, open-source BI tools generally follow open standards, enhancing interoperability and integration with other systems.
  6. Competitive Landscape: More and more companies are recognizing the benefits of open-source software, including tech giants like Microsoft and Google. The increasing adoption of open-source solutions in business environments, and their proven success, can lead to a more competitive market, which further encourages innovation and cost-effectiveness.

It's important to note that as BI increasingly moves towards open source, it doesn't mean that commercial offerings will become obsolete. Proprietary software can still live alongside open-source software to fulfill specific or niche needs if the budget supports it. And while some organizations lack the technical expertise to implement and maintain open-source software themselves, commercially hosted versions of open-source projects (like Preset's offering of Apache Superset) fulfill that need as well.

Modernization

The landscape of Business Intelligence (BI) is undergoing a significant transformation, moving away from rigid, outdated structures to more flexible and innovative solutions. This modernization is characterized by several key trends:

Breaking the Monoliths

Traditional BI systems often relied on monolithic architectures (e.g. Cognos, MSFT SQL Server, Business Objects, the wider Tableau stack, and others), where various components were tightly coupled and interdependent. Today's trend is towards a more modular approach known as the "modern data stack." This approach leverages specialized tools for extraction, storage, transformation, analysis, and visualization, allowing for greater scalability, adaptability, and task-specific optimization. By breaking down the monoliths, organizations can create customized data infrastructures that are more agile and responsive to changing needs.

Beyond the Desktop Era

Many BI tools were initially designed for the desktop era, and their migration to the cloud has been fraught with challenges. Retrofitting these tools for the cloud often involves complex trade-offs and can miss out on a decade of advancements in frontend technology. For instance, Tableau is rumored to have built a complex WebAssembly compatibility layer to run its desktop application in the browser; approaches like this (reminiscent of the Flash or Java applets of yesteryear) carry real trade-offs and forgo much of the progress made in the frontend world (npm, React, and other advancements and conveniences in frontend engineering).

Modern cloud-native tools like Apache Superset are built on cutting-edge technologies like React, AntD, Flask, and countless others across the Python and npm ecosystems, offering a set of modern and dynamically evolving building blocks. In contrast, desktop-era tools may struggle with compatibility layers and outdated technologies, hindering their performance and adaptability.

Break Free of Obsolete/Proprietary Cubes and Extracts

Legacy BI tools often relied on proprietary storage and compute layers, such as Tableau Extracts and MicroStrategy cubes. While these layers may have offered some advantages in the past, they are increasingly seen as problematic for several reasons:

  • Data Duplication: Storing data in additional layers adds complexity, delays, and potential points of failure.
  • Performance at Scale: Modern databases like Druid, ClickHouse, and Pinot blow proprietary cubes and extracts out of the water. Managed cloud solutions like BigQuery and Snowflake do as well, while also offering the convenience that comes with cloud, pay-as-you-go pricing.
  • Real-time Challenges: Legacy caching solutions often struggle to keep up with the rise of real-time systems.
  • Live-Mode Limitations: Direct database access has often been treated as an afterthought, with key features disabled or slow to implement.

The obsolescence of these proprietary layers is driving a shift towards more efficient and flexible solutions that leverage the best of modern database technology.

Skill & Talent Availability

As technology evolves, the skills required to manage and utilize BI tools also change. Legacy tools may require expertise that is becoming increasingly rare, potentially leading to talent shortages. Conversely, the rising workforce is often more interested in acquiring skills in up-and-coming tools and solutions. Organizations must recognize this shift and invest in training and development to ensure that their teams are equipped with the skills needed for modern BI tools. Embracing modern solutions not only aligns with the preferences of the current talent pool but also ensures that the organization remains competitive and agile in an ever-changing landscape.

A Clean Slate

Over time, like any system, BI platforms can become cluttered with redundant or outdated data, obsolete reports, and unused dashboards… basically "junk." This clutter can hamper productivity, lead to erroneous insights, and create inefficiencies in decision-making processes. Migrating to a new BI tool presents an opportunity to wipe the slate clean and reevaluate not only the data itself but also the ways in which data is used within an organization.

Declaring bankruptcy on an aging BI tool opens up a number of benefits to an org. Transitioning to a new BI tool allows a team to review, streamline, and optimize their existing data processes. It’s a chance to embrace “data hygiene,” ensuring that only the most relevant, accurate, and up-to-date data is utilized. Last but not least, a new tool can reinvigorate a team's engagement with data, providing an opportunity for training, exploration of new features and capabilities, and fostering a renewed sense of ownership and responsibility for data amongst the team.

The Challenges of BI Tool Migration

Let’s be honest - packing up and moving between BI tools is not a walk in the park. BI tools are notoriously difficult to move away from, not only because they are often deeply integrated into a company's processes and systems, but also due to the intricacies of data visualization which can vary widely between platforms. Let’s delve into the nuances of chart property mapping, data access and semantic layer compatibility, dashboard composition and layout, and the critical role of user education in successful migration. We'll also discuss the limitations of automation in the migration process, highlighting why human expertise remains indispensable. Lastly, we'll explore options for seeking assistance from System Integrators (SIs) or specialized service providers to help navigate this complex transition.

Chart Property Mapping

Chart properties are an integral part of any BI tool, acting as the blueprint that lays out how data points correspond to the elements of a chart or graph. This mapping is the translation of data into visual form: data categories are assigned to the axes, colors, sizes, and shapes that you see in a graphical representation. Each BI tool has its own way of interpreting and visualizing data, with unique methodologies and algorithms driving these processes.

These variations can pose a real challenge when migrating between BI tools. You cannot simply import the charts from one platform into another and expect them to function and look the same. Every element of the visualization needs to be remapped according to the conventions and capabilities of the new tool. Moreover, some tools might not support certain visualization techniques at all, requiring a rethink of how best to represent the data. This creates an additional layer of complexity, making the migration process more demanding and meticulous.
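To make the problem concrete, here is a minimal sketch in Python, assuming a hypothetical export format for the source tool and a simplified, Superset-flavored target. The field names and the mapping table are illustrative only; a real migration involves many more properties, and plenty of them have no counterpart at all.

```python
# A simplified, hypothetical chart definition exported from a legacy BI tool.
source_chart = {
    "type": "stacked_bar",
    "x_axis": "order_month",
    "y_axis": {"field": "revenue", "aggregation": "SUM"},
    "split_by": "region",
    "palette": "corporate_blue",
}

# Rough mapping of source chart types onto Superset-style visualization types.
# A `None` entry means there is no direct equivalent in the target tool.
VIZ_TYPE_MAP = {
    "stacked_bar": "echarts_timeseries_bar",
    "donut": "pie",
    "bullet_gauge": None,
}

def translate_chart(chart: dict) -> dict:
    """Translate a source chart spec into Superset-flavored params (sketch only)."""
    viz_type = VIZ_TYPE_MAP.get(chart["type"])
    if viz_type is None:
        # Unknown or unsupported chart types need a human to rethink the visualization.
        raise ValueError(f"No direct equivalent for chart type {chart['type']!r}")
    metric = chart["y_axis"]
    return {
        "viz_type": viz_type,
        "x_axis": chart["x_axis"],
        "groupby": [chart["split_by"]],
        "metrics": [f"{metric['aggregation']}({metric['field']})"],
        "color_scheme": "supersetColors",  # color palettes rarely map one-to-one
    }

print(translate_chart(source_chart))
```

The interesting part is the `None` entry: any chart type without a reasonable equivalent falls out of the automated path and lands back on a human's desk for redesign.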

Data Access and Semantic Layer Compatibility

A semantic layer is an abstraction that sits between the raw data and the end-users, allowing non-technical users to interact with the data without needing to understand the underlying database structure or write complex SQL queries. It simplifies data access by translating complex data into business terms that users understand.

However, assumptions around data layout can differ drastically across various tools, adding to the complexity of migration. Each BI tool utilizes its own specific semantic layer, such as LookML in Looker, the Universe Design Tool in BusinessObjects, or BISM in Power BI, to name a few. These semantic layers often contain proprietary code or structures that define the relationships between data, metrics, and dimensions. Consequently, what works in one system may not translate well, or at all, to another.

For instance, Apache Superset works on a dataset-centric model that assumes much of the data modeling will be done upstream using tools like dbt or Cube. It provides a thin semantic layer for small transformations and definitions but heavily relies on the integrity of pre-modeled datasets. This approach contrasts with other tools that may assume more flexibility within their own semantic layers.
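As a rough illustration of what that thin layer can look like, the sketch below is a Python dict loosely mirroring the shape of a Superset dataset definition, with simplified and partly hypothetical keys, layered on top of a table assumed to be modeled upstream by dbt.

```python
# Hypothetical sketch of a "thin" semantic layer on top of a pre-modeled table.
# The heavy lifting (joins, cleaning, grain) is assumed to happen upstream in dbt;
# the BI-side dataset only adds business-facing metrics and labels.
dataset = {
    "table_name": "fct_orders",   # assumed to be produced by an upstream dbt model
    "schema": "analytics",
    "metrics": [
        {"metric_name": "total_revenue", "expression": "SUM(revenue)"},
        {"metric_name": "order_count", "expression": "COUNT(DISTINCT order_id)"},
    ],
    "columns": [
        {"column_name": "order_month", "is_dttm": True},
        {"column_name": "region", "verbose_name": "Sales Region"},
    ],
}
```

Migrating from a tool whose semantic layer also encodes joins, access rules, or caching behavior means deciding which of those responsibilities move upstream into the warehouse and transformation layer, and which get re-expressed in the new tool.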

The migration process must then take into account these different assumptions, ensuring the compatibility of data access and semantic layers between the old and new BI tools. A change in the tool could mean completely redesigning the data model or the manner in which users interact with their data. Therefore, understanding the differences in data layout, access, and the semantic layer is critical to mitigate any complications during a BI tool migration.

Dashboard Composition and Layout

Dashboards are visual interfaces that present a curated collection of data visualizations in an intuitive, accessible way, helping drive decision-making processes. However, the design and capabilities of dashboards can vary greatly from one tool to another, offering varying levels of customization, interactivity, and control over the dashboard layout. For instance, the placement and interaction of widgets, charts, and key performance indicators (KPIs) might differ, with some tools offering drag-and-drop functionality, while others require more manual input. Moreover, the complexity of designs, from simple grid layouts to more elaborate, free-form designs, can be tool-specific, leading to reconfiguration requirements during migration.
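To see why layout rarely transfers one-to-one, here is a heavily simplified sketch, loosely inspired by the grid-based position tree Superset uses for dashboards; the keys and sizing values are illustrative rather than an actual export format.

```python
# Hypothetical, simplified grid layout: rows of charts with explicit widths
# (on a 12-column grid) and heights. A free-form, pixel-positioned dashboard
# from another tool has no natural mapping onto this structure and typically
# needs to be recomposed by hand.
dashboard_layout = {
    "ROW-1": {"type": "ROW", "children": ["CHART-revenue-trend", "CHART-top-regions"]},
    "CHART-revenue-trend": {"type": "CHART", "meta": {"width": 8, "height": 50}},
    "CHART-top-regions": {"type": "CHART", "meta": {"width": 4, "height": 50}},
}
```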

Further adding to this complexity are unique features that some BI tools offer. For example, Apache Superset provides a "Drill By" feature that allows users to explore data layers in more depth directly from the dashboard. It also supports dashboard-level filters, tabs, and cross-filters, enabling users to interact with multiple related datasets simultaneously and create more intricate analysis pathways. And that's without mentioning the complexities around how individual dashboard filters can be configured and scoped to affect only subsets of charts.

When migrating to a new tool, these features may not be directly replicable, so adapting existing dashboards to new environments often involves more than a simple one-to-one translation; it may necessitate a complete redesign to accommodate the new tool's structure and capabilities.

User (Re)Education

End-user adoption is inherently dependent on education and training, a critical factor that is often overlooked during the migration process. BI tools, with their complex interfaces and myriad features, have a steep learning curve. Transitioning to a new tool implies that end-users must adapt to a new environment, often significantly different from the one they are accustomed to.

Each BI tool has its unique structure, navigation, and functionalities. Even small variations in interface design or feature layout can disrupt a user's workflow and reduce efficiency until they become familiar with the new system. More significant differences, such as changes in data access, querying capabilities, or visualization techniques, may require extensive retraining. It's not just about learning where buttons or features are, but also about understanding the underlying principles that guide the new tool's operations.

This learning process can take time and resources, slowing down the transition and potentially affecting business operations in the short term, but user training is a crucial investment for long-term success. Without adequate training and education, the benefits of migrating to a more advanced or suitable BI tool may not be fully realized. Ultimately, the real power of a BI tool lies not just in its capabilities, but in how effectively its users can leverage those capabilities.

The Limits of Automation

While automation can indeed play a vital role in the migration process between BI tools, accelerating certain tasks such as data transfer, reformatting, or re-indexing, it can only cover a part of the migration process due to the complex nature of BI and underlying systems. Many aspects of migration, such as chart property mapping, data access, data modeling, and dashboard creation, cannot be fully automated due to the unique nuances and proprietary designs of each BI tool.

That's where human expertise becomes indispensable. Professionals experienced in the specificities of BI systems can navigate these intricacies, ensuring a smooth transition while minimizing data loss or misinterpretation. Moreover, they can provide invaluable input on the customization of the new system to best fit the company's needs and processes.

For businesses seeking assistance, there are many options available. System Integrators (SIs) and specialized service providers offer services to help companies navigate the migration process. These services range from initial consultations and strategy development to hands-on implementation and post-migration support. Preset happens to offer such services.

In this context, Preset offers the Preset CLI tool, a practical solution that allows Superset users to migrate their content off of and onto Preset Workspaces. This tool greatly simplifies the migration process and eliminates the risk of vendor lock-in, providing users with flexibility and control over their data. By blending automated tools with human expertise, organizations can ensure a successful transition to their new BI tool while minimizing disruption to their ongoing operations.
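As a rough sketch of what the automatable portion of such a move can look like, the snippet below uses Superset's REST API to authenticate and export a couple of dashboards as an importable bundle. The endpoints shown exist in recent Superset versions, but paths, payloads, and response formats can vary by version and deployment (and the Preset CLI wraps this kind of workflow far more completely), so treat this as an assumption-laden illustration rather than a recipe.

```python
import requests

BASE_URL = "https://superset.example.com"  # hypothetical Superset instance

session = requests.Session()

# Authenticate against Superset's security API to obtain a bearer token.
login = session.post(
    f"{BASE_URL}/api/v1/security/login",
    json={"username": "admin", "password": "changeme", "provider": "db", "refresh": True},
)
login.raise_for_status()
session.headers["Authorization"] = f"Bearer {login.json()['access_token']}"

# Export dashboards 1 and 2 as a bundle that can be imported elsewhere.
# The `q` parameter is a rison-encoded list of dashboard ids.
export = session.get(f"{BASE_URL}/api/v1/dashboard/export/", params={"q": "!(1,2)"})
export.raise_for_status()

with open("dashboards_export.zip", "wb") as f:
    f.write(export.content)
```

Even with a bundle like this in hand, the earlier caveats still apply: the exported charts and dashboards only import cleanly into a compatible Superset or Preset environment, and moving between different BI tools still requires the remapping work described above.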

Coming up next...

In our next post, we'll move on to the more practical matter of making the move, once the difficult choices to do so have been made. There's a lot that goes into this process, but if you plan well, you can make the jump quickly and safely. We'll discuss the details that go into migrating your assets and your users, so that you end up with a happy team and a BI solution that runs like a well-oiled machine. See you next time! [Update: you can now view the post right here]
