What does Fabric Mean for Me?
In our last blog post we explored the functionality and workloads of Fabric, so the next step is to consider what this means for your organisation and its data needs.
A quick recap: Microsoft Fabric is essentially a Software as a Service (SaaS) approach to data platforms and analytics, a natural evolution from the Platform as a Service (PaaS) capabilities Azure offers. This means the infrastructure comes ready set up, but you must choose which capacities you'd like to purchase in advance. The key point here is that although some of the lower-level provisioning is done for you, a strong data strategy and clear governance policies are still required.
Domains
Fabric changes the way we look at data workloads. Previously, the focus sat on the different stages of data movement, each of which required its own set of tools, skills, governance and data integration strategy. Now workloads are viewed more logically, in the context of the organisational job the data is being used to complete. This is where Domains play a crucial role, grouping data into the relevant organisational departments such as sales or marketing. The focus is on the business's objectives and what it requires from its data.
A Domain has admins and contributors who associate workspaces with the Domain, similar to how workspaces are used today in Power BI. This gives a Domain admin more control over their workspaces and lets them align those workspaces more granularly to specific teams or projects. So we have teams working on their own projects in more controlled environments, but all from the same data in OneLake. The benefits include removing the restrictive access barriers where Tenant admins gatekeep data, plus the ability to monitor what data is being used and how. It also creates a better sense of collaboration between teams and encourages them to work together rather than in silos, as they are all using the same tools atop the central lake.
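To make this concrete, here's a rough sketch of how a Domain admin might create a Domain and associate workspaces with it programmatically. It assumes the domains endpoints of the Fabric Admin REST API and a pre-acquired Microsoft Entra ID token; the exact paths, payload fields and response shape should be checked against the current API reference before use.

```python
import requests

# Assumptions: domain endpoints live under /v1/admin/domains and you hold an
# Entra ID access token with the required admin rights. Verify the exact
# paths and payload field names against the Fabric REST API reference.
FABRIC_ADMIN_API = "https://api.fabric.microsoft.com/v1/admin"
TOKEN = "<entra-id-access-token>"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# Create a "Sales" Domain to group sales-related workspaces
resp = requests.post(
    f"{FABRIC_ADMIN_API}/domains",
    headers=HEADERS,
    json={"displayName": "Sales", "description": "Sales analytics workloads"},
)
resp.raise_for_status()
domain_id = resp.json()["id"]  # assumes the created Domain's id is returned

# Associate existing workspaces with the Domain (hypothetical workspace GUIDs)
requests.post(
    f"{FABRIC_ADMIN_API}/domains/{domain_id}/assignWorkspaces",
    headers=HEADERS,
    json={"workspacesIds": ["<workspace-guid-1>", "<workspace-guid-2>"]},
).raise_for_status()
```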
Giving teams access to all data in OneLake may seem daunting, but correct data governance, i.e. certifying and endorsing the right datasets, will ensure that your whole organisation is working from the same source of truth.
What does this mean for my Data Teams?
Once again, data governance is crucial, so a Data Steward is fundamental to a successful Fabric implementation. The role is likely already being performed within IT teams by Database Administrators, so this simply re-formalises a lot of work already being done. With a unified consumption model, this role manages the costs of the data estate and workload usage across the Fabric tenant. They also monitor the data sources feeding your lake and their connections to users and domains, ensuring data accuracy.
Data Engineers: your engineers keep the capabilities they may already be using in Synapse or Data Factory to prepare data, but can now streamline this work because it is all housed on one platform. This reduces costly data replication and data movement while leaving the data in an easily accessible source for Data Scientists and Data Analysts.
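As an illustration, here's a minimal PySpark sketch of the kind of preparation an engineer might run in a Fabric notebook, landing cleaned data as a Delta table in the Lakehouse so downstream roles can query it directly. File paths, table and column names are made up for the example, and `spark` is the session a Fabric notebook provides for you.

```python
from pyspark.sql import functions as F

# Read raw files already landed in the Lakehouse (hypothetical path)
raw = (
    spark.read.format("csv")
    .option("header", "true")
    .load("Files/raw/sales/*.csv")
)

# Light preparation: type casting and de-duplication
cleaned = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Persist as a Delta table in OneLake; no copies or exports needed,
# Data Scientists and Analysts query this same table.
cleaned.write.format("delta").mode("overwrite").saveAsTable("sales_cleaned")
```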
Data Scientists: the agility of Fabric lets Data Scientists work in their preferred languages, frameworks and tools simultaneously, with easy integration between Spark Notebooks and Azure ML.
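For example, Fabric notebooks come with MLflow experiment tracking built in, so a data scientist can train directly against the table the engineers produced. The sketch below reuses the hypothetical sales_cleaned table and illustrative feature columns from above, with standard scikit-learn and MLflow APIs.

```python
import mlflow
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Load the prepared Lakehouse table (hypothetical name and columns)
df = spark.read.table("sales_cleaned").toPandas()
X = df[["quantity", "unit_price"]]
y = df["amount"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Track the run with MLflow, which Fabric surfaces as an Experiment item
with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestRegressor(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_metric("mae", mae)
    mlflow.sklearn.log_model(model, "model")
```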
Data Analysts: your analysts will now be able to easily access certified data that isn't held up by dependencies on other tools, accelerating the time to insight. Analysis can be done rapidly using SQL-based tools for any self-serve data modelling, or integrated seamlessly with Power BI for even further capability.
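To illustrate, a Lakehouse also exposes a SQL analytics endpoint that speaks standard T-SQL, so an analyst can run self-serve queries from familiar tooling. The sketch below uses pyodbc with placeholder connection details (copy the real ones from your workspace) and assumes the Microsoft ODBC Driver 18 is installed; the table name is the hypothetical one used above.

```python
import pyodbc

# Placeholder connection details: copy the SQL analytics endpoint's server
# name from the Fabric workspace. Sign-in here is interactive Entra ID auth.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

# A simple self-serve aggregation over the certified table
query = """
SELECT order_date, SUM(amount) AS daily_revenue
FROM sales_cleaned
GROUP BY order_date
ORDER BY order_date
"""
for row in conn.cursor().execute(query):
    print(row.order_date, row.daily_revenue)
```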
Data Citizens: your end users can make data-driven decisions with confidence from certified sources, while data access management controls ensure they aren't overwhelmed by the whole data estate.
How Does Fixed Capacity Work?
With the Fabric pricing model, you need to anticipate the capacity size you expect to use. This can be purchased monthly or hourly and will be billed at the same cost whether it is used or not. Capacities are assigned to workspaces as they are in Power BI, but can be used across all seven workloads.
Predicting capacity/compute usage may seem daunting at first, but it gives you complete cost transparency: capacity can be scaled up, scaled down or switched off at any time while you get optimised performance from OneLake. It is worth noting, though, that you will still have additional costs for storing data in OneLake, which is priced similarly to ADLS Gen 2.
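As a back-of-the-envelope illustration of that trade-off, the sketch below compares running a capacity around the clock against pausing it outside working hours. The hourly rate is a made-up placeholder, so substitute the published pay-as-you-go price for your chosen SKU and region (and remember OneLake storage is billed separately).

```python
# Hypothetical figures for illustration only; substitute the published
# pay-as-you-go rate for your capacity SKU and region.
hourly_rate = 0.50        # placeholder price per capacity hour
hours_per_month = 730     # average hours in a month

always_on = hourly_rate * hours_per_month

# Pausing the capacity outside a 10-hour working day, 22 working days a month
working_hours = 10 * 22
paused_off_hours = hourly_rate * working_hours

print(f"Always on:        {always_on:,.2f} per month")
print(f"Paused off-hours: {paused_off_hours:,.2f} per month")
```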
Another interesting benefit of Fabric is that you don't need to make any technical design choices in advance, and you're not fixed to a certain technology. The reserved capacity can be used simultaneously across Analysis Services, Spark, Kusto DB or relational DB workloads, making this a very flexible approach.
Reserved capacities will also be coming, where you commit to a capacity on an annual basis; you won't be able to scale down or turn off your compute, but it will come at a lower cost.
Benefits:
To summarise some of the benefits that Fabric offers:
- Removes Data Silos: As discussed, data now sits centrally in a single platform with all the functionality required, so developers no longer need to work in isolated storage accounts.
- Improved performance and consistency: You can have confidence teams are working from a single source of truth and don’t have to wait on data workflows or refreshes as there is no copying of data between tools.
- Industry-leading security and governance: By using a single source for data you can enforce all compliance and security policies centrally.
- Cost transparency: You know ahead of time what your costs are, removing the worry of having to estimate what your platform costs might be.
- Direct Lake: A Power BI benefit is that you now have the real-time functionality of DirectQuery but the performance of import mode. This is because OneLake stores data in the open Parquet format (as Delta tables), making the data much more accessible.
- The focus is now on actually doing analysis: Instead of spending a lot of time planning and orchestrating data integration, businesses can focus on drawing insights from the data.
But is it for me?
Fundamentally, it's important to note that both Azure and Fabric are great fits for Data & AI; Fabric is the same set of capabilities packaged differently. If you're not currently in a position to adopt Fabric, that doesn't mean your data platform can't achieve your goals and requirements, and migration paths can be considered in the future.
However, there are some scenarios where Fabric may be better suited to your organisation:
- You require all your costs and capabilities to be known in advance.
- You’re using Power BI but don’t have a central data & analytics platform for your ETL process and want to consolidate this work to a central source of truth.
- You currently don't have a lot of infrastructure capability, or the capacity to adopt a lot of new platforms.
- You have a large focus on AI and ML projects and want to expedite the time to start these projects. With the data sitting centrally, you can much more easily start exploring the data and using the full platform capabilities of Fabric.
Fabric can also be accessed right now as a free trial, so it's definitely worth exploring and getting hands-on to better understand how it could work for you.
To determine if Microsoft Fabric is the right choice for you, consider your current data estate and tools, team skills, data maturity, costing model and, ultimately, your business goals.
We can help you to reach a decision about Fabric and if it’s the right fit for your business requirements. Please reach out for a chat by clicking the live chat button at the bottom of this page.