A Comprehensive Guide to Pricing and Licensing on Microsoft Fabric
https://www.tigeranalytics.com/perspectives/blog/a-comprehensive-guide-to-pricing-and-licensing-on-microsoft-fabric/
Mon, 01 Jul 2024

This comprehensive guide explores Microsoft Fabric's pricing strategies, including capacity tiers, SKUs, and tenant hierarchy, helping organizations optimize their data management costs. It breaks down the differences between reserved and pay-as-you-go models, explaining Capacity Units (CUs) and providing detailed pricing information. By understanding these pricing intricacies, businesses can make informed decisions to fully leverage their data across various functions, leading to more efficient operations and better customer experiences.

The post A Comprehensive Guide to Pricing and Licensing on Microsoft Fabric appeared first on Tiger Analytics.

Organizations often face challenges in effectively leveraging data to streamline operations and enhance customer satisfaction. Siloed data, complexities associated with ingesting, processing, and storing data at scale, and limited collaboration across departments can hinder a company’s ability to make informed, data-driven decisions. This can result in missed opportunities, inefficiencies, and suboptimal customer experiences.

Here’s where Microsoft’s new SaaS platform “Microsoft Fabric” can give organizations a much-needed boost. By integrating data across various functions, including data science (DS), data engineering (DE), data analytics (DA), and business intelligence (BI), Microsoft Fabric enables companies to harness the full potential of their data. The goal is to enable seamless sharing of data across the organization while simplifying all the key functions of Data Engineering, Data Science, and Data Analytics to facilitate quicker and better-informed decision-making at scale.

For enterprises looking to utilize Microsoft Fabric’s full capabilities, understanding the platform’s pricing and licensing intricacies is crucial, impacting several key financial aspects of the organization:

1. Reserved vs Pay-as-you-go: Understanding pay-as-you-go versus reserved pricing helps in precise budgeting and can affect both initial and long-term operational costs.
2. Capacity Tiers: Clear knowledge of capacity tiers allows for predictable scaling of operations, facilitating smooth expansions without unexpected costs.
3. Fabric Tenant Hierarchy: It is important to understand the tenant hierarchy, as it determines how much capacity an organization needs to purchase for its unique needs.
4. Existing Power BI Licenses: Customers with existing Power BI licenses (Free/Pro/Premium) should understand how those licenses carry over and tie in with Fabric SKUs.

At Tiger Analytics, our team of seasoned SMEs has helped clients navigate the intricacies of licensing and pricing models for robust platforms like Microsoft Fabric, based on their specific needs.

In this blog, we will provide insights into Microsoft Fabric’s pricing strategies to help organizations make more informed decisions when considering this platform.

Overview of Microsoft Fabric:

Microsoft Fabric offers a unified and simplified cloud SaaS platform designed around the following ‘Experiences’:

  • Data Ingestion – Data Factory
  • Data Engineering – Synapse DE
  • Data Science – Synapse DS
  • Data Warehousing – Synapse DW
  • Real-Time Analytics – Synapse RTA
  • Business Intelligence – Power BI
  • Unified storage – OneLake

A Simplified Pricing Structure

Unlike Azure, where each tool has separate pricing, Microsoft Fabric simplifies this by focusing on two primary cost factors:

1. Compute Capacity: A single compute capacity can support all functionalities concurrently, which can be shared across multiple projects and users without any limitations on the number of workspaces utilizing it. You do not need to select capacities individually for Data Factory, Synapse Data Warehousing, and other Fabric experiences.

2. Storage: Storage costs are separate yet simplified, making choices easier for the end customer.


Understanding Fabric’s Capacity Structure

To effectively navigate the pricing and licensing of Microsoft Fabric, it is crucial to understand how a Fabric capacity relates to tenants and workspaces. Together, these three levels organize the resources within an organization and help manage costs and operational efficiency.

1. Tenant: This represents the highest organizational level within Microsoft Fabric, and is associated with a single Microsoft Entra ID. An organization could also have multiple tenants.

2. Capacity: Under each tenant, there are one or more capacities. These represent pools of compute and storage resources that power the various Microsoft Fabric services and provide the capability to execute workloads — analogous to the horsepower of a car engine. The more capacity you provision, the more workloads you can run, and the faster they complete.

3. Workspace: Workspaces are environments where specific projects and workflows are executed. Workspaces are assigned a capacity, which represents the computing resources it can utilize. Multiple workspaces can share the resources of a single capacity, making it a flexible way to manage different projects or departmental needs without the necessity of allocating additional resources for each new project/ department.
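As a mental model, the three levels form a simple containment hierarchy. The sketch below is illustrative only — the class and field names are hypothetical, not a Fabric API:

```python
# A minimal containment sketch of tenant -> capacity -> workspace.
# Class and field names are illustrative, not a Fabric API.
from dataclasses import dataclass, field

@dataclass
class Workspace:
    name: str

@dataclass
class Capacity:
    sku: str                                        # e.g. "F64"
    cus: int                                        # Capacity Units the SKU provides
    workspaces: list = field(default_factory=list)  # many workspaces share one capacity

@dataclass
class Tenant:
    entra_id: str                                   # one Microsoft Entra ID per tenant
    capacities: list = field(default_factory=list)  # a tenant holds one or more capacities

tenant = Tenant(entra_id="contoso.onmicrosoft.com")
finance = Capacity(sku="F64", cus=64)
finance.workspaces += [Workspace("sales-reports"), Workspace("forecasting")]
tenant.capacities.append(finance)

# Both workspaces draw from the same 64-CU pool:
print(f"{len(finance.workspaces)} workspaces sharing {finance.cus} CUs")
```

The point of the model: adding a new project means adding a workspace to an existing capacity, not provisioning new compute.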

The figure above portrays the Tenant hierarchy in Fabric and how different organizations can provision capacities based on their requirements.

Understanding Capacity Levels, SKUs, and Pricing in Microsoft Fabric

Microsoft Fabric capacities are defined by a Stock Keeping Unit (SKU) that corresponds to a specific amount of compute power, measured in Capacity Units (CUs). A CU is a unit that quantifies the amount of compute power available.

Capacity Units (CUs) = Compute Power

As shown in the table below, each SKU (Fn) is associated with a CU count. For example, F4 has double the capacity of F2 but half that of F8.

The breakdown below shows the SKUs available for the West Europe region, showing both Pay As You Go and Reserved (1-year) pricing options:


Comparative table showing Fabric SKUs, CUs, associated PBI SKU, Pay-as-you-Go and Reserved pricing for a region.
1 CU pay-as-you-go price in the West Europe region = $0.22/hour
1 CU PAYG monthly rate: $0.22 × 730 hours = $160.60; F2 (2 CUs) = $160.60 × 2 = $321.20
1 CU reserved (1-year) monthly rate: $0.22 × (1 − 0.405) × 730 ≈ $95.56; F2 reserved ≈ $95.56 × 2 ≈ $191.11
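The calculation above generalizes to any Fn SKU. A short Python sketch (rates are the West Europe figures quoted in this post; 730 billable hours per month assumed):

```python
# Approximate Fabric compute cost per month for an Fn SKU.
# Rates taken from the West Europe figures quoted above.
CU_HOURLY_RATE = 0.22      # $ per CU per hour, pay-as-you-go
RESERVED_DISCOUNT = 0.405  # ~40.5% off for a 1-year reservation
HOURS_PER_MONTH = 730

def monthly_cost(sku_cus: int, reserved: bool = False) -> float:
    """Approximate monthly compute cost for an Fn SKU with `sku_cus` CUs."""
    rate = CU_HOURLY_RATE * ((1 - RESERVED_DISCOUNT) if reserved else 1.0)
    return round(rate * HOURS_PER_MONTH * sku_cus, 2)

for cus in (2, 4, 8, 64):
    print(f"F{cus}: pay-as-you-go ${monthly_cost(cus)}/mo, "
          f"reserved ${monthly_cost(cus, reserved=True)}/mo")
```

Because cost scales linearly with CUs, doubling the SKU (F2 → F4 → F8) doubles the monthly bill under either model.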

Pricing Models Explained:

Pay As You Go: This flexible model allows you to pay monthly based on the SKU you select, making it ideal if your workload demands are uncertain. You can purchase more capacity or even upgrade/downgrade your capacity. You further get an option to pause your capacities to save costs.

Reserved (1 year): In this option, you pay a reserved price monthly, with the reservation lasting one year. Reserved pricing can yield savings of around 40%, but there is no option to pause: the capacity is billed monthly regardless of usage.

Storage Costs in Microsoft Fabric (OneLake)

In Microsoft Fabric, compute capacity does not include data storage costs. This means that businesses need to budget separately for storage expenses.

  • Storage costs need to be paid for separately.
  • Storage costs in Fabric (OneLake) are similar to ADLS (Azure Data Lake Storage).
  • BCDR (Business Continuity and Disaster Recovery) charges also apply. These come into play when workspaces are deleted but some data still needs to be extracted from them.
  • Beyond this, there are costs for cache storage (for KQL DB).
  • There are also costs for the transfer of data between regions, known as bandwidth pricing (see Microsoft's bandwidth pricing page for details).

Optimizing Resource Use in Microsoft Fabric: Understanding Bursting and Smoothing Techniques

Even after purchasing a capacity, your workloads may intermittently demand more resources than you have provisioned.

For this, Fabric allows two methods to help with faster execution (burst) while flattening the usage over time (smooth) to maintain optimal costs.

  • Bursting: Bursting enables the use of additional compute resources beyond your existing capacity to accelerate workload execution. For instance, if a task normally takes 60 seconds using 64 CUs, bursting can allocate 256 CUs to complete the same task in just 15 seconds.
  • Smoothing: Smoothing is applied automatically in Fabric across all capacities to manage brief spikes in resource usage. This method distributes the compute demand more evenly over time, which helps in avoiding extra costs that could occur with sudden increases in resource use.
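To build intuition for smoothing, here is a toy sketch — not Fabric's actual billing algorithm — in which usage above the provisioned capacity in one interval is carried forward and billed in later, idler intervals:

```python
# Toy illustration of smoothing (NOT Fabric's actual billing algorithm):
# usage above the provisioned capacity in one interval is carried forward
# and billed in later intervals, flattening brief spikes.
def smooth(usage_per_interval, capacity_per_interval):
    carried, billed = 0, []
    for used in usage_per_interval:
        total = used + carried
        now = min(total, capacity_per_interval)  # bill at most the capacity
        carried = total - now                    # defer the excess
        billed.append(now)
    return billed

# A 4x spike in the second interval is flattened across later, idle intervals:
print(smooth([100, 400, 0, 0, 0], capacity_per_interval=100))
# → [100, 100, 100, 100, 100]
```

The spike still runs fast (bursting supplies the extra compute up front); smoothing only changes how the consumed CUs are charged against your capacity over time.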

Understanding Consumption: Where do your Computation Units (CUs) go?

Image credit: Microsoft

The following components in Fabric consume Capacity Units (CUs):

  • Data Factory Pipelines
  • Data Flow Gen2
  • Synapse Warehouse
  • Spark Compute
  • Event Stream
  • KQL Database
  • OneLake
  • Copilot
  • VNet Data Gateway
  • Data Activator (Reflex)
  • Power BI

The CU consumption depends on the solution implemented for functionality. Here’s an example for better understanding:

Business Requirement: Ingest data from an on-prem data source and use it for Power BI reporting.

Solution Implemented: Data Factory pipelines with Notebooks to perform DQ checks on the ingested data, and Power BI reports pointing to the data in OneLake.

How are CUs consumed?

CUs are consumed every time the Data Factory pipeline executes and invokes the Notebook (Spark compute) to perform data quality checks.

Further, CUs are consumed whenever the data refreshes on the dashboard.

Microsoft Fabric Pricing Calculator:

Microsoft has streamlined the pricing calculation with its online calculator. By selecting your region, currency, and billing frequency (hourly or monthly), you can quickly view the pay-as-you-go rates for all SKUs. This gives you an immediate estimate of the monthly compute and storage costs for your chosen region. Additionally, links for reserved pricing and bandwidth charges are also available.

For more detailed and specific pricing analysis, Microsoft offers an advanced Fabric SKU Calculator tool through partner organizations.

Understanding Fabric Licensing: Types and Strategic Considerations

Licensing in Microsoft Fabric is essential because it legally permits and enables the use of its services within your organizational framework, ensuring compliance and tailored access to various functionalities. Licensing is distinct from pricing, as licensing outlines the terms and types of access granted, whereas pricing involves the costs associated with these licenses.

There are two types of licensing in Fabric:

  • Capacity-Based Licensing: This licensing model is required for operating Fabric’s services, where Capacity Units (CUs) define the extent of compute resources available to your organization. Different Stock Keeping Units (SKUs) are designed to accommodate varying workload demands, ranging from F2 to F2048. This flexibility allows businesses to scale their operations up or down based on their specific needs.
  • Per-User Licensing: User-based licensing was used in Power BI, and this has not changed in Fabric (for compatibility). The user license types include:
    • Free
    • Pro
    • Premium Per User (PPU)

Each tailored to specific sets of capabilities as seen in the table below:

Image Credit: Microsoft (https://learn.microsoft.com/en-us/fabric/enterprise/licenses)

Understanding Licensing Scenarios

To optimally select the right Fabric licensing options and understand how they can be applied in real-world scenarios, it’s helpful to look at specific use cases within an organization. These scenarios highlight the practical benefits of choosing the right license type based on individual and organizational needs.

Scenario 1: When do you merely require a Power BI Pro License?

Consider the case of Sarah, a data analyst whose role involves creating and managing Power BI dashboards used organization-wide. These dashboards are critical for providing the leadership with the data needed to make informed decisions. In such a scenario, a Pro License is best because it allows Sarah to:

  • Create and manage Power BI dashboards within a dedicated workspace.
  • Set sharing permissions to control who can access the dashboards.
  • Enable colleagues to build their visualizations and reports from her Power BI datasets, fostering a collaborative work environment.

In the above scenario, a Pro license would suffice (based on the above-listed requirements.)

Scenario 2: What are the Licensing Options for Small Businesses?

Consider a small business with about 60 users that wants to leverage premium Power BI features (please refer to the comparison table above, which shows the capabilities for Free, Pro, and PPU) to enhance its data analysis capabilities. The company has two primary licensing options within Microsoft Fabric, each with different cost implications and service access levels.

Option 1: Premium Per User (PPU) Licensing

  • This option involves purchasing a Premium Per User license for each of the 60 users.
  • Cost Calculation: 60 users x $20 per month = $1,200 per month.
  • Note: This option does not include any Fabric services or capacities; it only covers the Power BI Premium features.

Option 2: Combining F4 Capacity with Power BI Pro Licenses

  • Alternatively, the company can opt for a combination of an F4 Fabric capacity and 60 Power BI Pro licenses.
  • Cost Calculation: F4 capacity at $525 per month + (60 Power BI Pro licenses x $10 = $600) = $1,125 per month. Additional storage and other service costs may apply.
  • Benefits: This option is not only more cost-effective compared to Option 1, but it also provides access to broader Fabric services beyond just Power BI, enhancing the organization’s overall data management capabilities.

Option 2 offers a more economical and service-inclusive approach. Furthermore, it opens up opportunities to scale up using higher Fabric capacities with reserved (1-year) pricing for even greater efficiency and cost savings in the long run.
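The comparison reduces to simple arithmetic, reproduced here with the list prices quoted above:

```python
# Reproducing the Scenario 2 arithmetic with the prices quoted above.
users = 60
ppu_monthly = users * 20          # Option 1: Premium Per User at $20/user/month
f4_capacity = 525                 # Option 2: F4 capacity, $525/month
pro_monthly = users * 10          # ...plus Power BI Pro at $10/user/month
option2 = f4_capacity + pro_monthly

print(f"Option 1 (60 x PPU):          ${ppu_monthly}/month")
print(f"Option 2 (F4 + 60 x Pro):     ${option2}/month")
print(f"Monthly saving with Option 2: ${ppu_monthly - option2}")
```

Note that the break-even shifts with headcount: the flat $525 F4 charge is amortized over more users as the team grows, so the per-user gap widens in Option 2's favor.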

Table: Fabric SKU and Power BI SKUs for reference calculations and comparisons

Scenario 3: A Medium business organization is looking to implement analytics solutions using Fabric services and reporting using Power BI. They are also looking to share Power BI content for collaborative decision-making. What are the licensing options in Fabric?

Considerations:

1. Since the organization is looking to share Power BI content, it will need Power BI Premium or equivalent Fabric capacities (F64 and above).
2. Microsoft is transitioning Power BI Premium capacities to automatically become Fabric capacities, which brings more flexibility for organizations while keeping costs the same (when compared with PPU licenses).
3. It would be wise to start with F64 Pay-As-You-Go, monitor performance and billing factors such as bursting, and then commit to a Fabric capacity with reserved pricing to realize savings of up to 40%.

Scenario 4: An organization is looking to use Copilot extensively across the platform. What Fabric capacity can they start with?

Considerations: A minimum of the F64 SKU is required to use Copilot.

The table above provides a reference for understanding how different SKUs align with specific user needs and organizational goals, helping to further clarify the most effective licensing strategies for various roles within a company.

Key Considerations for selecting the right Fabric SKU and License

Now that we have seen some practical scenarios related to making licensing decisions, let us list out the key considerations for selecting the optimal Fabric SKU and license:

  • Organization Size & Usage Patterns:
    • A large organization with diverse data needs will likely require a higher-capacity SKU and more user licenses. Consider a mix of per-user and capacity licenses – analyze which teams work heavily in Fabric vs. those who are light consumers.
    • If your organization already uses Power BI extensively, or it’s central to your use of Fabric, having at least one Pro or PPU license is essential.
  • Workload Types and Frequency:
    • Batch vs. real-time processing: One-time bulk data migrations might benefit from short-term bursts, while continuous streaming needs consistent capacity.
    • Complexity of transformations: Resource-intensive tasks like complex data modeling, machine learning, or large-scale Spark jobs will consume more CUs than simple data movement.
    • Frequency of Power BI Use: Frequent dataset refreshes and report queries in Power BI significantly increase compute resource consumption.
    • Content Sharing/Copilot usage: To share Power BI content freely across the organization, or to use Copilot, you must be on an F64 or higher SKU.
  • Operational Time:
    • Pay-as-you-go vs. Reserved (1-year) pricing: Reserved capacity locks in a price for consistent usage, while pay-as-you-go is better for sporadic workloads. Reserved pricing provides roughly 40% savings over pay-as-you-go.
    • Pausing: You can pause your existing pay-as-you-go license when the capacity is not in use, resulting in cost savings.
    • Development vs. production: Dev environments can often use lower tiers or be paused when idle to reduce costs.
  • Region:
    • Costs vary by Azure region. Align your Fabric deployment with your primary user location to minimize data egress charges.
  • Power BI Premium: While Power BI licenses have not changed in Fabric, it is important to consider that the Power BI premium license would be merged with Fabric (F) licenses. The Free and Pro licenses would not be impacted.
  • Mixed Use: You may need to consider purchasing both Fabric (capacity) and Power BI licenses for sharing content across the organization.
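The pausing and reserved-pricing points above combine into a utilization break-even: reserved bills every hour at roughly a 40.5% discount, so it wins once a capacity is active (unpaused) for more than about 59.5% of the month. A rule-of-thumb sketch — illustrative only, ignoring storage, bandwidth, and other charges:

```python
# Illustrative rule of thumb only: ignores storage, bandwidth, and other
# charges. Pay-as-you-go bills only active (unpaused) hours at full price;
# reserved bills all hours at a ~40.5% discount.
RESERVED_DISCOUNT = 0.405

def cheaper_option(active_fraction: float) -> str:
    """active_fraction: share of the month the capacity is unpaused (0..1)."""
    payg_relative_cost = active_fraction          # fraction of full-price 24/7
    reserved_relative_cost = 1 - RESERVED_DISCOUNT
    return "reserved" if reserved_relative_cost < payg_relative_cost else "pay-as-you-go"

print(cheaper_option(0.40))  # dev capacity paused nights/weekends
print(cheaper_option(0.90))  # near-continuous production workload
```

This is why dev environments paused when idle usually suit pay-as-you-go, while always-on production capacities usually suit the 1-year reservation.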

How to Bring These Factors into Your Planning

Before beginning the Fabric deployment, consider these steps to ensure you choose the right SKU and licensing options:

  • Start with Baselining: Before scaling up, run pilot workloads to understand your capacity unit (CU) consumption patterns. This helps in accurately predicting resource needs and avoiding unexpected costs.
  • Estimate Growth: Project future data volumes, user counts, and evolving analytics needs. This foresight ensures that your chosen capacity can handle future demands without frequent upgrades.
  • Right-size, Don’t Overprovision: Initially, select an SKU that slightly exceeds your current needs. Microsoft Fabric’s flexibility allows you to scale up as necessary, preventing unnecessary spending on excess capacity.
  • Regularly Monitor Usage: Utilize the Capacity Metrics App to track resource usage and identify trends. This ongoing monitoring allows for timely adjustments and optimization of your resource allocation, ensuring cost-effectiveness.

Power BI Capacity Metrics App: Your Cost Control Center in Fabric

The Power BI Capacity Metrics App is an essential tool for understanding how different Microsoft Fabric components consume resources. It:

  • Provides detailed reports and visualizations on compute and storage usage.
  • Empowers you to identify cost trends, potential overages, and optimization opportunities.
  • Helps you stay within your budget.


Microsoft Fabric has streamlined licensing and pricing options, offering significant benefits at both capacity and storage levels:

Capacity Benefits
Image credits: Microsoft

Storage Benefits

In this blog, we’ve explored the intricacies of Microsoft Fabric’s pricing and licensing, along with practical considerations for making informed purchase decisions. If you want to integrate Fabric into your business, you can purchase the capacities and licenses from Azure Portal or reach out to us in case you need to discuss your use case.

Rethinking Insurance Dynamics in a Changing World
https://www.tigeranalytics.com/perspectives/blog/transformations-insurance-value-chain/
Tue, 05 Oct 2021

The insurance industry grapples with disruptive forces – Insurtech, climate change, and the COVID pandemic necessitate digitalization and dynamic underwriting. Loss prevention now drives innovation, redefining insurers as proactive partners. The future hinges on a data-driven approach, driving industry evolution beyond financial protection.

The post Rethinking Insurance Dynamics in a Changing World appeared first on Tiger Analytics.

The ‘new normal’ in Insurance

Come 2021, the three major factors challenging the traditional insurance value chain are Insurtech, climate change, and the ongoing COVID pandemic. Amid lockdowns, floods, forest fires, and a digital-first world, the industry is having its watershed moment (we briefly touched upon this in our earlier blog). On the one hand, risks that need to be insured are becoming complex. On the other hand, data and technology are enabling insurers to better understand and influence customer behavior.

Given the disruptors are here to stay, insurers are being forced to evolve and be creative every step of the way. Insurance companies now have the opportunity to become something much more than just financial protection providers. A deep dive into the challenges within the insurance value chain sheds light on the innovations being made to not only assess the risks to be covered, but also to mitigate that risk for the customer and the insurer.

Product Design and Distribution

In the current environment, products such as home or property insurance must be designed to mitigate the long-term effects of climate change. Companies, for example, are launching efforts to help clients strengthen their insured assets against extreme weather conditions. Offering climate-proofing of homes and catastrophe-specific coverage for customers living in a risk-prone area are examples of ways in which insurers are moving towards offering more sustainable policies. Adopting geospatial solutions can play a vital role in helping insurers understand environmental risks with precision.

The pandemic has also, undoubtedly, increased the interest in life and health insurance. This has forced insurers to design products that ensure long-term financial and health benefits, all while following a digital-first module. Here, incentives such as virtual gym memberships or a free teleconsultation with a nutritionist integrated into the policy, are being used to attract customers.

Usage-based insurance products are also growing in popularity in 2021, wherein premiums are payable based on the extent to which an activity is performed. The best real-world example of this is motor insurance, where a customer is charged based on the number of miles they drive, rather than paying fixed premiums over a certain period of time. This confronts the COVID and climate conundrum all at once — lockdowns have forced customers to stay indoors and use their cars less, and insurance companies are able to promote themselves as being sustainable, adjustable, and allies in the war against rising carbon emissions.

While the demand for insurance is going up, the pandemic has imposed restrictions on traditional distribution channels for insurance. Insurers and agents, who were accustomed to in-person interactions, had to quickly adopt digital tools such as video chats, chatbots, and self-service websites to sell insurance. The industry is also moving away from captive agents to independent agents and digital insurance exchanges are accelerating this trend. Platforms like Semsee, Bolttech, Bold Penguin, and Uncharted pull data from many insurers, allowing agents to see multiple quotes for policies, similar to how travel agents see competing airfares.

Pricing and Underwriting

Historic data, which forms the basis of pricing and underwriting, needs to be re-examined in the COVID era, and beyond. With the threat of future pandemics and increasing climate change-related disasters, assessing the risk level of customers, whose needs are getting more varied, is going to get more challenging.

One of the ways insurers are tackling this is by moving to a continuous underwriting model from a one-time pricing model. This involves using regularly updated or real-time policyholder data to continuously assess the risk and update the policy terms and premiums accordingly. In addition to providing a better estimation of risk, it helps insurers to adapt to evolving customer needs and influences customers to reduce any risky behavior.

The pandemic is also forcing companies to reduce the physical interactions needed for underwriting. Take life insurance, for example. Given the constraints on in-person medical tests, concepts such as ‘fluidless (no lab tests) processing’ are gaining traction.

Loss and Claims Management

The initial innovations in these last steps of the insurance value chain were primarily focused on post-loss scenarios with the objective of increasing the efficiency of the claim assessment process. Across property and auto insurance, image analytics enables insurance companies to remotely and more effectively assess the loss and identify possible fraudulent claims.

However, the focus is now shifting to loss prevention. Across all types of insurance products, the aim is to track risky behavior and intervene at the right moment to prevent a loss event. This strategy has existed in health insurance for a while, where insurers have tried to ensure medication adherence and a healthy lifestyle through data-driven interventions. Now, thanks to IoT, it is finding its place across other areas like property insurance (detecting a water leak or low pressure in a water sprinkler), workers' compensation (identifying workers without adequate equipment or unsafe lifting by an employee), and auto insurance (sensing erratic driving behavior). An interesting example from auto insurance is the BlaBlaCar Coach, an app-based service that comes with certain car insurance products and offers drivers personalized tips for safer driving.

This focus on loss prevention is also having an impact on other parts of the insurance value chain like product design and pricing. It is allowing insurers to provide expanded coverage at affordable rates, even for previously uninsurable risks.

What the Future Holds

Risk analysis and crisis aversion were always at the core of the insurance industry. However, what the industry was not prepared for, much like the rest of us, was a global pandemic and coming face-to-face with the effects of climate change. On the other hand, these disruptions may have proven to be the catalyst for innovation in the industry, which was long overdue. One thing is clear – a successful transformation cannot be achieved by technology alone. A data-driven approach will be crucial in effectively leveraging this technology.

Stay tuned for more details on the role of data and how innovative data science solutions are driving value for both insurance companies and their customers.

CPG Analytics of Today: What Are the Top 5 Current Priorities?
https://www.tigeranalytics.com/perspectives/blog/cpg-analytics-of-today-what-are-the-top-5-current-priorities/
Thu, 07 Mar 2019

Explore the challenges and opportunities faced by the CPG industry through key trends like demographic shifts, e-commerce growth, and analytics-driven decision-making. Unravel critical areas for CPG manufacturers, such as e-commerce, trade spend effectiveness, pricing, marketing effectiveness, and more.

The post CPG Analytics of Today: What Are the Top 5 Current Priorities? appeared first on Tiger Analytics.

The CPG industry today is in the midst of an interesting bundle of challenges and opportunities. While rapidly changing demographics and consumption choices in most markets and the associated complexity across the value chain is a challenge, the rise of eCommerce (despite what might look like a threat from an “A” player) and the growing adoption of analytics-driven decision making by business teams are clear opportunities.

In our view, smart use of data and CPG analytics – ranging from relatively simple exploratory data analysis to advanced DS/ML/AI models that deliver actionable insights in five key areas can help CPG manufacturers improve their growth and competitiveness: E-commerce, Trade Spend Effectiveness, Pricing, Marketing Effectiveness, and Responsive Manufacturing & Supply Chain.

In a series of posts over the next few months, starting with this overview, CPG industry practitioners from Tiger Analytics will talk about current best practices in data management and analytics around these five areas:

1. E-commerce – For most CPG manufacturers, a significant part of revenue growth (if not all) in the next few years is expected to come from e-commerce. Surprisingly, this is also a channel that is not very well covered by the traditional data providers. Pioneers in the industry are adopting unique approaches to acquire, integrate & analyze e-commerce data that puts them ahead of the game. If you are wondering about Direct-to-Consumer (DTC), see point 4 below.

2. Trade Spend Effectiveness – Various studies conducted over time show that return on trade spend, which is either the first or second highest spend often running into billions of dollars, is negative for most of the CPG manufacturers in the industry. While robust trade promotions management (operational aspects of defining promotional calendar, promo execution & reimbursements) is in place across the board, analyzing the effectiveness of spend and establishing ‘true incrementality’ of trade spends is an area of big concern. Here again, CPGs that win are going beyond toolsets offering a siloed view of incremental volumes (i.e. just one retailer at a time) to get a full view of effectiveness based on comprehensive data and advanced analytics.

3. Pricing – While promotions look at the effect of temporary trade events, getting everyday prices right is equally important, if not more, since a significant portion of CPG manufacturer trade spend goes into maintaining long-term price gaps warranted by competition, or by retail partner demands. While most CPG manufacturers make pricing decisions at an item/item group X channel level, winning CPGs are making these decisions at a much more granular level, here again taking advantage of data & analytics.

4. Marketing Effectiveness – Marketing spend, which is a close counterpart of trade spend is another top spend item on a CPG manufacturer’s P&L. Two key sources of data & insights have been found to improve effectiveness:

  • Consumer insights generated using data from consumer panels, shopper cards, social media, brand page user registrations, and other Direct-To-Consumer* initiatives help get a clear understanding of consumer preferences and decision hierarchies

  • Insights around traditional and digital media channels (in the form of transparent media mix models) and their impact on short and long-term brand objectives

Of all areas, this is probably the most investment intensive from a data & analytic infrastructure perspective when CPG manufacturers want to go beyond a siloed view (often lamented as coming from agency delivered black-box models, or from the ‘walled gardens’ of the digital world). However, the pay-offs for initiatives are worth the effort, especially for larger brands with higher marketing spends to deliver returns on.

5. Responsive Manufacturing & Supply Chain – compared to other areas, manufacturing & supply chain functions of CPGs have the need to plan over multiple time horizons, across all of which data and analytics play an important role.

  • Near-term (up to 3 months): analyzing out-of-stock and fill rates for maintaining retail partner service levels, and the impact of transportation partner & lane decisions, are relevant in this time horizon.
  • Mid-term (3 to 12/18 months): understanding the impact of promotion plans on shipment volumes, over and above the baseline demand picture provided by the Demand Planning function, is important in this timeframe for planning and adjusting mid-range production schedules without much incremental capacity.
  • Long-term (beyond 12/18 months – esp. for manufacturing plant and warehouse capacity planning): SKU rationalization – a touchy topic – becomes feasible to address in this timeframe, given the internal and external change management it takes to move on it, which exceeds the time taken for the analytics itself.

Interestingly, this is also a space that is witnessing a higher use of Robotics (automated inventory management and warehousing).

While it may appear that addressing all five areas requires significant upfront investments in integrated data environments (DSRs), our experience shows that a clear focus on leveraging analytics to generate specific, actionable insights can help realize significant business value immediately, even on current-state data & analytics environments – wherever they are on the maturity curve.

Stay tuned to hear more from us in this series.

Note* Direct-to-consumer initiatives of most CPG manufacturers are often more valuable as a source of augmenting rich consumer insights than being a significant channel of sales volume. Exceptions being brands with significant direct-to-consumer sales, such as lifestyle/sportswear brands and health-beauty-cosmetics manufacturers extending their consumer reach through their own salons.
