AI in Beauty: Decoding Customer Preferences with the Power of Affinity Embeddings
https://www.tigeranalytics.com/perspectives/blog/ai-in-beauty-decoding-customer-preferences-with-the-power-of-affinity-embeddings/
Fri, 11 Apr 2025 10:48:21 +0000

The beauty industry is leveraging AI and machine learning to transform how brands understand and serve customers. This blog explores how Customer Product Affinity Embedding is revolutionizing personalized shopping experiences by mapping customer preferences to product characteristics. By integrating data from multiple touchpoints — purchase history, social media, and more — this approach enables hyper-personalized recommendations, smart substitutions, and targeted campaigns.

Picture this: The data engineering team at a leading retail chain is tasked with integrating customer data from every touchpoint — purchase histories, website clicks, and social media interactions — to create personalized shopping experiences. The goal? To leverage this data for everything from predictive product recommendations to dynamic pricing strategies and targeted marketing campaigns. But the challenge isn’t just in collecting this data; it’s in understanding how to embed it across multiple customer interactions seamlessly while ensuring compliance with privacy regulations and safeguarding customer trust.

The beauty industry today is embracing cutting-edge technology to stay ahead of microtrends and streamline product development, all while improving efficiency and innovation in an increasingly fast-paced market. Brand loyalty is no longer solely dictated by legacy brands; “digital-first” challengers are capitalizing on changing consumer preferences and behaviors. Global Cosmetic Industry magazine found that 6.2% of beauty sales now come from social selling platforms, with TikTok alone capturing 2.6% of the market. Nearly 41% of all beauty and personal care product sales now happen online, according to NielsenIQ’s Global State of Beauty 2025 report.

Virtual try-on apps, AI/ML-based product recommendations, smart applicators for hair and skin products – the beauty industry is testing, scaling, and rapidly deploying solutions to satisfy consumers who demand personalized experiences that cater to their unique needs. Based on our observations and conversations with leaders in beauty and cosmetics retailing, we have found that thriving in this dynamic landscape requires moving beyond traditional customer segmentation and delving deeper into customer product affinity.

What is customer product affinity embedding?

Imagine a complex map where customers and products are not locations but points in a multidimensional space. Customer product affinity embedding uses advanced machine learning algorithms to analyze vast amounts of data – everything from purchase history and browsing behavior to customer reviews and social media interactions. Upon processing this data, the algorithms create a map where customers (opted in and anonymized, of course) and products are positioned based on the strength of their relationship, with proximity reflecting the degree of relevance, preference, and engagement between them. In short, it captures the essence of customer preferences in a mathematical representation.

This approach provides businesses with a deeper understanding of customer-product affinities. At Tiger Analytics, we partnered with a leading beauty retailer to design a system that captures the true essence of customer preferences by focusing on both customer and product nuances. It begins with a harmonized product taxonomy and sanitized product attributes, and incorporates curated customer data from transactions, interactions, and browsing behavior. Together, these elements create an accurate and comprehensive view of customer affinities, allowing businesses to tailor strategies with greater precision.

How does customer product affinity embedding transform business decisions?

Customer product affinity embedding enhances decision-making by capturing the multidimensional interactions between business efforts, customer activities, product characteristics, and broader macroeconomic conditions. Conventional machine learning approaches typically solve one problem at a time and require custom feature engineering; they isolate and aggregate business signals but fail to explain the overall variance or underlying business causality, limiting their effectiveness. Affinity embedding instead integrates these diverse signals into a single representation.

[Figure: Matrix factorization foundation for affinity matrix]
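
At its core, this kind of affinity matrix can be factorized into low-dimensional customer and product embeddings. Below is a minimal sketch of that idea using alternating least squares on a toy interaction matrix; the data, dimensions, and hyperparameters are all illustrative, and the production system described here works from far richer, curated signals.

```python
import numpy as np

# Toy interaction matrix: rows = customers, cols = products,
# values = implicit feedback (e.g., purchase counts). Hypothetical data.
R = np.array([
    [5, 0, 2, 0],
    [0, 3, 0, 1],
    [4, 0, 0, 2],
], dtype=float)

n_customers, n_products = R.shape
k = 2          # embedding dimension
lam = 0.1      # L2 regularization
rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(n_customers, k))  # customer embeddings
P = rng.normal(scale=0.1, size=(n_products, k))   # product embeddings

# Alternating least squares: fix one factor matrix, solve for the other.
for _ in range(20):
    U = R @ P @ np.linalg.inv(P.T @ P + lam * np.eye(k))
    P = R.T @ U @ np.linalg.inv(U.T @ U + lam * np.eye(k))

# Affinity score = dot product of customer and product embeddings.
affinity = U @ P.T
print(np.round(affinity, 2))
```

In practice, implicit-feedback variants (confidence weighting, negative sampling) and curated attribute features would replace this toy setup.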

By incorporating deeper insights into customer preferences and behaviors into business strategy, beauty retailers can unlock greater efficiency, relevance, and personalization across various touchpoints. Below are a few ways customer affinity embedding can bring tangible advantages:

  • Hyper-Personalized Recommendations: Take, for instance, suggesting a hydrating toner to someone with a high affinity for high-coverage foundations, thereby providing them with relevant products that match their needs and preferences.
  • Smart Product Substitutions: Sarah, a regular purchaser of the ‘Sunset Shimmer’ eyeshadow palette, gets a notification suggesting a substitute — ‘Ocean Breeze’, a cool-toned palette with hues she may also enjoy — when her favorite product is out of stock.
  • Store Inventory Optimizations: Optimizing inventory levels by predicting demand based on customer affinity. Businesses can avoid stockouts for high-affinity products and minimize dead stock for low-affinity ones, leading to reduced costs and improved customer satisfaction.
  • Personalized Search: Traditional search relies on keywords and filters. However, these methods often miss the nuances of the customer’s intent. For example, a search for “foundation” might come from someone seeking full coverage, or from someone wanting a lightweight, dewy finish. Affinity embedding helps bridge this gap, ensuring more relevant search results.
  • Targeted Marketing Campaigns: Consider targeting millennials with a strong affinity for Korean beauty with social media campaigns showcasing the latest K-beauty trends.
  • Data-Driven Product Development: If a significant customer segment shows a high affinity for vegan beauty products, but limited options are available, the brand can proactively develop a high-quality vegan makeup line to fill that gap in the market.
  • Personalized Buying Journey: Picture a customer searching for false eyelashes on the app, and then being recommended complementary items like glue and party essentials. Additionally, the system can suggest popular shades previously chosen by customers with similar preferences, creating a seamless and personalized shopping experience.

These are just a few examples of how customer affinity embedding can enhance customer engagement and improve the overall shopping experience. Other use cases, such as Trip Mission Basket Builder, Dynamic Pricing/Discounting, and Subscription Box Optimization further demonstrate how this technology can revolutionize customer satisfaction and business efficiency.

Real-world impact of customer affinity embedding on sales and engagement

Customer affinity embedding is a multi-step process that converts customer data points into a mathematical representation that captures the strength of a customer’s relationship with various products.

[Figure: Functional architecture]

The same embedding features can be transformed into affinity ranks, which serve as inputs for downstream ML models to generate personalized recommendations and provide insights such as the following (see the similarity sketch after this list):

  • Product Similarity
  • Customer Similarity
  • Customer Affinity to Products
  • Product Substitutions
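
All four insight types above can be read off the same embedding matrices. A minimal sketch, assuming customer and product embedding matrices like those produced by the factorization sketched earlier (random placeholders here):

```python
import numpy as np

rng = np.random.default_rng(1)
U = rng.normal(size=(5, 8))   # customer embeddings (e.g., from an ALS fit)
P = rng.normal(size=(12, 8))  # product embeddings

def cosine_sim(A, B):
    """Pairwise cosine similarity between rows of A and rows of B."""
    A_n = A / np.linalg.norm(A, axis=1, keepdims=True)
    B_n = B / np.linalg.norm(B, axis=1, keepdims=True)
    return A_n @ B_n.T

product_similarity = cosine_sim(P, P)    # Product Similarity
customer_similarity = cosine_sim(U, U)   # Customer Similarity
customer_affinity = U @ P.T              # Customer Affinity to Products

# Affinity ranks: per customer, rank products by score (1 = strongest).
affinity_ranks = (-customer_affinity).argsort(axis=1).argsort(axis=1) + 1

# Product Substitutions: for product j, its nearest neighbors in embedding space.
j = 0
substitutes = np.argsort(-product_similarity[j])[1:4]
```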

Through our collaboration, the beauty retailer experienced a 4.5% increase in repeat purchases over a 12-month period. Additionally, the brand saw a 3.5% average boost in customer engagement scores within the fashion category, and a 7.8% rise in app usage. The company’s ROI for marketing campaigns also improved, with a 23-basis-point increase across digital channels.

Today, the real question isn’t just ‘what does the customer want?’ – it’s ‘how can we truly understand and deliver it?’

Understanding customer needs isn’t just about analyzing past behaviors, but rather predicting intent and adapting in real time. Customers don’t always explicitly state their preferences. Their choices are shaped by trends, context, and discovery. The challenge for brands is to move from reactive insights to proactive personalization, ensuring that every recommendation, search result, and marketing touchpoint feels intuitive rather than intrusive.

Customer product affinity embedding brings brands closer to the customer by placing the consumer at the heart of every decision. With data-driven customer understanding, brands can build deeper and more personalized connections, driving loyalty and growth.

References:

https://shop.nielseniq.com/product/global-state-of-beauty-2025/
https://www.gcimagazine.com/brands-products/skin-care/news/22916897/2024s-global-beauty-sales-are-powered-by-an-ecommerce-social-selling-boom/

How can AI Help Luxury Retailers Unlock a Premium Customer Experience?
https://www.tigeranalytics.com/perspectives/blog/how-can-ai-help-luxury-retailers-unlock-a-premium-customer-experience/
Mon, 07 Apr 2025 09:07:21 +0000

The luxury goods market is evolving fast, with customers demanding more personalized, seamless experiences. To keep up, luxury brands are turning to AI to enhance everything from clienteling to product recommendations and customer journey tracking. By harnessing data, AI helps retailers build deeper connections with customers, boost engagement, and increase sales. Read on to learn how AI is shaping the future of luxury retail.

The definition of luxury is changing. Research finds that today, experiences matter as much as products. As a result, luxury goods retailers must go above and beyond to meet sky-high client expectations and capture spending. While analysts forecast a slowdown in 2025 compared to 2024, the global luxury goods market is still projected to grow from $474 billion in 2024 to $578 billion by 2029.

Success in this demanding market is no easy feat, given changing customer demographics and the expectation of relevant, seamless brand interactions. In addition, the luxury segment, which historically focused on a select range of products for a highly specific audience, has expanded into categories like sunglasses and fragrances to tap into diverse consumer groups. Retailers are now focused on striking a balance between catering to a diverse clientele and maintaining their exclusivity and brand prestige.

Using AI to better cater to the needs of luxury customers

Retailers possess a treasure trove of data collected across various customer touchpoints – from offline visits, in-store behavior, offline transactions, and loyalty programs, to digital campaign interactions, online visits, clickstream data, online behavior, online transactions, and more. Integrating AI and ML with this data results in valuable and actionable insights that can help serve customers better. At the same time, it is essential that brands handle their expanse of data responsibly, ensuring compliance with privacy regulations and fostering greater trust among customers. Based on our work, we identified three major areas where AI can aid luxury retailers in improving customer service:

  • Clienteling – An approach employed by retail sales associates to build long-term relationships with customers. Luxury retailers can use AI to map the right associate with the customer, and provide the associates with real-time customer data to improve one-on-one experiences.
  • Personalization – Tailoring the customer experience across every touchpoint based on their preferences. By leveraging AI, retailers can offer bespoke services such as crafting highly individualized digital content, and offering personalized recommendations.
  • Customer Journey – Identifying customer growth opportunities by understanding their experience with the brand and their journey. Using AI, luxury brands can quickly and accurately identify and nurture elite customers, and uncover more growth opportunities.

[Figure: AI for luxury retail]

We now explore each approach in detail.

Clienteling: Fostering deeper engagement with the right customer-sales associate match

Luxury retail is more than just a transaction – it is about curating premier experiences and forming lasting bonds beyond a sale. Sales associates play a critical role in creating these experiences and bonds, particularly in an in-store setup, which presents the full brand identity in all its glory. Sales associates personally guide customers through the brand experience with a curated selection based on preferences, engaging them with personalized product recommendations, providing exclusive access to events or limited-edition products, and maintaining relationships throughout the customer journey.

Similar to how a product’s success depends on positioning it to the right market, the success of clienteling efforts hinges on matching the right sales associate to the customer. Matching a large customer base with the right sales associates, scaling this across different geographies, and continuously updating it based on the customer journey is a complex task.

We partnered with a luxury retailer to create such perfect matches with AI and data analytics. We first consolidated customer-level metrics such as preferred categories, brands, average basket size, average transaction amount, frequency of purchases, number of events attended, and invites accepted or rejected. Next, we gathered sales associate-related metrics like tenure, experience, repeat customers, and customer feedback (CSAT surveys, post-purchase surveys). We built an AI model to predict the probability of successful pairings resulting in conversions. The business team can further fine-tune these results with incremental rules – such as capping the number of customers each sales associate handles to manage workloads, or guaranteeing a minimum allocation per associate – to arrive at the final customer-sales associate pairs. The models must be periodically refreshed under different scenarios: when a customer moves to a different segment, a sales associate exits the system, or a low level of engagement is observed for a pairing.
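
As a rough illustration of the pairing model described above, the sketch below scores hypothetical customer-associate pairs with a gradient-boosted classifier. Every feature name and value is invented for the example; the real solution’s feature set and modeling choices may differ.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical training table: one row per historical customer-SA pairing.
pairs = pd.DataFrame({
    "avg_basket_size": [3, 1, 5, 2, 4, 6],
    "purchase_freq":   [12, 2, 20, 4, 9, 15],
    "events_attended": [2, 0, 5, 1, 3, 4],
    "sa_tenure_years": [5, 1, 8, 2, 3, 7],
    "sa_repeat_rate":  [0.6, 0.2, 0.8, 0.3, 0.5, 0.7],
    "sa_csat":         [4.5, 3.8, 4.9, 4.0, 4.2, 4.7],
    "converted":       [1, 0, 1, 0, 1, 1],  # did the pairing convert?
})

X = pairs.drop(columns="converted")
y = pairs["converted"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)

# Score candidate customer-SA pairs, then apply business rules
# (e.g., workload caps per associate) before finalizing assignments.
pair_scores = model.predict_proba(X_test)[:, 1]
```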

The AI-driven models we developed streamlined customer-sales associate pairing and improved clienteling efforts, resulting in a 50% increase in customer spend and a 100% uptick in engagement.

Personalization: Delivering tailored product recommendations and promotions

Customers interact with brands across channels, and retailers are expected to provide seamless and personalized experiences across all touchpoints. Whether it’s ads, emails, website content, or product recommendations, retailers must master the art of personalization. A study by McKinsey found that companies that grow faster drive 40% more of their revenue from personalization than their slower-growing counterparts. Below, we look at how AI can help personalize both the products recommended and the promotions offered.

Personalized Product Recommendation Solution

A one-size-fits-all strategy to recommend best-selling products to customers may be necessary when retailers lack user information. However, organizations can make more informed and tailored product recommendations as they gather more details about customers — such as demographics, visit patterns (whether in-store or online), purchase history, and product preferences, including attributes like brand, category, hazard level, and sustainability.

We collaborated with a global beauty retailer on a product recommendation solution that leverages deep learning and graph-based techniques to create embeddings that identify products relevant to individual customers. The identified products are further refined according to personal preferences, such as brand and style choices and search and click history. The result is a tailored list of offerings that is highly relevant to the individual customer, ultimately improving sales and customer engagement.
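
One common way to build such embeddings (a sketch, not necessarily the retailer’s exact pipeline) is to treat each customer’s purchase or browse sequence as a “sentence” and train a skip-gram model over product IDs, so that co-occurring products land close together in embedding space. The product IDs below are invented.

```python
from gensim.models import Word2Vec

# Hypothetical purchase/browse sequences, one list of product IDs per customer.
sessions = [
    ["lipstick_red", "setting_spray", "makeup_remover"],
    ["serum_vitc", "moisturizer_spf", "sunscreen_50"],
    ["lipstick_red", "lipliner_nude", "setting_spray"],
    ["serum_vitc", "eye_cream", "moisturizer_spf"],
]

# Skip-gram (sg=1) learns embeddings where co-purchased items sit close together.
model = Word2Vec(sentences=sessions, vector_size=32, window=3,
                 min_count=1, sg=1, epochs=50, seed=7)

# Products relevant to a given item, by embedding similarity.
print(model.wv.most_similar("lipstick_red", topn=3))
```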

Personalized Coupon Selection Solution

Coupon preferences can differ sharply from one customer to another. Some may prefer a flat discount rate, some a few dollars off, and others free shipping. Preferences can also differ by product category or brand. Reports indicate that the redemption rate for digital coupons is generally low, at around 7%. Hence, it is important to offer the right coupon, one that will encourage the customer towards redemption.

Our approach here would involve leveraging machine learning to build predictive models. These models would predict the probability of redemption across various promotion combinations, such as low spend with high discount, low spend with low discount, high spend with high discount, high spend with low discount, flat dollars off, and add-on services. Analyzing these probabilities across different customer segments would then help identify the most preferred promotion for each customer. Combining this result with additional details specific to each customer – preferred brands, recently visited or enquired items, search and click history, loyalty tier, and so on – yields a personalized coupon. This approach can help increase redemption rates and drive engagement.
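
A minimal sketch of this idea: train a redemption model on historical offers, score each customer against every promotion variant, and pick the argmax. The features, promotion labels, and data are all hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

PROMOS = ["low_spend_high_disc", "high_spend_low_disc", "flat_dollars_off", "free_shipping"]

# Hypothetical history: customer features + offered promo + redemption outcome.
hist = pd.DataFrame({
    "loyalty_tier": [2, 1, 3, 2, 1, 3, 2, 1],
    "avg_spend":    [40, 15, 90, 55, 20, 120, 35, 18],
    "promo":        ["flat_dollars_off", "free_shipping", "high_spend_low_disc",
                     "low_spend_high_disc", "flat_dollars_off", "high_spend_low_disc",
                     "free_shipping", "low_spend_high_disc"],
    "redeemed":     [1, 0, 1, 1, 0, 1, 1, 0],
})

X = pd.get_dummies(hist[["loyalty_tier", "avg_spend", "promo"]], columns=["promo"])
model = LogisticRegression(max_iter=1000).fit(X, hist["redeemed"])

def best_promo(loyalty_tier, avg_spend):
    """Score the customer against every promo variant; return the argmax."""
    rows = pd.DataFrame({"loyalty_tier": loyalty_tier, "avg_spend": avg_spend,
                         "promo": PROMOS})
    Xc = pd.get_dummies(rows, columns=["promo"]).reindex(columns=X.columns, fill_value=0)
    probs = model.predict_proba(Xc)[:, 1]
    return PROMOS[probs.argmax()]

print(best_promo(loyalty_tier=2, avg_spend=50))
```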

Customer Journey: Identifying high-value clients and unlocking growth opportunities

A study analyzing the global luxury goods market found that the top 2% of customers account for nearly 40% of sales. These elite customers, often referred to as Very Important Customers (VICs), have exceptionally high expectations and a low tolerance for anything less than stellar service. While meeting their demands poses a significant challenge, identifying potential VICs before they reach that status is equally complex. When such VICs are discovered early and nurtured properly, they can yield significant returns. We partnered with a global luxury retailer to explore how AI and data can make this process easier.

Using customer segmentation techniques, retailers can identify groups exhibiting similar purchasing behaviors. This process typically begins with an analysis of the recency, frequency, and monetary value of transactions. We can further refine these groups into micro-segments based on demographics, engagement levels, interests, etc. These methods provide valuable insights into customer behavior. Another vital source of information is the Customer Journey itself. This journey includes the sequence of interactions customers have with brands, sales associates, marketing campaigns, store visits, purchases, etc.
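
A minimal sketch of the RFM starting point on hypothetical transactions; real pipelines typically use quantile-based scores (e.g., pd.qcut into quintiles) on much larger data.

```python
import pandas as pd

# Hypothetical transaction log.
tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "date": pd.to_datetime(["2024-01-05", "2024-03-10", "2023-11-20",
                            "2024-02-01", "2024-02-20", "2024-03-15"]),
    "amount": [1200, 800, 150, 5000, 3200, 7000],
})

snapshot = tx["date"].max() + pd.Timedelta(days=1)
rfm = tx.groupby("customer_id").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),
    frequency=("date", "count"),
    monetary=("amount", "sum"),
)

# Simple rank-based scores (higher = better) purely for illustration.
rfm["R"] = rfm["recency"].rank(ascending=False, method="first").astype(int)
rfm["F"] = rfm["frequency"].rank(method="first").astype(int)
rfm["M"] = rfm["monetary"].rank(method="first").astype(int)
rfm["segment"] = rfm[["R", "F", "M"]].astype(str).agg("".join, axis=1)
print(rfm)
```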

Retailers can identify customers most likely to become VICs by building propensity models with features specific to their initial transactions. They can then differentiate VICs from other customers by examining factors like purchasing power (e.g., deviation between spend value and average spend >5X, changes in order value), purchase interest (e.g., preferred brands, product scarcity), engagement level (e.g., time between sales associate interactions and purchases, duration of engagement), and demographics. Assigning these potential VICs to top sales associates and creating personalized journey programs — tailored outreach channels, exclusive event invitations, product recommendations — helps retailers recognize and nurture these valuable relationships early.

Customer Journey details can also be used to better identify cross-sell and upsell opportunities. We collaborated with a global luxury retailer to explore this opportunity. Techniques such as Dynamic Time Warping (DTW) can help recognize customers who follow similar purchasing paths. Upon comparing these patterns with the ideal customer profile, retailers can identify products that are suitable for upselling and cross-selling.
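
For illustration, here is the classic dynamic-programming DTW distance on simple 1-D spend trajectories; the actual journey encoding used in such projects is richer than a single numeric series.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW on 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical monthly spend trajectories for three customers.
journey_a = [100, 250, 240, 900]       # ramps up quickly
journey_b = [90, 110, 260, 850, 880]   # similar shape, different pacing
journey_c = [500, 80, 60, 40]          # declining engagement

print(dtw_distance(journey_a, journey_b))  # small: similar paths
print(dtw_distance(journey_a, journey_c))  # large: dissimilar paths
```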

The performance of these models can be improved by adding constraints on the training data such as customers who have been part of the system for a certain period of time, made a minimum number of transactions, and purchased across brands/categories/sub-categories. The recommended products can also be refined further based on historical purchasing trends and favored brands. The retailer we partnered with saw a 2X increase in average revenue generated per customer using this cross-sell and upsell solution.

The luxury goods market is undergoing significant transformation as it adapts to changing customer expectations and demographics. By leveraging AI and data analytics, luxury retailers have the opportunity to exceed these expectations, offering personalized and seamless experiences that foster loyalty and drive growth. From refining clienteling practices to personalizing product recommendations and understanding the customer journey, AI can empower retailers to create more meaningful, long-lasting relationships with their customers. By embracing these technological advancements, luxury brands can navigate the complexities of this evolving market, turning challenges into opportunities and ensuring they remain at the forefront of customer satisfaction and operational efficiency.

References:

https://www.expertscoop.com/2020/05/paper-vs-digital-coupons.html
https://cdn.luxe.digital/download/Altagamma-Bain-Worldwide-Luxury-Market-Monitor-2022-luxe-digital.pdf
https://www.statista.com/outlook/cmo/luxury-goods/worldwide
https://www.mckinsey.com/industries/retail/our-insights/state-of-luxury

Beyond Bargains: 7 Powerful Ways Retail Leaders Can Use Generative AI to Level Up Their Retail Value Cycle
https://www.tigeranalytics.com/perspectives/blog/beyond-bargains-7-powerful-ways-retail-leaders-can-use-generative-ai-to-level-up-their-retail-value-cycle/
Thu, 25 Jan 2024 12:00:59 +0000

From maintaining uniform product descriptions and enhancing customer support with autonomous agents to developing virtual shopping assistants, simulating precise inventory data, tailoring personalized promotions, and more – here’s how retail players can leverage Generative AI all year round for a higher return on investment.

Retail experts enthusiastically agree that the outlook for Generative AI is optimistic. Accenture’s Technology Vision 2023 research found that 96% of retail executives say they are extremely inspired by the new capabilities offered by foundation models.

The scope for Generative AI to transform the retail value chain goes beyond forecasting and managing customer demand during major shopping seasons – although those are significant milestones in every organization’s retail calendar. Its real potential lies in tapping generative capabilities to reshape the entire customer journey.

From sales records to customer preferences, retail brands are data goldmines. By fusing foundational language models with this wealth of information, retailers can harness Generative AI to craft personalized shopping experiences or improve business processes like never before:

  • Customer support and assistants through improved LLM-based chatbots
  • Intelligent search and summarization for inquiries and sales
  • Consistent product descriptions generated through AI
  • Synthetic inventory data generation to simulate supply chains
  • Streamlined product development processes
  • Label generation with enhanced accuracy
  • Personalized promotions through text and image generation

Building and operationalizing a bespoke solution using foundational AI models requires several components and enablers to be successful. Components for prompt engineering, dialog management, and information indexing are necessary to extract the best out of an LLM. When coupled with various NLP accelerators such as document parsing, speech-to-text, and text embedding, an end-to-end solution can be developed and deployed.

At Tiger Analytics, we’ve worked with various retail clients to elevate retail CX, workforce productivity, and CRM with an AI/ML-driven customer data garden, streamlining and automating targeting models. Here are our observations:

Streamlining Product Descriptions for Better Consistency and Cost Savings

Writing product descriptions for an entire catalog is a time-intensive and manual activity for most retailers. Add to this the variations in tone and writing style across different departments and countries, and it becomes a difficult problem to solve. Retailers need to ensure that descriptions are relevant and concise to facilitate more conversions. They also need to keep the writing consistent across their e-commerce portals, campaigns, and digital content.

Generative AI can make this process smoother while being more cost-efficient. Customized LLMs can be trained to generate automated descriptions based on product attributes and specifications. Content can be standardized to the company’s style and tone for use across media. For retailers with an evolving product portfolio, this becomes a more scalable way to write and maintain product descriptions. Such a solution can be developed by fine-tuning a foundational LLM such as GPT, T5, or BART with annotated product data, product catalogs, and relevant SEO keywords. By incorporating human feedback, the descriptions can be further tailored to specific styles and needs.
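
The route described above involves fine-tuning; as a lighter-weight illustration of the attribute-to-description flow, the sketch below prompts a hosted LLM through OpenAI’s chat completions API. The model name, style guide, and attribute fields are all assumptions, not the production setup.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

STYLE_GUIDE = "Concise, warm, brand-consistent tone. Max 60 words. No superlatives."

def describe(product: dict) -> str:
    """Turn a dict of product attributes into a styled description."""
    attrs = ", ".join(f"{k}: {v}" for k, v in product.items())
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": f"You write product descriptions. {STYLE_GUIDE}"},
            {"role": "user", "content": f"Write a description for a product with: {attrs}"},
        ],
    )
    return resp.choices[0].message.content

print(describe({"name": "Trail Runner X", "category": "running shoe",
                "material": "recycled mesh", "drop": "6mm"}))
```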

Illustration for Consistent Product Description Solution

Customer Support with Better Understanding and Efficiency

The biggest problem with chatbots before LLMs was that they could not converse in natural language. This led to frustrating experiences for users who would eventually give up on using the bot. As many of the bots in the past were not well-linked with human agents, it led to low customer satisfaction and churn.

LLMs are a perfect solution to this problem. Their strength lies in generating natural language conversational text. They are also good at summarizing vast amounts of text into concise and understandable content. To develop a customer assist solution that works, retailers can deploy LLMs in key parts of the process:

  1. Converting user speech to text
  2. Summarizing the user query
  3. Relaying summarized information to the user
  4. Helping support agents query large amounts of information and generate concise responses.

LLMs need to be used in conjunction with components such as dialog management, and must work on top of issue, order, and product data to deliver contextual responses to user queries. Thanks to the advanced context-retention capabilities of LLMs, conversations can progress naturally with continuity, allowing for in-depth dialogue over an extended interaction, while the context of the user’s query is inferred clearly. This enhances customer interaction dramatically and can make the entire support process both effective and cost-efficient.
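
As a small illustration of step 2 above, the sketch below condenses a long customer query with an off-the-shelf summarization pipeline. The model choice and ticket text are illustrative; a production assistant would add dialog management plus issue and order context.

```python
from transformers import pipeline

# Illustrative model; any seq2seq summarizer works the same way.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

ticket = (
    "Hi, I ordered the foundation in shade 220 two weeks ago. It arrived "
    "damaged, the pump is broken, and I already emailed twice with photos "
    "but got no reply. I need either a replacement before the 20th or a "
    "refund to my original payment method."
)

summary = summarizer(ticket, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])  # concise handoff for the support agent
```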

Illustration for Customer Assist Solution

Enhanced Sales and Customer Engagement with Virtual Shopping Assistant

Generative AI has the potential to personalize the customer journey across various touchpoints, creating a seamless and engaging experience. Imagine a shopper browsing through an online store, encountering suggestions that not only match their preferences but also anticipate their desires. The assistant doesn’t merely suggest; it understands, learns, and grows with the customer. By leveraging cross-category targeting and Next Best Action (NBA) strategies for existing customers, the assistant becomes a companion in the shopping adventure, guiding with insight and relevance.

Illustration for Virtual Shopping Assistant Solution

Beyond mere navigation and suggestions, the Virtual Shopping Assistant can also be leveraged as a smart chatbot to answer any product-related questions while browsing the website. To bring this vision to life, Generative AI can be customized and fine-tuned using detailed product catalogs, customer interaction data, and behavioral insights. By incorporating human feedback and integrating it with existing systems, the Virtual Shopping Assistant can be molded to reflect the retailer’s brand, tone, and values.

Synthetic Inventory Data Generation Boosting Agility and Insight

Managing inventory data is a complex and time-consuming task for retailers, often fraught with inconsistencies and challenges in scaling. Large Language Models (LLMs) can analyze extensive inventory data, identifying trends and patterns. This allows for the creation of realistic and relevant synthetic data without revealing sensitive information, providing both privacy and comprehensive testing capabilities.

With LLMs, retailers can gain control over the data generation process, enabling augmentation and diverse scenario creation. By fine-tuning LLMs with actual inventory data and incorporating human feedback, retailers can craft a system that aligns with their unique requirements. Generative AI’s ability to produce synthetic inventory data is not just a technological advancement; it’s a strategic asset that empowers retailers to be more agile, insightful, and effective.
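
One hedged way to realize this is to few-shot prompt an LLM with a handful of non-sensitive, schema-conformant seed rows and ask for more rows in the same shape. Everything below, from the model name to the inventory schema, is illustrative.

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# Non-sensitive seed rows defining the schema; values are illustrative.
seed_rows = [
    {"sku": "SKU-1001", "store": "S-07", "on_hand": 42, "weekly_sales": 11, "lead_time_days": 6},
    {"sku": "SKU-1002", "store": "S-07", "on_hand": 3,  "weekly_sales": 9,  "lead_time_days": 14},
]

prompt = (
    "Generate 20 additional synthetic inventory rows as a JSON array, matching "
    "this schema and keeping values statistically plausible (include some "
    "stockout-risk and overstock cases):\n" + json.dumps(seed_rows)
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative
    messages=[{"role": "user", "content": prompt}],
)

# In practice, enforce structured output and validate the schema before use.
synthetic = json.loads(resp.choices[0].message.content)
```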

Illustration for Synthetic Inventory Data Solution

Quick and Market-Aligned Product Generation

In the realm of retail, manual product development is a time-consuming and resource-intensive process. The challenges extend from heavy reliance on manual efforts by designers and stakeholders to the uncertainty in market success due to fluctuating customer demand, competition, and trends. The future state of product generation, however, offers transformative possibilities. By automating concept creation, design exploration, and prototyping, retailers can accelerate product development. This shift towards data-driven decision-making and key metrics identification further refines design choices and mitigates market risk.

Illustration for Product Generation Solution

The journey from concept to product can be streamlined through AI-driven stages such as generating product concepts, evaluating, refining, and iterating designs, and prototyping and testing. By leveraging customer data and market insights, retailers can create products that truly resonate with their audience. The ability to fine-tune the development process with actual market insights and human feedback aligns product creation with customer demand. This empowers retailers to be more innovative, efficient, and aligned with the ever-changing market landscape.

Generating Labels with Enhanced Accuracy, Brand Consistency, and Compliance

In the current retail landscape, generating labels is a labor-intensive process, marked by time-consuming efforts from graphic designers and product managers. Limited customization, error-prone procedures, and numerous iterations not only hinder efficiency but also pose risks to accuracy and compliance. This complexity impacts both time and flexibility, making label design a challenging task.

The future, however, presents an exciting transformation. Leveraging AI for rapid iterations, customization, and consistency opens doors to significant time and resource savings. The ability to offer scalability for large catalogs, ensure accuracy, maintain brand consistency, and comply with regulations is more than an efficiency gain; it’s a strategic advantage. By automating the design process and focusing on the creative aspects of label design, retailers can elevate their brand’s identity and engage with their audience in a more meaningful way.

Illustration for Generating Labels Solution

Personalized Promotion for Enhanced Customer Engagement

Creating personalized promotions has traditionally been a manual, error-prone process. Manual analysis and segmentation of customer data can lead to limited insights, inefficient promotion design, and static promotions that lack relevance. The challenges in uncovering subtle customer preferences make it difficult to deliver truly personalized experiences.

The future state of personalized promotion, driven by AI, offers a transformative approach. Automated customer segmentation, real-time personalization, and adaptive promotions bring accuracy and dynamism. This shift not only improves efficiency and maximizes ROI but also ensures a seamless and cohesive customer experience throughout the shopping journey. By focusing on real-time insights and multichannel personalization, retailers can connect with customers in more meaningful ways, enhancing engagement and loyalty.

Illustration for Personalized Promotion Solution

The emergence of Generative AI in retail signals a transformative era, offering immense potential to enhance every aspect of the retail value cycle. From creating more engaging customer experiences to optimizing supply chain management, the applications are vast and varied. Retail leaders who leverage these technologies can significantly improve operational efficiencies, personalize customer interactions, and stay agile in a dynamically evolving market. By harnessing the power of Generative AI, retailers are not just adapting to current trends; they are actively shaping the future of retail, paving the way for innovative approaches and sustainable growth in an increasingly digital world. Now is a pivotal moment for industry leaders to explore and invest in these advanced capabilities, ensuring they remain at the forefront of retail innovation and excellence.

A Pharma Leader’s Guide to Driving Effective Drug Launches with Commercial Analytics
https://www.tigeranalytics.com/perspectives/blog/a-pharma-leaders-guide-to-driving-effective-drug-launches-with-commercial-analytics/
Wed, 10 Jan 2024 10:16:59 +0000

Learn how pharma leaders can leverage Tiger Analytics’ Commercial Analytics engine to successfully launch new drugs in the market through enhanced data-driven insights and decision-making.

For a Pharmaceutical company, launching a drug represents the culmination of extensive research and development efforts. Across the typical stages of a drug launch – planning the launch, the launch itself, and post-launch drug lifecycle management – Data Analytics can guide pharmaceutical companies to leverage the power of data-driven insights and strategic analysis. How does this help? According to research, for 85% of pharmaceutical launches, the product trajectory is set in the first six months.

Real-time analytics enables informed decision-making, enhanced patient outcomes, and creates a competitive edge for the drug in the ever-evolving Healthcare industry. A data-driven approach across the drug lifecycle ensures that the drug launch is not just a milestone, but a stepping stone towards improved healthcare and a brighter future.

5 Benefits of a Data-Driven Drug Launch

How can Pharma leaders benefit from a data-driven launch? We’ve put together a few of our observations here:

1. Precise Patient Targeting
Begin by identifying the most promising patient segments through comprehensive data analysis. By integrating electronic health records, prescription data, and demographic information, you can pinpoint the specific patient populations that will benefit most from your drug (see the segmentation sketch after this list). Tailor your messaging and outreach to address their unique needs and preferences.

2. Segmented Marketing Strategies
Develop personalized marketing strategies for each identified patient segment. Utilize commercial analytics to understand the distinct characteristics of these segments and create tailored campaigns that resonate with their concerns. This approach enhances engagement and encourages a deeper connection between patients and your product.

3. Tactical Pricing Optimization
Determine the optimal pricing strategy for your drug by analyzing market dynamics, competitor pricing, and patient affordability. Commercial analytics helps strike the right balance between maximizing revenue and ensuring accessibility. Data-driven pricing decisions also enhance negotiations with payers and reimbursement discussions.

4. Multi-channel Engagement
Leverage commercial analytics to identify the most effective communication channels for reaching healthcare professionals and patients. Analyze historical prescription patterns and physician preferences to allocate resources to the channels that yield the highest impact. This approach ensures that your message reaches the right stakeholders at the right time.

5. Continuous Performance Monitoring
The launch doesn’t end on the day of the launch — it’s a continuous process. Utilize real-time data analytics to monitor your drug’s performance in the market. Track metrics such as prescription volume, market share, and patient feedback. This information helps you adapt your strategies as needed and capitalize on emerging opportunities.
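
As referenced under point 1, here is a minimal patient-segmentation sketch that clusters hypothetical integrated features with k-means; the feature set and values are invented for illustration.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical patient-level features integrated from EHR, Rx, and demographics.
patients = pd.DataFrame({
    "age":           [34, 61, 47, 72, 29, 55],
    "comorbidities": [0, 3, 1, 4, 0, 2],
    "rx_count_12m":  [2, 14, 6, 18, 1, 9],
    "prior_therapy": [0, 1, 1, 1, 0, 1],  # 1 = previously tried an analog drug
})

X = StandardScaler().fit_transform(patients)
patients["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Each segment can then receive tailored messaging and outreach.
print(patients.groupby("segment").mean().round(1))
```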

Enabling a 360-Degree View of Pharma Drug Launch with Commercial Analytics

At Tiger Analytics, we developed a Data Analytics solution, tailored to meet the specific requirements of our clients in the Pharmaceutical industry. Our Commercial Analytics engine powers a suite of data-driven analytical interventions throughout the lifecycle of a drug. It serves as a bridge between goals and actionable insights, effectively transforming raw data into strategic decisions. The solution supports pre-launch patient segmentation and provider engagement. It also aids in launch-stage payer analytics and pharmacy optimization. Lastly, it enables post-launch patient journey analysis and outcomes assessment – giving Pharma leaders a 360-degree view of the entire launch cycle.

Here’s how it works:

Pre-Launch: Setting the Stage for Success

In this stage, the goal is to lay a strong foundation for success by developing the value proposition of the drug. Clinical teams, data strategists, and market researchers collaborate to assess the drug’s commercial potential and create a strategy to realize it. To begin, comprehensive surveys and market research are conducted to gain insights into healthcare professional (HCP) behavior, competitor analysis, patient profiles, packaging analysis, price comparison, and sales benchmarks. These analyses shape the roadmap for the drug’s performance and enable the exploration of various scenarios through forecasting exercises. Patient profiling and segmentation strategies are also implemented to devise effective marketing and engagement strategies.

From Action to Impact

To drive tangible results, at Tiger Analytics we orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Conducted a comprehensive analysis of analog drugs in the market and performed market scoping along with other forecasting exercises to understand the potential impact of the new drug once launched.
  • Analyzed survey results and developed a tool to assess the possible effectiveness of the drug in real-world scenarios.
  • Formulated multiple scenario analyses to account for unprecedented events and their potential impact on the demand for the drug.

How did the solutions help?

  • Provided a clear view of the expected market landscape through market sizing.
  • Prepared the pharma company for unknown events through scenario analysis.
  • Facilitated target adjustment and improved planning by forecasting numbers.

Launch: Strategic Payer Engagement in a Complex Landscape

During the drug launch, the focus shifts to accelerating drug adoption and reducing the time it takes to reach peak sales. At this juncture, analytics plays a crucial role in optimizing market access and stakeholder engagement (payers, prescribers, and patients). By analyzing payer data, claims information, and reimbursement policies, pharmaceutical companies gain insights for strategic decision-making, including formulary inclusion, pricing strategies, and reimbursement trends. These insights enable effective negotiations with payers, ensuring optimal coverage and patient access to the medication.

Monitoring sales and identifying early and late adopters among HCPs and patients enables targeted marketing activities and tailored promotional initiatives. This approach effectively propels successful market penetration.

From Action to Impact

To drive tangible results, we, at Tiger Analytics, orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Implemented a robust email marketing campaign, targeting the identified early adopter HCPs.
  • Monitored HCP engagement and response to emails using advanced analytics and tracking tools.
  • Leveraged predictive models to conduct real-time analysis of promotional activities, optimizing their effectiveness and making data-driven adjustments.

How did the solutions help?

  • Achieved a 15% increase in HCP engagement and response rates.
  • Real-time analysis led to a 10% improvement in effectiveness.

Post-Launch: Empowering Patient-Centric Care

Post-launch analytics focuses on monitoring the market and adapting to market dynamics (competition, regulations, reimbursements, etc.) to extend the drug’s lifecycle. Advanced analytics also enables understanding a patient’s journey and optimizing the person’s medication adherence. By leveraging real-world data, electronic health records, and patient-reported outcomes, pharmaceutical companies gain invaluable insights into patient behavior, adherence rates, and treatment patterns. These insights facilitate the development of personalized interventions, patient support programs, and targeted educational campaigns to enhance patient adherence and improve treatment outcomes. Additionally, continuous tracking of the medication’s performance, market share, and patient-reported outcomes enables pharmaceutical companies to make data-driven decisions, generate evidence for stakeholders, and drive ongoing improvements in patient care.
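
In practice, adherence monitoring often reduces to a metric such as proportion of days covered (PDC). A minimal sketch on hypothetical pharmacy-fill data:

```python
import pandas as pd

# Hypothetical fill history for one patient.
fills = pd.DataFrame({
    "fill_date": pd.to_datetime(["2024-01-01", "2024-02-05", "2024-03-20"]),
    "days_supply": [30, 30, 30],
})

period_start = pd.Timestamp("2024-01-01")
period_end = pd.Timestamp("2024-04-30")

# Mark each calendar day covered by at least one fill.
covered = pd.Series(False, index=pd.date_range(period_start, period_end))
for _, f in fills.iterrows():
    end = f["fill_date"] + pd.Timedelta(days=int(f["days_supply"]) - 1)
    covered.loc[f["fill_date"]:min(end, period_end)] = True

pdc = covered.mean()
print(f"PDC = {pdc:.0%}")  # 'adherent' is commonly defined as PDC >= 80%
```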

From Action to Impact

To drive tangible results, we, at Tiger Analytics, orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Utilized real-world data and electronic health records to track patient behavior and medication adherence.
  • Conducted in-depth analysis of patient-reported outcomes to gain insights into treatment patterns and efficacy.
  • Developed personalized interventions and patient support programs, based on the identified patterns and behaviors.

How did the solutions help?

  • Improved medication adherence by 25%.
  • Achieved a 30% increase in patient satisfaction and treatment compliance.

For Pharmaceutical companies, the goal of a successful drug launch is not only about accelerating the medicine’s time to market; it is also about ensuring patient awareness and access to life-saving drugs. By leveraging the power of data to fuel AI-enabled drug launches, we’ll continue to see better medication adherence, satisfied patients, and compliance with treatments – which will ultimately lead to better health outcomes.

Why India-Targeted AI Matters: Exploring Opportunities and Challenges
https://www.tigeranalytics.com/perspectives/blog/need-india-centric-ai/
Wed, 11 May 2022 13:42:19 +0000

The scope for AI-focused innovation is tremendous, given India’s status as one of the fastest-growing economies with the second-largest population globally. Explore the challenges and opportunities for AI in India.

To understand the likely impact of India-centric AI, one needs to appreciate the country’s linguistic, cultural, and political diversity. Historically, India’s DNA has been so heterogeneous that extracting clear perspectives and actionable insights to address past issues, current challenges, and moving towards our vision as a country would be impossible without harnessing the power of AI.

The scope for AI-focused innovation is tremendous, given India’s status as one of the fastest-growing economies with the second-largest population globally. India’s digitization journey and the introduction of the Aadhaar system in 2010 – the largest biometric identity project in the world – have opened up new avenues for AI and data analytics. The interlinking of Aadhaar with banking systems, the PDS, and several other transaction systems allows greater visibility, insights, and metrics that can be used to bring about improvements. Besides using these to raise the quality of citizens’ lives while alleviating disparities, AI can support more proactive planning and formulation of policies and roadmaps. Industry experts concur that AI could trigger an economic growth spurt, opining that “AI can help create almost 20 million jobs in India by 2025 and add up to $957 billion to the Indian economy by 2035.”

The current state of AI in India

The Indian government, having recently announced the “AI for All” strategy, is more driven than ever to nurture core AI skills to future-proof the workforce. This self-learning program looks to raise awareness levels about AI for every Indian citizen, be it a school student or a senior citizen. It targets meeting the demands of a rapidly emerging job market and presenting opportunities to reimagine how industries like farming, healthcare, banking, education, etc., can use technology. A few years prior, in 2018, the government had also increased its funding towards research, training, and skilling in emerging technologies by 100% as compared to 2017.

The booming interest is reflected in the mushrooming of boutique start-ups across the country as well. With a combined value of $555 million, the segment is more than double the previous year’s figure of $215 million. Interestingly, analytics-driven products and services contribute a little over 64% of this market – clocking over $355 million. In parallel, larger enterprises are taking quantum leaps to deliver AI solutions too. Understandably, a large number of them use AI solutions to improve efficiency, scalability, and security across their existing products and services.

Current challenges of making India-centric AI

There is no doubt that AI is a catalyst for societal progress through digital inclusion. And in a country as diverse as India, this can set the country on an accelerated journey toward socio-economic progress. However, the social, linguistic, and political diversity that is India also means more complex data models are needed before they can be gainfully deployed within this landscape. For example, NLP models would have to adapt to text/language changes within a span of just a few miles! And this is just the tip of the iceberg as far as the challenges are concerned.

Let’s look at a few of them:

  • The deployment and usage of AI have been (and continue to be) severely fragmented, without a transparent roadmap or clear KPIs to measure success. One reason is the lack of a governing body or a panel of experts to regulate, oversee, and track the implementation of socio-economic AI projects at a national level. But there’s no avoiding this challenge, considering that the implications of AI policy-making on Indian societies may be irreversible.
  • The demand-supply divide in India for AI skills is huge. Government initiatives such as Startup India, as well as the boom in AI-focused startups, have only contributed to extending this divide. The pace of training a workforce to cater to the needs of the industry is accelerating but unable to keep up with the industry’s growth trajectory. Large, traditionally run institutions are also embracing AI-driven practices, having witnessed the competitive advantage they bring to businesses. This has added to the scarcity of good-quality talent to serve today’s demand.
  • The lack of data maturity is a serious roadblock on the path to establishing India-centric AI initiatives – especially with quite a few region-focused datasets currently unavailable. There is also a parity issue, with quite a few industry giants having access to large amounts of data compared to the government, let alone start-ups. Added to this are the challenges of data quality and of establishing a single source of truth for AI model development.
  • Even the fiercest AI advocates would admit that its security challenges are nowhere close to being resolved. Security and compliance governance protocols need to be region-specific so that unique requirements are met, and yet retain the generalizability required to rationalize these models at the national level.
  • There is also much ongoing debate at a global level on defining the boundaries that ethical AI practices will need to lean on. Given India’s diversity, this challenge is magnified many times over.

Niche areas where AI is making an impact

Farming

The role of AI in modern agricultural practices has been transformational – this is significant given that more than half the population of India depends on farming to earn a living. In 2019-2020 alone, over $1 billion was raised to fuel agriculture-food tech start-ups in India. It has helped farmers generate steadier income by managing healthier crops, reducing the damage caused by pests, tracking soil and crop conditions, improving the supply chain, eliminating unsafe or repetitive manual labor, and more.

Healthcare

Indian healthcare systems come with their own set of challenges – from accessibility and availability to quality and poor awareness levels. But each one represents a window of opportunity for AI to be a harbinger of change. For instance, AI-enabled platforms can extend healthcare services to low-income or rural areas, train doctors and nurses, address communication gaps between patients and clinicians, etc. Government-funded projects like NITI Aayog and the National Digital Health Blueprint have also highlighted the need for digital transformation in the healthcare system.

BFSI

The pandemic has accelerated the impact of AI on the BFSI industry in India, with several key processes undergoing digital transformation. The mandatory push for contactless remote banking experience has infused a new culture of innovation in mission-critical back-end and front-end operations. A recent PwC-FICCI survey showed that the banking industry has the country’s highest AI maturity index – leading to the deployment of the top AI use cases. The survey also predicted that Indian banks would see “potential cost savings up to $447 billion by 2023.”

E-commerce

The Indian e-commerce industry has already witnessed big numbers thanks to AI-based strategies, particularly marketing. For retail brands, capturing market share is among the toughest worldwide – with customer behavior being driven by a diverse set of values and expectations. By using AI and ML technologies – backed by data science – it would be easier to tap into multiple demographics without losing the context of messaging.

Manufacturing

Traditionally, the manufacturing industry has been running with expensive and time-consuming manually driven processes. Slowly, more companies realize the impact of AI-powered automation on manufacturing use cases like assembly line production, inventory management, testing and quality assurance, etc. While still at a nascent stage, AR and VR technologies are also seeing adoption in this sector in use cases like prototyping and troubleshooting.

3 crucial data milestones to achieve in India’s AI journey

1) Unbiased data distribution

Forming India-centric datasets starts with a unified framework across the country so that no region is left uncovered. This framework needs to integrate with other systems/data repositories in a secure and seamless manner. Even private companies can share relevant datasets with government institutions to facilitate strategy and policy-making.

2) Localized data ownership

In today’s high-risk data landscape, transferring ownership of India-centric information to companies in other countries can lead to compliance and regulatory problems. Especially when dealing with industries with healthcare or public administration, it is highly advised to maintain data control within the country’s borders.

3) Data ethics and privacy

Data-centric solutions that work towards improving human lives require a thorough understanding of personal and non-personal data, matters of privacy, and infringement, among others. Managing this information responsibly takes the challenge beyond the realm of deploying a mathematical solution. Building an AI mindset that raises difficult questions about ethics, policy, and law, and ensures sustainable solutions with minimized risks and negative impact, is key. Plus, data privacy should continue to be a hot-button topic, with an uncompromising stance on safeguarding the personal information of Indian citizens.

Final thoughts

India faces a catch-22 situation, with one side of the country still holding to its age-old traditions and practices while the other embraces technological change, be it using UPI transfers, QR codes, or even the Aarogya Setu app. But the sheer size and diversity of languages, cultures, and politics dictate that AI will find no shortage of areas in which to make a profound impact, nor any shortage of challenges along the way.

As mentioned earlier, the thriving startup growth adds a lot of fuel to AI’s momentum. From just 10 unicorns in India in 2018, we have grown to 38, and this number is expected to increase to 62 by 2025. In 2020, AI-based Indian startups received over $835 million in funding and are propelling growth few countries can compete with. AI is a key vehicle to ring in the dawn of a new era – an India which, despite its diversity and complex landscape, leads the way in the effective adoption of AI.

This article was first published in Analytics India Magazine.

Data-Driven Disruption? How Analytics is Shifting Gears in the Auto Market
https://www.tigeranalytics.com/perspectives/blog/data-analytics-led-disruption-boon-automotive-market/
Thu, 24 Mar 2022 12:43:31 +0000

The presence of legacy systems, regulatory compliance issues and sudden growth of the BEV/PHEV market are all challenges the automotive industry must face. Explore how Analytics can help future-proof their growth plans.

In an age when data dictates decision-making, from cubicles to boardrooms, many auto dealers worldwide continue to draw insights from past experiences. However, the automotive market is ripe with opportunities to leverage data science to improve operational efficiency, workforce productivity, and consequently – customer loyalty.

Data challenges faced by automotive dealers

There are many reasons why auto dealers still struggle to collect and use data. The biggest one is the presence of legacy systems that bring entangled processes with disparate data touchpoints. This makes it difficult to consolidate information and extract clean, structured data, especially when there are multiple repositories. More importantly, dealers are unable to derive actionable insights that would improve their decision-making, leaving them to rely on gut instinct instead.

In addition, the sudden growth of the BEV/PHEV market has complicated matters, bringing increasing pressure on regulatory compliance.

But the reality is that future-ready data management is a must-have strategy, not just to thrive but even to survive in today’s automotive market. OEMs apply pressure on one side of the spectrum, expecting more cost-effective vehicle pricing models to establish footprints in smaller or hyper-competitive markets. On the other side, modern customers are making it abundantly clear that they will no longer tolerate broken, inefficient, or repetitive experiences. And for businesses with brands operating in different parts of the world, data management can be a nightmarishly time-consuming and complex journey.

Future-proofing the data management strategy

Now, it’s easier said than done for automotive players to go all-in on adopting a company-wide data mindset. It is prudent to take an incremental, data-driven approach to digital transformation that modernizes in phases. Walking away from legacy systems with entangled databases means you must be assured of hassle-free deployment and scalability. It helps greatly to prioritize which markets/OEMs/geographies you want to target first, with data science by your side.

Hence, the initial step is to assess the current gaps and challenges to get a clear picture of what needs to be fixed first and where to go from there. Another key step in the early phase is bringing in the right skill sets to build a future-proofed infrastructure and start streamlining the overall flow of data.

It is also important to establish a CoE model to globalize data management from day zero. In the process, a scalable data pipeline should be built to consolidate information from all touchpoints across all markets and geographies. This is a practical way to ensure that you have an integrated source of truth that churns out actionable insights based on clean data.

You also need to create a roadmap so that key use cases can be detected with specific markets identified for initial deployment. But first, you must be aware of the measurable benefits that can be unlocked by tapping into the power of data.

  • Better lead scoring: Identify the leads most likely to purchase a vehicle and ensure targeted messaging (see the sketch after this list).
  • Smarter churn prediction: Identify aftersales customers with high churn propensity and send tactical offers.
  • Accurate demand forecasting: Reduce inventory days, avoid out-of-stock items, and minimize promotional costs.
  • After-sales engagement: Engage customers even after the initial servicing warranty is over regarding repairs, upgrades, etc., as well as supporting an effective parts pricing strategy.
  • Sales promo assessment: Analyze historical sales data, seasonality/trends, competitors, etc., to recommend the best-fit promo.
  • Personalized customer engagement: Customize interactions with customers based on data-rich actionable intelligence instead of unreliable human instincts.
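
To make the lead-scoring idea above concrete, here is a minimal sketch of a propensity model in Python with scikit-learn. The file name, feature columns, and label are hypothetical stand-ins for what a dealer’s CRM might capture; a real model would add proper feature engineering and validation.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical CRM extract: one row per lead, with a historical purchase label.
leads = pd.read_csv("leads.csv")
features = ["num_showroom_visits", "num_website_sessions",
            "test_drive_taken", "days_since_first_enquiry"]
X, y = leads[features], leads["purchased"]  # purchased: 1 = bought a vehicle

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score open leads so sales teams contact the highest-propensity ones first.
leads["score"] = model.predict_proba(X)[:, 1]
hot_leads = leads.sort_values("score", ascending=False).head(100)
```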

How we helped Inchcape disrupt the automotive industry

When Tiger Analytics began the journey with Inchcape, a leading global automotive distributor, we knew that it was going to disrupt how the industry tapped into data. Fast-forward a year, and we were thrilled to take home Microsoft’s ‘Partner of the Year 2021’ award in the Data & AI category. What started as a small-scale project grew into one of the largest APAC-based AI and Advanced Analytics projects. We believe this project has been a milestone moment for the automotive industry at large. If you’re interested in how our approach raised the bar in a market notorious for low data adoption, please read our full case study.

The post Data-Driven Disruption? How Analytics is Shifting Gears in the Auto Market appeared first on Tiger Analytics.

When Opportunity Calls: Unlocking the Power of Analytics in the BPO Industry https://www.tigeranalytics.com/perspectives/blog/role-data-analytics-bpo-industry/ https://www.tigeranalytics.com/perspectives/blog/role-data-analytics-bpo-industry/#comments Thu, 27 Jan 2022 10:26:37 +0000 https://www.tigeranalytics.com/?p=6933 The BPO industry has embraced analytics to optimize profitability, efficiency, and customer satisfaction. This blog delves into the specifics of data utilization, unique challenges, and key business areas where analytics can make a difference.

Around 1981, the term ‘outsourcing’ entered our lexicon. Two decades later came the BPO boom in India, China, and the Philippines, with every street corner magically sprouting call centers. Now, in 2022, the industry is transitioning into an era of analytics, aiming to harness its sea of data for profitability, efficiency, and improved customer experience.

In this blog, we delve into details of what this data is, the unique challenges it poses, and the key business areas that can benefit from the use of analytics. We also share our experiences in developing these tools and how they have helped our clients in the BPO industry.

The Information Ocean

The interaction between BPO agents and customers generates huge volumes of both structured and unstructured (text, audio) data. On the one hand, you have the call data that measures metrics such as the number of incoming calls, time taken to address issues, service levels, and the ratio of handled vs abandoned calls. On the other hand, you have customer data measuring satisfaction levels and sentiment.

Insights from this data can deliver significant value for your business, whether through improved call resolution, reduced call time & volume, agent & customer satisfaction, operational cost reduction, growth opportunities through cross-selling & upselling, or increased customer delight.
The trick is to find the balance between demand (customer calls) and supply (agents). An imbalance can lead to revenue losses and cost inefficiencies, and it is a balance that processes and technology must help maintain.

Challenges of Handling Data

When you are handling such large volumes of data, the challenges can be myriad.
Our clients wage a daily battle with managing these vast volumes, harmonizing internal and external data, and driving value through them. For those that have already embarked on their analytics journey, the primary goals are establishing the continued relevance of what they built, driving scalability, and leveraging new-age predictive tools to drive ROI.

Delivering Business Value

Based on our experience, the business value delivered by advanced analytics in the BPO industry is unquestionable and extensive, and it primarily influences these key areas:

1) Call Management

Planning agent resources based on demand (peak and off-peak) and skill sets, while accounting for how long agents take to resolve issues, has a direct impact on business costs. AI can automate and optimize this process. We have built an automated, real-time scheduling and resource optimization tool that helped one of our BPO clients reduce costs by 15%.
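
To illustrate the staffing side of this, the classic Erlang C queueing formula is a widely used textbook way to size an agent pool against call demand. The sketch below is a generic calculation, not the proprietary tool described above:

```python
import math

def erlang_c(agents: int, arrival_rate: float, avg_handle_time: float) -> float:
    """Probability that an incoming call has to wait (classic Erlang C)."""
    load = arrival_rate * avg_handle_time  # offered load in Erlangs
    if agents <= load:
        return 1.0  # unstable queue: effectively every call waits
    top = (load ** agents / math.factorial(agents)) * (agents / (agents - load))
    bottom = sum(load ** k / math.factorial(k) for k in range(agents)) + top
    return top / bottom

def agents_needed(arrival_rate: float, avg_handle_time: float,
                  max_wait_prob: float = 0.2) -> int:
    """Smallest headcount that keeps the wait probability under the target."""
    n = max(1, math.ceil(arrival_rate * avg_handle_time))
    while erlang_c(n, arrival_rate, avg_handle_time) > max_wait_prob:
        n += 1
    return n

# Example: 100 calls/hour with a 6-minute (0.1-hour) average handle time.
print(agents_needed(arrival_rate=100, avg_handle_time=0.1))
```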

2) Customer Experience

Call center analytics give agents access to critical data and insights to work faster and smarter, improve customer relationships and drive growth. Analytics can help understand the past behavior of a customer/similar customers and recommend products or services that will be most relevant, instead of generic offers. It can also predict which customers are likely to need proactive management. Our real-time cross-selling analytics has led to a 20% increase in revenue.

3) Issue Resolution

First-call resolution refers to the percentage of cases that are resolved during the first call between the customer and the call center. Analytics can automate the categorization of contact center data by building predictive models that capture the nuances of customer chats, enabling a better customer servicing model. This metric is extremely important because improving it helps reduce the customer churn rate.
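
As a small illustration of the categorization idea (not the production model), the sketch below trains a tiny text classifier on made-up chat snippets with scikit-learn; a real deployment would learn from thousands of labeled contact-center transcripts:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set; in practice, labels come from annotated chat/call logs.
chats = ["i was billed twice this month",
         "my internet keeps dropping",
         "how do i reset my password",
         "refund still not processed"]
labels = ["billing", "technical", "account", "billing"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(chats, labels)

# New contacts can now be categorized (and routed) automatically.
print(clf.predict(["charged incorrectly on my last invoice"]))
```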

4) Agent Performance

Analytics on call-center agents can help segment those who have a low resolution rate or spend too much time on minor issues, compared with top-performing agents. This helps the call center resolve gaps or systemic issues, identify agents with leadership potential, and create developmental plans to reduce attrition and increase productivity.

5) Call Routing

Analytics-based call routing is based on the premise that records of a customer’s call history or demographic profile can provide insight into which call center agent(s) has the right personality, conversational style, or combination of other soft skills to best meet their needs.

6) Speech Analytics

Detecting trends in customer interactions and analyzing audio patterns to read emotions and stress in a speaker’s voice can help reduce customer churn, boost contact center productivity, improve agent performance, and reduce costs by 25%. Our tools have helped clients predict member dissatisfaction, achieving a 10% reduction in first complaints and a 20% reduction in repeat complaints.

7) Chatbots and Automation

Thanks to the wonders of automation, we can now enhance the user experience by providing customers with personalized attention 24/7/365. Reduced average call duration and wage costs improve profitability. Self-service channels such as the help center, FAQ page, and customer portals empower customers to resolve simple issues on their own while deflecting cases away from agents. Our AI-enabled chatbots have helped strengthen engagement and deliver quicker resolutions for 80% of user queries.

Lessons from The Philippines

Recently, in collaboration with Microsoft, we conducted a six-week Data & Analytics Assessment for a technology-enabled outsourcing firm in the Philippines. The client was encumbered by complex ETL processes, resource bottlenecks on legacy servers, and a lack of UI for troubleshooting, leading to delays in resolution and latency issues. They engaged Tiger Analytics to assess their data landscape.

We recommended an Enterprise Data Warehouse modernization approach to deliver improved scalability & elasticity, strengthened data governance & security, and improved operational efficiency.

We did an in-depth assessment to understand the client’s ecosystem, key challenges faced, data sources, and their current state architecture. Through interactions with IT and business stakeholders, we built a roadmap for a future state data infrastructure that would enable efficiency, scalability, and modernization. We also built a strategic roadmap of 20+ analytics use cases with potential ROI across HR and contact center functions.

The New Era

Today, the Philippines is recognized as the BPO capital of the world. Competition will toughen, from both new players and existing ones, and a digital transformation is underway across the industry. Success in this competitive space lies with companies that can turn the huge volumes of data they hold into meaningful, actionable change.

The post When Opportunity Calls: Unlocking the Power of Analytics in the BPO Industry appeared first on Tiger Analytics.

Consulting with Integrity: ‘Responsible AI’ Principles for Consultants https://www.tigeranalytics.com/perspectives/blog/consulting-with-integrity-responsible-ai-principles-for-consultants/ Wed, 05 Jan 2022 15:22:35 +0000 https://www.tigeranalytics.com/?post_type=blog&p=19156 Third-party AI consulting firms engaged in multiple stages of AI development must point out any ethical red flags to their clients at the right time. This article delves into the importance of a structured ethical AI development process.

AI goes rogue and decimates or enslaves humanity: the internet is full of such horrendous fictional movies. That fictional AI risk may be far-fetched, but the current state of Narrow AI could soon have a profound impact on humanity. AI developers and leaders around the world have an ethical obligation toward society, and a responsibility to create systems that benefit society and the environment surrounding them.

AI could go wrong in many ways and have unintended consequences in the shorter or longer term. In one case, an AI algorithm was found to unintentionally reinforce racial bias when it predicted lower health risk scores for people of color. It turned out that the algorithm was using patients’ historical healthcare spending to model future health risks. As this bias perpetuates through the algorithm in operation, it becomes a disastrous self-fulfilling prophecy that deepens healthcare disparity.

In another incident, Microsoft had to bear the brunt when Tay — its millennial chatbot — engaged in trash talk on social media and had to be taken offline within 16 hours of going live.

Only the juiciest stories make the front page, but the ethical conundrum runs deep for any organization building AI-driven applications. Leading organizations have converged on a common set of core principles for the ethical development of AI: Fairness, Safety, Privacy, Security, Interpretability, and Inclusiveness. Numerous product-led companies champion the need for responsible AI with a human-centric approach. But these products are not built entirely by a single team. Often, multiple pre-packaged software products are combined to bring an AI use case to fruition. In other cases, specialized AI consulting companies bring in bespoke solutions, capabilities, datasets, or skill sets to complement the speed and scale of AI development.

As third-party AI consulting firms are involved in the various phases of AI development (data gathering and wrangling, model training and building, and finally model deployment and adoption), it is crucial for them to understand the reputational implications for their clients of even a mildly rogue AI. Without the right systems in place, AI development teams scramble to solve issues as they come, brewing a regulatory and humanitarian storm. It is therefore imperative for these consulting or vendor organizations to follow a defined process for ethical AI development. The salient points of such a process include:

1. Recognize and flag an AI ethical issue early.

We can solve ethical dilemmas only if we have the mechanisms to recognize them. A key first step in any AI ethical quandary is locating and isolating the ethical aspects of the issue. This involves educating employees and consultants alike to be sensitive to AI ethics. Experienced data modelers on the team should have the eye to identify violations of the core ethical principles in any of their custom-made solutions.

2. Documentation helps you trace unethical behavior.

Documenting how the key AI services operate, are trained, their performance metrics, fairness, robustness, and their systemic biases goes a long way in avoiding ethical digression. The devil is in the details, and the details are captured better by documentation.

3. Work in tandem with the client’s team to understand business-specific ethical risks within AI.

Similar industries share a theme across their AI risks. A healthcare or banking company must build extra guard rails around probable violations of privacy and security. E-commerce companies, pioneers in creating state-of-the-art recommendation engines, must keep their ears and eyes open to mitigate any kind of associative bias leading to stereotypical associations within certain populations. Identifying such risks narrows the search for probable violations.

4. Use an ethical framework like the Consequentialist Framework for an objective assessment of ethical decision-making.

A consequentialist framework evaluates an AI project by looking at its outcomes, helping teams deliberate on probable ethical implications. For example, a self-driving AI with even a remote possibility of failing to recognize pedestrians wearing face masks could prove fatal and should never make it to market.

5. Understand the trade-off between accuracy, privacy, and bias at different stages of model evaluation.

Data scientists must be cognizant that their ML models should be optimized not only for performance and accuracy but also for lower (unwanted) bias. As with any other non-binary decision, leaders should be aware of this trade-off. Fairness metrics and bias mitigation toolkits such as IBM’s AI Fairness 360 can be used to mitigate unwanted bias in datasets and models.
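
One widely used fairness metric, the disparate impact ratio, can be computed in a few lines of pandas. The sketch below is purely illustrative: the data is invented, and the 0.8 threshold is only a common rule of thumb, not a legal standard.

```python
import pandas as pd

# Invented scored population: model decisions plus a protected attribute.
df = pd.DataFrame({
    "approved": [1, 0, 1, 1, 0, 1, 0, 0],
    "group":    ["a", "a", "a", "a", "b", "b", "b", "b"],
})

rates = df.groupby("group")["approved"].mean()   # approval rate per group
disparate_impact = rates.min() / rates.max()
print(f"Disparate impact ratio: {disparate_impact:.2f}")  # < 0.8 often flags concern
```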

6. Incentivize open-source and white-box approaches.

An open-source and explainable AI approach is crucial to establishing trust between vendors and clients. It ensures that the system works as expected and that any anomaly can be traced back to the precise piece of code or data that introduced it. The ease of regulatory compliance that open-source approaches offer makes them a favorite in the financial services and healthcare sectors.

7. Run organizational awareness initiatives.

An excellent data scientist may not be aware of the ethical implication of autonomous systems. Organizational awareness, adequate training, and a robust mechanism to bring forth any AI risks should be inculcated into culture and values. Employees should be incentivized to escalate the tiniest of such situations. An AI ethics committee should be formed to provide broader guidance to on-ground teams regarding grey areas.

Final Thoughts

Each of these steps rests on smooth coordination between vendor and client teams united by a responsible, common vision. Vendors should not hesitate to bring forward any AI ethical risks they might be running for their clients. Clients, meanwhile, should involve their strategic vendors in such discussions and training. The whistleblowers for AI ethical risks will often be analysts and data scientists, yet they cannot flag these issues unless a top-down culture encourages them to look for them.

The post Consulting with Integrity: ‘Responsible AI’ Principles for Consultants appeared first on Tiger Analytics.

Ringing in the Future: How Advanced Analytics is Transforming the Telecom industry https://www.tigeranalytics.com/perspectives/blog/advanced-analytics-ai-telecom/ Thu, 23 Dec 2021 11:49:32 +0000 https://www.tigeranalytics.com/?p=5958 Explore how Analytics is helping the Telecom industry uncover growth opportunities for customer acquisition, while simultaneously growing the value of existing customers.

There is rich and abundant data available in the telecom sector, and this data has been especially relevant in the last two years. Bandwidth consumption reached an all-time high amid the global health crisis, as all businesses and educational institutions moved towards a digital workspace model.

However, despite this shift to digital-first, some key challenges have led to a dip in growth in the sector. These challenges include:

  • Intense pricing competition across the sector from both legacy players and newcomers that are offering unique business models.
  • Increasing adoption of services from OTT providers (Ex: WhatsApp for voice calls, Messenger for messaging, etc.).
  • Rising capital expenditure requirements to set up new infrastructure for improved connectivity and 5G services.

In this article, we will discuss the top growth opportunities for the telecom sector in acquiring new customers, while simultaneously growing the value of existing customers:

  • Customer 360-view to enable targeted growth from the existing customer base
  • Customer retention
  • Customer service experience
  • Capitalizing on the growth of the B2B segment

Customer 360-view: Why it matters

Customer 360-view, as the name suggests, is about the all-round picture. It provides a comprehensive understanding of customers by aggregating data from all touchpoints. This data traces the customer’s journey across various departments, all on one central platform.

We can further augment internal data sources with structured and unstructured external data sources, such as business profiles, demographic reports, social media data, etc. This rich external data is usually stored in silos, or, unfortunately, never used.

Companies tend to shy away from adopting the Customer 360-view because of the challenges it presents. A common one is the difference in entity names used across internal systems and third-party data sources. This is where AI-based string matching algorithms have proven helpful in merging multiple disparate sources.
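
As a minimal illustration of the idea (the names and matching choices below are ours, not a client implementation), Python’s standard difflib can fuzzy-match entity names across systems:

```python
import difflib

# Hypothetical entity names from two systems that must be merged.
crm_names = ["Acme Telecom Pte Ltd", "Globex Communications", "Initech Mobile"]
external_names = ["ACME TELECOM PTE. LTD.", "Globex Comms", "Initech Mobile Inc"]

def normalize(name: str) -> str:
    # Lower-case and drop punctuation so superficial differences don't block a match.
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace()).strip()

index = {normalize(n): n for n in crm_names}
for ext in external_names:
    hits = difflib.get_close_matches(normalize(ext), list(index), n=1, cutoff=0.6)
    print(ext, "->", index[hits[0]] if hits else "no confident match")
```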

As the example above suggests, solutions exist for companies struggling to implement the Customer 360-view, and its advantages decidedly outweigh the challenges. Let’s look at some of those advantages:

  • Unified view of customers across all departments — from business development to customer support
  • Scalable data that can be processed faster and at a minimal cost
  • Enabling AI and analytics use cases (not exhaustive), as summarized in the figure: [Figure: AI and analytics use cases in telecom]

  • Accurate external data augmentation, leading to better features, improved accuracy in predictive models, and a deeper understanding of customer behavior.

Customer retention through churn prediction

The cost of customer retention is much lower than the cost of new customer acquisition

Voice, messaging, mobile music, and video services from OTT providers such as WhatsApp, Messenger, Netflix, and Spotify have made data the primary offering for telecom companies.

Customers are spoilt for choice thanks to ongoing price wars and competitively priced, data-heavy plans. With the basic product essentially the same across operators, competition is high and options are plenty. This has led to an increase in customer churn.

Hence, it is crucial for telecom companies to understand the reasons for customer churn and to predict the paths that lead to it.

One way to go about this is via machine learning models that are able to predict customer churn. This can be done using customers’ past transactions, network quality, product pricing, product usage, customer feedback history, complaints log, demographics, and social media data (if any).
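
A minimal sketch of such a churn model, using scikit-learn and hypothetical feature names, might look like the following; a real deployment would engineer features from all the data sources listed above:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

subscribers = pd.read_csv("subscribers.csv")  # hypothetical extract
features = ["monthly_data_gb", "avg_network_quality", "plan_price",
            "complaints_90d", "tenure_months"]
X, y = subscribers[features], subscribers["churned_next_month"]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# class_weight="balanced" helps because churners are usually a small minority.
model = RandomForestClassifier(n_estimators=300, class_weight="balanced",
                               random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```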

Targeting the right customers for retention campaigns is key. Which customers are picked depends directly on the campaign budget, the cost of retaining each customer, and the incremental revenue each customer generates.

This process is especially important because retaining even a small percentage of customers who are about to churn can have a significant revenue impact in the long run.

Customer service transformation

If the products being offered are similar and the competition is high, how does one differentiate between telecom operators? The answer is customer service. 

In this digital-first world, there is an increasing demand for the transformation of customer experience and the adoption of new technology, such as AI-enabled chatbots and dialogue systems.

One common challenge is providing the customer with all the right information regarding the product they are about to purchase. Often, customer service officers handle a range of products, and may not be equipped to handle all the customers’ questions. This increases the time customers spend on hold or in queues, which leads to dissatisfaction.

Here is where AI-enabled intelligent customer service systems can reduce waiting time and help in providing the most relevant solutions or recommendations to customers. This can be done in one or more ways:

  • Forecasting inbound call volumes to optimize short- and long-term staffing and training (a minimal sketch follows this list).
  • Employing virtual assistants to provide fast resolutions to straightforward customer queries and redirect the rest to the appropriate customer care agents.
  • Equipping representatives with a customer 360-view, helping them understand a query and its background without needing extensive input from the customer.
  • Enabling the team with a real-time analytics engine that recommends the right offer/product to an existing customer based on their profile, demographics, and interaction with the agent.
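
For the call-volume forecasting mentioned in the first item, a Holt-Winters model with weekly seasonality is a reasonable first cut. The sketch below assumes a hypothetical, complete daily history; production systems would also handle holidays, promotions, and intraday patterns.

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical history: one row per day with the inbound call count.
calls = (pd.read_csv("daily_calls.csv", parse_dates=["date"], index_col="date")
           ["volume"].asfreq("D"))  # assumes no missing days

# Additive trend plus weekly (7-day) seasonality.
model = ExponentialSmoothing(calls, trend="add", seasonal="add",
                             seasonal_periods=7).fit()
print(model.forecast(14))  # a two-week staffing horizon
```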

The growth of the telecom B2B segment to be driven by digitalization and 5G

The B2B business model enjoys high margins (compared to B2C), with customers willing to pay more for different services. It is characterized by a highly diverse list of products, customized solutions, pricing, and multiple partners. On the downside, this increases the length of the sales cycle.

One common growth use case (apart from common telecom use cases discussed above), specific to the B2B segment, is reducing the sales cycle time by using AI and analytics in product solution and pricing. This leads to a better customer experience, thus increasing customer acquisition.

The following are the main differences in characteristics of the two segments:

[Figure: Differences in characteristics of the B2B and B2C segments]

Historically, most telecom providers have prioritized analytics use cases to capture growth in the B2C segment¹. However, with the advent of digitalization, all businesses are relying on the telecom industry for reliable high-speed 5G data and corporate mobile plans. It is estimated that by 2035, sales amounting to USD 13.2 trillion will be enabled by the 5G ecosystem².

As a result, the next decade will likely see the B2B segment growing much faster than the B2C segment. Concentrating on B2B use cases will help telecom companies grab a bigger share of the growing market.

Benefits of implementing AI and Advanced Analytics (examples)

To really understand how AI and analytics are helping transform this booming sector, let’s look at some real-world examples.

Customer 360: Data governance system for an Asian OTT video service provider

Problem: The client wanted to develop a comprehensive understanding of users’ program viewing behavior to make smarter programming and advertising decisions.

Solution: The solution was to build a data lake to process internal and third-party data from structured and unstructured data sources. Some key challenges included creating a data governance process, handling inconsistencies across multiple sources, and building a flexible system that allows new data sources.

Value delivered: The outcome of the exercise was a data lake that could process 100 GB of data volume daily with varying velocities, ensuring data availability for data analytics projects across multiple themes.

The following are select case studies, executed using Customer 360-view datasets:

[Figure: Select case studies executed using Customer 360-view datasets]

Churn Prediction – User Behavior Prediction Model driving USD 4.5 MM annual revenue

Problem: The client, a telecom giant, wanted to identify customers most likely to churn in their video-on-demand (VOD) business.

Solution: The key challenges were huge data volume, limited metadata on VOD content, constantly changing user base, and limited subscriber demographic information. The solution involved building a random forest-based churn classification model based on the features extracted from past customer RFM behavior, rate of change in purchases month-on-month, demographics, and content metadata.

Value delivered: 73.4% of potential churners were captured by the top 50% of the population flagged by the model, leading to revenue retention of up to USD 4.5 MM per annum.

Customer service transformation case studies

[Figure: Customer service transformation case studies]

Telecom B2B – Pricing system for a leading Asian telecom company

Problem: The client was looking to shorten their B2B product sales cycle, in which producing the initial quotation alone took upwards of four weeks.

Solution: The bottleneck in the process was identified as the involvement of third-party costs and the delay in receiving them. The solution involved building ML models to predict third-party expenses, reducing the waiting time to provide customers with an initial quote.

Value delivered: The business impact was reduced turnaround time for an initial quote from four weeks to a maximum of one day.
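
In the same spirit (the details below are our illustration, not the actual engagement), a regression model over historical quotes can produce a near-instant estimate of third-party costs; file and column names are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

quotes = pd.read_csv("historical_quotes.csv")  # past quotes with final costs
features = ["bandwidth_mbps", "num_sites", "contract_months"]
X, y = quotes[features], quotes["third_party_cost"]

model = GradientBoostingRegressor(random_state=0)
mape = -cross_val_score(model, X, y,
                        scoring="neg_mean_absolute_percentage_error").mean()
print(f"Cross-validated MAPE: {mape:.1%}")

model.fit(X, y)  # then score incoming quote requests for an instant estimate
```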

The future is brighter, smarter, quicker

The applications of AI and predictive analytics in the telecom sector are endless. With digital transformation being the key goal for any company today, combining AI and analytics can not only help in delivering superior performance but also give a company that touch of uniqueness needed to survive in a cut-throat market.

For more information on services and use cases, please get in touch with us at https://www.tigeranalytics.com/
References:
1. https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-b2b-analytics-playbook-capturing-unrealized-potential-in-telcos
2. https://www.qualcomm.com/media/documents/files/ihs-5g-economic-impact-study-2019.pdf
The article was first published in Analytics India Magazine: https://analyticsindiamag.com/advanced-analytics-and-ai-in-telecom-notes-from-tiger-analytics/

The post Ringing in the Future: How Advanced Analytics is Transforming the Telecom industry appeared first on Tiger Analytics.

Data Science Strategies for Effective Process System Maintenance https://www.tigeranalytics.com/perspectives/blog/harness-power-data-science-maintenance-process-systems/ https://www.tigeranalytics.com/perspectives/blog/harness-power-data-science-maintenance-process-systems/#comments Mon, 20 Dec 2021 16:42:57 +0000 https://www.tigeranalytics.com/?p=6846 Industry understanding of managing planned maintenance is fairly mature. This article focuses on how Data Science can impact unplanned maintenance, which demands a differentiated approach to build insight and understanding around the process and subsystems.

Data Science applications are gaining significant traction in the preventive and predictive maintenance of process systems across industries. A clear mindset shift has made it possible to steer maintenance away from a ‘reactive’, run-to-failure approach toward one that is proactive and preventive in nature.

Planned or scheduled maintenance uses data and experiential knowledge to determine the periodicity of servicing required to keep plant components in good health. These schedules are typically driven by plant maintenance teams or OEMs through maintenance rosters and AMCs. Unplanned maintenance, on the other hand, occurs at random and impacts downtime/production, safety, inventory, and customer sentiment, besides adding to the cost of maintenance (including labor and material).

Interestingly, statistics reveal that almost 50% of the scheduled maintenance projects are unnecessary and almost a third of them are improperly carried out. Poor maintenance strategies are known to cost organizations as much as 20% of their production capacity – shaving off the benefits that a move from reactive to preventive maintenance approach would provide. Despite years of expertise available in managing maintenance activities, unplanned downtime impacts almost 82% of businesses at least once every three years. Given the significant impact on production capacity, aggregated annual downtime costs for the manufacturing sector are upwards of $50 billion (WSJ) with average hourly costs of unplanned maintenance in the range of $250K.

It is against this backdrop that data-driven solutions need to be developed and deployed. Can Data Science solutions bring about significant improvement in the maintenance domain and prevent any or all of the above costs? Are the solutions scalable? Do they provide an understanding of what went wrong? Can they provide insights into alternative and improved ways to manage planned maintenance activities? Does Data Science help reduce all types of unplanned events or just a select few? These are questions that manufacturers need answered, and it is for experts from both the maintenance and data science domains to address them.

Industry understanding of managing planned maintenance is fairly mature. This article therefore focuses on unplanned maintenance, which demands a differentiated approach to build insight and understanding around the process and its subsystems.

Data Science solutions are accelerating the industry’s move towards ‘on-demand’ maintenance, wherein interventions are made only if and when required. Rather than follow a fixed maintenance schedule, data science tools can now help plants increase run lengths between maintenance cycles while improving plant safety and reliability. Besides the direct benefits of reduced unplanned downtime and lower maintenance costs, operating equipment at higher levels of efficiency improves the overall economics of operation.

The success of this approach was demonstrated in refinery CDU preheat trains that use soft sensing triggers to decide when to process ‘clean crude’ (to mitigate the fouling impact) or schedule maintenance of fouled exchangers. Other successes were in the deployment of plant-wide maintenance of control valves, multiple-effect evaporators in plugging service, compressors in petrochemical service, and a geo-wide network of HVAC systems.

Instead of using a fixed roster for the maintenance of PID control valves, plants can now detect and diagnose control valves that are malfunctioning. In combination with domain and operations information, this can be used to suggest prescriptive actions, such as auto-tuning of the valves, that improve maintenance and operations metrics.

Reducing unplanned, unavoidable events

It is important to bear in mind that not all unplanned events are avoidable. The inability to avoid events could be either because they are not detectable enough or because they are not actionable. The latter could occur either because the response time available is too low or because the knowledge to revert a system to its normal state does not exist. A large number of unplanned events however are avoidable, and the use of data science tools improves their detection and prevention with greater accuracy.

The focus of the experts working in this domain is to reduce unplanned events and transition events from unavoidable to avoidable. Using advanced tools for detection, diagnosis, and enabling timely actions to be taken, companies have managed to reduce their downtime costs significantly. The diversity of solutions that are available in the maintenance area covers both plant and process subsystems.

Some of the data science techniques deployed in the maintenance domain are briefly described below:

Condition Monitoring
This has been used to monitor and analyze process systems over time, and predict the occurrence of an anomaly. These events or anomalies could have short or long propagation times such as the ones seen in the fouling in exchangers or in the cavitation in pumps. The spectrum of solutions in this area includes real-time/offline modes of analysis, edge/IoT devices and open/closed loop prescriptions, and more. In some cases, monitoring also involves the use of soft sensors to detect fouling, surface roughness, or hardness – these parameters cannot be measured directly using a sensor and therefore, need surrogate measuring techniques.

Perhaps one of the most distinctive challenges of working in the manufacturing domain lies in data reconciliation. Sensor data tend to be spurious and prone to operational fluctuations, drift, biases, and other errors. Using raw sensor information is unlikely to satisfy the material and energy balances for process units. Data reconciliation uses a first-principles understanding of the process systems and assigns a ‘true value’ to each sensor. These revised sensor values allow a more rigorous approach to condition monitoring, which would otherwise expose process systems to greater risk when using raw sensor information. Sensor validation, a technique to analyze individual sensors in tandem with data reconciliation, is critical to setting a strong foundation for any analytics models to be deployed. These elaborate areas of work ensure a greater degree of success when deploying any solution that involves the use of sensor data.
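
To make the idea concrete, the sketch below performs textbook weighted least-squares data reconciliation for a simple splitter whose three flow readings violate the mass balance; the numbers and variances are invented for illustration:

```python
import numpy as np

# Hypothetical splitter: stream 1 feeds streams 2 and 3, so x1 - x2 - x3 = 0.
A = np.array([[1.0, -1.0, -1.0]])       # linear balance constraint: A @ x = 0
m = np.array([100.0, 60.0, 45.0])       # raw readings (imbalance of -5 units)
sigma = np.diag([4.0, 1.0, 1.0])        # measurement variances (trust levels)

# Closed-form projection: minimize (x - m)' inv(sigma) (x - m) s.t. A @ x = 0.
adjust = sigma @ A.T @ np.linalg.inv(A @ sigma @ A.T) @ (A @ m)
x_hat = m - adjust
print(x_hat, "residual imbalance:", (A @ x_hat).item())
```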

Fault Detection
This is a mature area of work, with solutions ranging from those driven entirely by domain knowledge, such as pump curves and the detection of anomalies against them, to those that rely only on historical sensor/maintenance/operations data for analysis. An anomaly or fault is defined as a deviation from ‘acceptable’ operation, but the context and definitions need to be clearly understood when working with different clients. Faults may be related to equipment, quality, plant systems, or operability. A good business context and understanding of client requirements are necessary to design and deploy the right techniques. From basic tools such as sensor thresholds and run charts to more advanced techniques such as classification, pattern analysis, and regression, a wide range of solutions can be deployed successfully.

Early Warning Systems
The detection of process anomalies in advance helps in the proactive management of abnormal events. Improving actionability or response time allows faults to be addressed before setpoints/interlocks are triggered. The methodology varies across projects and there is no ‘one-size-fits-all’ approach. Problem complexity could range from using single sensor information as lead indicators (such as using sustained pressure loss in a vessel to identify a faulty gasket that might rupture) to far more complex methods of analysis.

A typical challenge in developing early warning systems is detecting 100% of anomalies; an even larger one is filtering out false indications. Both complete detection and robust filtering are critical factors for successful deployment.
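
As one minimal illustration of both concerns, an EWMA control chart is a standard way to surface slow drift early while damping the one-off spikes that cause false alerts; the file, column, and threshold choices below are assumptions, not a client recipe:

```python
import pandas as pd

# Hypothetical sensor history: timestamped vessel pressure readings.
pressure = pd.read_csv("vessel_pressure.csv",
                       parse_dates=["ts"], index_col="ts")["psi"]

baseline = pressure.iloc[:500]            # a known-good operating window
mu, sd = baseline.mean(), baseline.std()

lam = 0.1                                 # smoothing factor: lower = smoother
ewma = pressure.ewm(alpha=lam).mean()

# Asymptotic EWMA control limits: sigma_ewma = sd * sqrt(lam / (2 - lam)).
limit = 3 * sd * (lam / (2 - lam)) ** 0.5
alerts = ewma[(ewma - mu).abs() > limit]
print(alerts.head())                      # earliest sustained-drift warnings
```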

Enhanced Insights for Fault Identification
The importance of detection and response time in the prevention of an event cannot be overstated. But what if an incident is not easy to detect, or the propagation of the fault is too rapid to allow any time for action? The first level involves using machine-driven solutions for detection, such as computer vision models, which are rapidly changing the landscape. Using these models, it is now possible to improve prediction accuracy for processes that were previously unmonitored or monitored manually. The second is to integrate the combined expertise of personnel from various job functions such as technologists, operators, maintenance engineers, and supervisors. At this level of maturity, the solution is able to baseline against the best that current operations aim to achieve. The third, and by far the most complex, is to move more faults into the detectable and actionable realm. One such case was witnessed in a complex process in the metal smelting industry. Advanced data science techniques using a digital twin amplified signal responses and analyzed multiple process parameters to predict the occurrence of an incident ahead of time. By gaining an order-of-magnitude improvement in response time, it was possible to move the process fault from an unavoidable to an avoidable and actionable category.

With the context provided above, it is possible to choose a modeling approach and customize the solutions to suit the problem landscape:

[Figure: Data analytics approaches in process system maintenance]

Different approaches to Data Analytics

Domain-driven solution
First-principles and the rule-based approach is an example of a domain-driven solution. Traditional ways of delivering solutions for manufacturing often involve computationally intensive solutions (such as process simulation, modeling, and optimization). In one of the difficult-to-model plants, deployment was done using rule engines that allow domain knowledge and experience to determine patterns and cause-effect relationships. Alarms were triggered and advisories/recommendations were sent to the concerned stakeholders regarding what specific actions to undertake each time the model identified an impending event.
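
A rule engine of this kind can be conceptually very simple: a list of condition-advisory pairs evaluated against the latest readings. The toy sketch below invents sensor names and thresholds purely for illustration:

```python
# Each rule pairs a condition on a sensor snapshot with an operator advisory.
RULES = [
    (lambda s: s["discharge_temp"] > 95 and s["vibration"] > 4.0,
     "Possible bearing wear: schedule an inspection within 24 hours."),
    (lambda s: s["suction_pressure"] < 0.4,
     "Cavitation risk: check the upstream strainer and valve lineup."),
]

def advisories(snapshot: dict) -> list[str]:
    """Return every advisory whose condition fires on this snapshot."""
    return [msg for cond, msg in RULES if cond(snapshot)]

print(advisories({"discharge_temp": 97.2, "vibration": 4.3,
                  "suction_pressure": 0.9}))
```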

Domain-driven approaches also come in handy in the case of ‘cold start’ where solutions need to be deployed with little or no data availability. In some deployments in the mechanical domain, the first-principles approach helped identify >85% of the process faults even at the start of operations.

Pure data-driven solutions
A recent trend in the process industry is the move away from domain-driven solutions due to challenges in finding the right skills to deploy them, computational infrastructure requirements, the need for customized maintenance solutions, and the requirement to provide real-time recommendations. Complex systems such as naphtha cracking and alumina smelting, which are hard to model, have harnessed the power of data science not just to diagnose process faults but also to enhance response time and bring more finesse to the solutions.

In some cases, these data-driven tools have provided high levels of accuracy in analyzing faults. One such case involved compressor faults, where domain data was used to classify them as stemming from a loose bearing, a defective blade, or polymer deposits in the turbine subsystems. Each of these faults was identified using the sensor signatures and patterns associated with it. Besides getting to the root cause, this also helped prescribe actions to move the compressor system away from anomalous operation.

These solutions require that the operating envelope and available data cover all possible scenarios. The poor success of deployments using this approach is largely due to insufficient data covering plant operations and maintenance. Nevertheless, the number of players offering purely data-driven solutions is large and is quickly replacing what was traditionally part of a domain engineer’s playbook.

Blended solutions
Blended solutions for the maintenance of process systems combine the understanding of both data science and domain. One such project was in the real-time monitoring and preventive maintenance of >1200 HVAC units across a large geographic area. The domain rules were used to detect and diagnose faults and also identify operating scenarios to improve the reliability of the solutions. A good understanding of the domain helps in isolating multiple anomalies, reducing false positives, suggesting the right prescriptions, and more importantly, in the interpretability of the data-driven solutions.

The differentiation comes from combined intelligence: AI/ML models, domain knowledge, and lessons from successful deployments are integrated into the model framework.

Customizing the toolkit and determining the appropriate modeling approach are critical to delivery. The uniqueness of each plant and problem, and the requirement for a high degree of customization, make deploying solutions in a manufacturing environment fairly challenging. This fact is validated by the limited number of solution providers serving this space. However, the complexity and nature of the landscape need to be well understood by both the client and the service provider. It is important to note that not all problems in the maintenance space are ‘big data’ problems requiring analysis in real time using high-frequency data. Some faults with long propagation times can use values averaged over a period of time, while other systems with short response-time requirements may need real-time data. Where maintenance logs and annotations related to each event (and corrective action) are recorded, one could go with a supervised learning approach, but this is not always possible. In cases where data on faults and anomalies is not available, a one-class approach that classifies operation into normal/abnormal modes has also been used. Solution maturity improves as more data and failure modes are identified over time.
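
For the one-class setting mentioned above, scikit-learn’s IsolationForest is one commonly used option: it learns the envelope of normal operation without any labeled faults and flags departures from it. The sketch below runs on synthetic data:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic 'normal operation': two process variables around a steady state.
rng = np.random.default_rng(0)
normal_ops = rng.normal(loc=[50.0, 1.2], scale=[2.0, 0.05], size=(1000, 2))

# Train on normal data only; no fault labels are required.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_ops)

new_readings = np.array([[50.5, 1.21],   # typical operating point
                         [62.0, 0.80]])  # well outside the envelope
print(detector.predict(new_readings))    # 1 = normal, -1 = anomalous
```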

A staged solution approach helps in bringing in the right level of complexity to deliver solutions that evolve over time. Needless to say, it takes a lot of experience and prowess to marry the generalized understanding with the customization that each solution demands.

Edge/IoT

A fair amount of investment needs to be made at the beginning of the project to understand the hardware and solution architecture required for successful deployment. While the security of data is a primary consideration, other factors such as computational power, cost, time, response time, open/closed-loop architecture are added considerations in determining the solution framework. Experience and knowledge help understand additional sensing requirements and sensor placement, performance enhancement through edge/cloud-based solutions, data privacy, synchronicity with other process systems, and much more.

By far the largest challenge is on the data front (sparse, scattered, unclean, disorganized, unstructured, not digitized, and so on), which prevents businesses from seeing quick success. Digitization and creating data repositories, which set the foundation for model development, take a lot of time.

There is also a multitude of control systems, specialized infrastructure, legacy systems within the same manufacturing complex that one may need to work through. End-to-end delivery with the front-end complexity in data management creates a significant entry barrier for service providers in the maintenance space.

Maintenance cuts across multiple layers of a process system. The maintenance solutions vary as one moves from a sensor to a control loop, equipment with multiple control valves all the way to a flowsheet/enterprise layer. Maintenance across these layers requires a deep understanding of both the hardware as well as process aspects, a combination that is often hard to put together. Sensors and control valves are typically maintained by those with an Instrumentation background, while equipment maintenance could fall in a mechanical or chemical engineer’s domain. On the other hand, process anomalies that could have a plant-level impact are often in the domain of operations/technology experts or process engineers.

Data Science facilitates the development of insights and generalizations required to build understanding around a complex topic like maintenance. It helps in the generalization and translation of learnings across layers within the process systems from sensors all the way to enterprise and other industry domains as well. It is a matter of time before analytics-driven solutions that help maintain safe and reliable operations become an integral part of plant operations and maintenance systems. We need to aim towards the successes that we witness in the medical diagnostics domain where intelligent machines are capable of detecting and diagnosing anomalies. We hope that similar analytics solutions will go a long way to keep plants safe, reduce downtime and provide the best of operations efficiencies that a sustainable world demands.

Today, the barriers to success lie in developing a clear understanding of the problem landscape, planning end-to-end, and delivering customized solutions that take business priorities and ROI into account. Achieving success at scale will demand reducing the level of customization required in each deployment, a constraint that only a few subject matter experts in the area overcome today.

The post Data Science Strategies for Effective Process System Maintenance appeared first on Tiger Analytics.
