Artificial Intelligence Archives - Tiger Analytics

A Pharma Leader’s Guide to Driving Effective Drug Launches with Commercial Analytics
https://www.tigeranalytics.com/perspectives/blog/a-pharma-leaders-guide-to-driving-effective-drug-launches-with-commercial-analytics/
Wed, 10 Jan 2024

Learn how pharma leaders can leverage Tiger Analytics’ Commercial Analytics engine to successfully launch new drugs in the market through enhanced data-driven insights and decision-making.

For a Pharmaceutical company, launching a drug represents the culmination of extensive research and development efforts. Across the typical stages of a drug launch – planning the launch, the launch itself, and post-launch drug lifecycle management – Data Analytics can guide pharmaceutical companies with data-driven insights and strategic analysis. Why does this matter? According to research, for 85% of pharmaceutical launches, the product trajectory is set in the first six months.

Real-time analytics enables informed decision-making, enhanced patient outcomes, and creates a competitive edge for the drug in the ever-evolving Healthcare industry. A data-driven approach across the drug lifecycle ensures that the drug launch is not just a milestone, but a stepping stone towards improved healthcare and a brighter future.

5 Benefits of a Data-Driven Drug Launch

How can Pharma leaders benefit from a data-driven launch? We’ve put together a few of our observations here:

1. Precise Patient Targeting
Begin by identifying the most promising patient segments through comprehensive data analysis. By integrating electronic health records, prescription data, and demographic information, you can pinpoint the specific patient populations that will benefit most from your drug. Tailor your messaging and outreach to address their unique needs and preferences.
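
To make the idea concrete, here is a minimal sketch of what such patient segmentation might look like in Python. The input file and column names are hypothetical placeholders for an integrated EHR/claims/demographics extract, not an actual production pipeline.

```python
# Minimal patient-segmentation sketch: cluster patients on claims and
# demographic features, then profile each segment. Column names are
# hypothetical placeholders for an integrated EHR/claims extract.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

patients = pd.read_csv("patient_features.csv")  # hypothetical extract
features = ["age", "annual_rx_count", "comorbidity_score", "visits_per_year"]

X = StandardScaler().fit_transform(patients[features])
patients["segment"] = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(X)

# Profile segments to decide which ones the launch messaging should target.
print(patients.groupby("segment")[features].mean())
```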

2. Segmented Marketing Strategies
Develop personalized marketing strategies for each identified patient segment. Utilize commercial analytics to understand the distinct characteristics of these segments and create tailored campaigns that resonate with their concerns. This approach enhances engagement and encourages a deeper connection between patients and your product.

3. Tactical Pricing Optimization
Determine the optimal pricing strategy for your drug by analyzing market dynamics, competitor pricing, and patient affordability. Commercial analytics helps strike the right balance between maximizing revenue and ensuring accessibility. Data-driven pricing decisions also enhance negotiations with payers and reimbursement discussions.

4. Multi-channel Engagement
Leverage commercial analytics to identify the most effective communication channels for reaching healthcare professionals and patients. Analyze historical prescription patterns and physician preferences to allocate resources to the channels that yield the highest impact. This approach ensures that your message reaches the right stakeholders at the right time.

5. Continuous Performance Monitoring
The launch doesn’t end on launch day — it’s a continuous process. Utilize real-time data analytics to monitor your drug’s performance in the market. Track metrics such as prescription volume, market share, and patient feedback. This information helps you adapt your strategies as needed and capitalize on emerging opportunities.

Enabling a 360-Degree View of Pharma Drug Launch with Commercial Analytics

At Tiger Analytics, we developed a Data Analytics solution, tailored to meet the specific requirements of our clients in the Pharmaceutical industry. Our Commercial Analytics engine powers a suite of data-driven analytical interventions throughout the lifecycle of a drug. It serves as a bridge between goals and actionable insights, effectively transforming raw data into strategic decisions. The solution supports pre-launch patient segmentation and provider engagement. It also aids in launch-stage payer analytics and pharmacy optimization. Lastly, it enables post-launch patient journey analysis and outcomes assessment – giving Pharma leaders a 360-degree view of the entire launch cycle.

Here’s how it works:

Pre-Launch: Setting the Stage for Success

In this stage, the goal is to lay a strong foundation for success by developing the value proposition of the drug. Clinical teams, data strategists, and market researchers collaborate to assess the drug’s commercial potential and create a strategy to realize it. To begin, comprehensive surveys and market research are conducted to gain insights into healthcare professional (HCP) behavior, competitor analysis, patient profiles, packaging analysis, price comparison, and sales benchmarks. These analyses shape the roadmap for the drug’s performance and enable the exploration of various scenarios through forecasting exercises. Patient profiling and segmentation strategies are also implemented to devise effective marketing and engagement strategies.

From Action to Impact

To drive tangible results, at Tiger Analytics we orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Conducted a comprehensive analysis of analog drugs in the market and performed market scoping along with other forecasting exercises to understand the potential impact of the new drug once launched.
  • Analyzed survey results and developed a tool to assess the possible effectiveness of the drug in real-world scenarios.
  • Formulated multiple scenario analyses to account for unprecedented events and their potential impact on the demand for the drug.

How did the solutions help?

  • Provided a clear view of the expected market landscape through market sizing.
  • Prepared the pharma company for unknown events through scenario analysis.
  • Facilitated target adjustment and improved planning by forecasting numbers.

Launch: Strategic Payer Engagement in a Complex Landscape

During the drug launch, the focus shifts to accelerating drug adoption and reducing the time it takes to reach peak sales. At this juncture, analytics plays a crucial role in optimizing market access and stakeholder engagement (payers, prescribers, and patients). By analyzing payer data, claims information, and reimbursement policies, pharmaceutical companies gain insights for strategic decision-making, including formulary inclusion, pricing strategies, and reimbursement trends. These insights enable effective negotiations with payers, ensuring optimal coverage and patient access to the medication.

Monitoring sales and identifying early and late adopters among HCPs and patients enables targeted marketing activities and tailored promotional initiatives. This approach effectively propels successful market penetration.

From Action to Impact

To drive tangible results, we, at Tiger Analytics, orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Implemented a robust email marketing campaign, targeting the identified early adopter HCPs.
  • Monitored HCP engagement and response to emails using advanced analytics and tracking tools.
  • Leveraged predictive models to conduct real-time analysis of promotional activities, optimizing their effectiveness and making data-driven adjustments.

How did the solutions help?

  • Achieved a 15% increase in HCP engagement and response rates.
  • Real-time analysis of promotional activities led to a 10% improvement in their effectiveness.

Post-Launch: Empowering Patient-Centric Care

Post-launch analytics focuses on monitoring the market and adapting to market dynamics (competition, regulations, reimbursements, etc.) to extend the drug’s lifecycle. Advanced analytics also enables understanding a patient’s journey and optimizing the person’s medication adherence. By leveraging real-world data, electronic health records, and patient-reported outcomes, pharmaceutical companies gain invaluable insights into patient behavior, adherence rates, and treatment patterns. These insights facilitate the development of personalized interventions, patient support programs, and targeted educational campaigns to enhance patient adherence and improve treatment outcomes. Additionally, continuous tracking of the medication’s performance, market share, and patient-reported outcomes enables pharmaceutical companies to make data-driven decisions, generate evidence for stakeholders, and drive ongoing improvements in patient care.

From Action to Impact

To drive tangible results, we, at Tiger Analytics, orchestrated a series of targeted initiatives with specific outcomes:

What did we do?

  • Utilized real-world data and electronic health records to track patient behavior and medication adherence.
  • Conducted in-depth analysis of patient-reported outcomes to gain insights into treatment patterns and efficacy.
  • Developed personalized interventions and patient support programs, based on the identified patterns and behaviors.

How did the solutions help?

  • Improved medication adherence by 25%.
  • Achieved a 30% increase in patient satisfaction and treatment compliance.

For Pharmaceutical companies, the goal of a successful drug launch is not only about accelerating the medicine’s time to market; it is also about ensuring patient awareness of, and access to, life-saving drugs. By leveraging the power of data to fuel AI-enabled drug launches, we’ll continue to see better medication adherence, more satisfied patients, and better compliance with treatments, all of which will ultimately lead to better health outcomes.

Why India-Targeted AI Matters: Exploring Opportunities and Challenges
https://www.tigeranalytics.com/perspectives/blog/need-india-centric-ai/
Wed, 11 May 2022

The scope for AI-focused innovation is tremendous, given India’s status as one of the fastest-growing economies with the second-largest population globally. Explore the challenges and opportunities for AI in India.

To understand the likely impact of India-centric AI, one needs to appreciate the country’s linguistic, cultural, and political diversity. Historically, India’s DNA has been so heterogeneous that extracting clear perspectives and actionable insights to address past issues and current challenges, and to move towards our vision as a country, would be impossible without harnessing the power of AI.

The scope for AI-focused innovation is tremendous, given India’s status as one of the fastest-growing economies with the second-largest population globally. India’s digitization journey and the introduction of the Aadhaar system in 2010 – the largest biometric identity project in the world – have opened up new avenues for AI and data analytics. The interlinking of Aadhaar with banking systems, the PDS, and several other transaction systems allows greater visibility, insights, and metrics that can be used to bring about improvements. Besides using these to raise the quality of life of citizens while alleviating disparities, AI can support more proactive planning and formulation of policies and roadmaps. Industry experts concur that AI could trigger an economic growth spurt, opining that “AI can help create almost 20 million jobs in India by 2025 and add up to $957 billion to the Indian economy by 2035.”

The current state of AI in India

The Indian government, having recently announced the “AI for All” strategy, is more driven than ever to nurture core AI skills to future-proof the workforce. This self-learning program looks to raise awareness levels about AI for every Indian citizen, be it a school student or a senior citizen. It targets meeting the demands of a rapidly emerging job market and presenting opportunities to reimagine how industries like farming, healthcare, banking, education, etc., can use technology. A few years prior, in 2018, the government had also increased its funding towards research, training, and skilling in emerging technologies by 100% as compared to 2017.

The booming interest is reflected in the mushrooming of boutique start-ups across the country as well. With a combined value of $555 million, this market is more than double the previous year’s figure of $215 million. Interestingly, analytics-driven products and services contribute a little over 64% of this market, clocking over $355 million. In parallel, larger enterprises are taking quantum leaps to deliver AI solutions too. Understandably, a large number of them use AI solutions to improve efficiency, scalability, and security across their existing products and services.

Current challenges of making India-centric AI

There is no doubt that AI is a catalyst for societal progress through digital inclusion. And in a country as diverse as India, this can set the country on an accelerated journey toward socio-economic progress. However, the social, linguistic, and political diversity that is India also means more complex data models are needed before they can be gainfully deployed within this landscape. For example, NLP models would have to adapt to text/language changes within just a span of a few miles! And this is just the tip of the iceberg as far as the challenges are concerned.

Let’s look at a few of them:

  • The deployment and usage of AI has been (and continues to be) severely fragmented, without a transparent roadmap or clear KPIs to measure success. One of the reasons is the lack of a governing body or a panel of experts to regulate, oversee, and track the implementation of socio-economic AI projects at a national level. But there’s no avoiding this challenge, considering that the implications of AI policy-making on Indian societies may be irreversible.
  • The demand-supply divide in India for AI skills is huge. Government initiatives such as Startup India, as well as the boom in AI-focused startups, have only widened this divide. The pace of getting a trained workforce to cater to the needs of the industry is accelerating but unable to keep up with the growth trajectory the industry finds itself in. Large, traditionally run institutions are also embracing AI-driven practices, having witnessed the competitive advantage it brings to their businesses. This has added to the scarcity of good-quality talent to serve today’s demand.
  • The lack of data maturity is a serious roadblock on the path to establishing India-centric AI initiatives – especially with quite a few region-focused datasets being currently unavailable. There is also a parity issue, with quite a few industry giants having access to large amounts of data compared to the government, let alone start-ups. Added to this is the challenge of data quality and of a single source of truth that one can use for AI model development.
  • Even the fiercest AI advocates would admit that AI’s security challenges are nowhere close to being resolved. Security and compliance governance protocols need to be region-specific so that unique requirements are met, yet retain the generalizability required to rationalize these models at the national level.
  • There is also a lot of ongoing debate at a global level on defining the boundaries that ethical AI practices will need to lean on. Given India’s diversity, this is a challenge that is magnified many times over.

Niche areas where AI is making an impact

Farming

The role of AI in modern agricultural practices has been transformational – this is significant given that more than half the population of India depends on farming to earn a living. In 2019-2020 alone, over $1 billion was raised to fuel agriculture-food tech start-ups in India. It has helped farmers generate steadier income by managing healthier crops, reducing the damage caused by pests, tracking soil and crop conditions, improving the supply chain, eliminating unsafe or repetitive manual labor, and more.

Healthcare

Indian healthcare systems come with their own set of challenges – from accessibility and availability to quality and poor awareness levels. But each one represents a window of opportunity for AI to be a harbinger of change. For instance, AI-enabled platforms can extend healthcare services to low-income or rural areas, train doctors and nurses, address communication gaps between patients and clinicians, etc. Government-funded projects like NITI Aayog and the National Digital Health Blueprint have also highlighted the need for digital transformation in the healthcare system.

BFSI

The pandemic has accelerated the impact of AI on the BFSI industry in India, with several key processes undergoing digital transformation. The mandatory push for contactless remote banking experience has infused a new culture of innovation in mission-critical back-end and front-end operations. A recent PwC-FICCI survey showed that the banking industry has the country’s highest AI maturity index – leading to the deployment of the top AI use cases. The survey also predicted that Indian banks would see “potential cost savings up to $447 billion by 2023.”

E-commerce

The Indian e-commerce industry has already witnessed big numbers thanks to AI-based strategies, particularly in marketing. For retail brands, capturing market share in India is among the toughest challenges worldwide, with customer behavior driven by a diverse set of values and expectations. By using AI and ML technologies – backed by data science – it becomes easier to tap into multiple demographics without losing the context of messaging.

Manufacturing

Traditionally, the manufacturing industry has been running with expensive and time-consuming manually driven processes. Slowly, more companies realize the impact of AI-powered automation on manufacturing use cases like assembly line production, inventory management, testing and quality assurance, etc. While still at a nascent stage, AR and VR technologies are also seeing adoption in this sector in use cases like prototyping and troubleshooting.

3 crucial data milestones to achieve in India’s AI journey

1) Unbiased data distribution

Forming India-centric datasets starts with a unified framework across the country so that no region is left uncovered. This framework needs to integrate with other systems/data repositories in a secure and seamless manner. Even private companies can share relevant datasets with government institutions to facilitate strategy and policy-making.

2) Localized data ownership

In today’s high-risk data landscape, transferring ownership of India-centric information to companies in other countries can lead to compliance and regulatory problems. Especially when dealing with industries like healthcare or public administration, it is highly advisable to maintain data control within the country’s borders.

3) Data ethics and privacy

Data-centric solutions that work towards improving human lives require a thorough understanding of personal and non-personal data, matters of privacy, and infringement, among others. Managing this information responsibly takes the challenge beyond the realm of deploying a mathematical solution. Building an AI mindset that raises difficult questions about ethics, policy, and law, and that ensures sustainable solutions with minimized risks and negative impact, is key. Data privacy, too, should continue to be a hot-button topic, with an uncompromising stance on safeguarding the personal information of Indian citizens.

Final thoughts

India faces a catch-22 situation: one side of the country still holds on to its age-old traditions and practices, while the other embraces technological change, be it UPI transfers, QR codes, or even the Aarogya Setu app. The sheer size and diversity of languages, cultures, and politics dictate that AI will find no shortage of areas in which to make a profound impact, nor any shortage of challenges in implementing it.

As mentioned earlier, the thriving startup growth adds a lot of fuel to AI’s momentum. From just 10 unicorns in India in 2018, we have grown to 38, and this number is expected to increase to 62 by 2025. In 2020, AI-based Indian startups received over $835 million in funding and are propelling growth few countries can compete with. AI is a key vehicle to ring in the dawn of a new era: an India which, despite its diversity and complex landscape, leads the way in the effective adoption of AI.

This article was first published in Analytics India Magazine.

Data-Driven Disruption? How Analytics is Shifting Gears in the Auto Market
https://www.tigeranalytics.com/perspectives/blog/data-analytics-led-disruption-boon-automotive-market/
Thu, 24 Mar 2022

The presence of legacy systems, regulatory compliance issues, and the sudden growth of the BEV/PHEV market are all challenges the automotive industry must face. Explore how Analytics can help future-proof their growth plans.

In an age when data dictates decision-making, from cubicles to boardrooms, many auto dealers worldwide continue to draw insights from past experiences. However, the automotive market is ripe with opportunities to leverage data science to improve operational efficiency, workforce productivity, and consequently – customer loyalty.

Data challenges faced by automotive dealers

There are many reasons why auto dealers still struggle to collect and use data. The biggest one is the presence of legacy systems that bring entangled processes with disparate data touchpoints. This makes it difficult to consolidate information and extract clean, structured data – especially when there are multiple repositories. More importantly, dealers are unable to derive and harness actionable insights to improve their decision-making capabilities, relying instead on gut instinct.

In addition, the sudden growth of the BEV/PHEV market has proven to complicate matters – with increasing pressure on regulatory compliance.

But the reality is that future-ready data management is a must-have strategy – not just to thrive but even to survive today’s automotive market. The OEMs are applying market pressure on one side of the spectrum – expecting more cost-effective vehicle pricing models to establish footprints in smaller or hyper-competitive markets. On the other side, modern customers are making it abundantly clear that they will no longer tolerate broken, inefficient, or repetitive experiences. And if you have brands operating in different parts of the world, data management can be a nightmarishly time-consuming and complex journey.

Future-proofing the data management strategy

Now, it’s easier said than done for the automotive players to go all-in on adopting a company-wide data mindset. It is pertinent to create an incremental data-driven approach to digital transformation that looks to modernize in phases. Walking away from legacy systems with entangled databases means that you must be assured of hassle-free deployment and scalability. It can greatly help to prioritize which markets/OEMs/geographies you want to target first, with data science by your side.

Hence, the initial step is to assess the current gaps and challenges to have a clear picture of what needs to be fixed on priority and where to go from thereon. Another key step in the early phase should be to bring in the right skill sets to build a future-proofed infrastructure and start streamlining the overall flow of data.

It is also important to establish a CoE model to globalize data management from day zero. In the process, a scalable data pipeline should be built to consolidate information from all touchpoints across all markets and geographies. This is a practical way to ensure that you have an integrated source of truth that churns out actionable insights based on clean data.

You also need to create a roadmap so that key use cases can be detected with specific markets identified for initial deployment. But first, you must be aware of the measurable benefits that can be unlocked by tapping into the power of data.

  • Better lead scoring: Identify the leads most likely to purchase a vehicle and ensure targeted messaging.
  • Smarter churn prediction: Identify aftersales customers with high churn propensity and send tactical offers.
  • Accurate demand forecasting: Reduce inventory days, avoid out-of-stock items, and minimize promotional costs.
  • After-sales engagement: Engage customers even after the initial servicing warranty is over regarding repairs, upgrades, etc., along with an effective parts pricing strategy.
  • Sales promo assessment: Analyze historical sales data, seasonality/trends, competitors, etc., to recommend the best-fit promo.
  • Personalized customer engagement: Customize interactions with customers based on data-rich actionable intelligence instead of unreliable human instincts.

How we helped Inchcape disrupt the automotive industry

When Tiger Analytics began the journey with Inchcape, a leading global automotive distributor, we knew that it was going to disrupt how the industry tapped into data. Fast-forward to a year later, we were thrilled to recently take home Microsoft’s ‘Partner of the Year 2021’ award in the Data & AI category. What started as a small-scale project grew into one of the largest APAC-based AI and Advanced Analytics projects. We believe that this project has been a milestone moment for the automotive industry at large. If you’re interested in finding out how our approach raised the bar in a market notorious for low data adoption, please read our full case study.

When Opportunity Calls: Unlocking the Power of Analytics in the BPO Industry
https://www.tigeranalytics.com/perspectives/blog/role-data-analytics-bpo-industry/
Thu, 27 Jan 2022

The BPO industry has embraced analytics to optimize profitability, efficiency, and customer satisfaction. This blog delves into the specifics of data utilization, unique challenges, and key business areas where analytics can make a difference.

Around 1981, the term ‘outsourcing’ entered our lexicon. Two decades later, we had the BPO boom in India, China, and the Philippines, with every street corner magically sprouting call centers. Now, in 2022, the industry is transitioning into an era of analytics, aiming to harness its sea of data for profitability, efficiency, and improved customer experience.

In this blog, we delve into details of what this data is, the unique challenges it poses, and the key business areas that can benefit from the use of analytics. We also share our experiences in developing these tools and how they have helped our clients in the BPO industry.

The Information Ocean

The interaction between BPO agents and customers generates huge volumes of both structured and unstructured (text, audio) data. On the one hand, you have the call data that measures metrics such as the number of incoming calls, time taken to address issues, service levels, and the ratio of handled vs abandoned calls. On the other hand, you have customer data measuring satisfaction levels and sentiment.

Insights from this data can help deliver significant value for your business, whether it’s around improved call resolution, reduced call time & volume, agent & customer satisfaction, operational cost reduction, growth opportunities through cross-selling & upselling, or increased customer delight.

The trick is to find the balance between demand (customer calls) and supply (agents). An imbalance can often lead to revenue losses and cost inefficiencies, and this is a dynamic that needs to be facilitated by processes and technology.

Challenges of Handling Data

When handling such sheer volumes of data, the challenges can be myriad. Our clients wage a daily battle with managing these vast volumes, harmonizing internal and external data, and driving value through them. For those that have already embarked on their analytical journey, the primary goals are validating the relevance of what they have built, driving scalability, and leveraging new-age predictive tools to drive ROI.

Delivering Business Value

Based on our experience, the business value delivered by advanced analytics in the BPO industry is unquestionable and far-reaching, and it primarily influences these key aspects:

1) Call Management

Planning agent resources based on demand (peak and off-peak) and skill sets, while accounting for how long agents take to resolve issues, can significantly impact business costs. AI can help automate this process to optimize costs. We have built an automated, real-time scheduling and resource optimization tool that has led one of our BPO clients to a cost reduction of 15%.
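
As an illustration of the staffing side of this problem, the sketch below uses the classic Erlang-C queueing formula to find the smallest agent count that meets a service-level target. It is a textbook approximation with assumed inputs, not the scheduling tool described above.

```python
# Erlang-C staffing sketch: smallest agent count that meets a service-level
# target for a given call arrival rate and average handle time.
import math

def erlang_c(agents: int, traffic: float) -> float:
    """Probability an arriving call has to wait (M/M/c queue)."""
    if agents <= traffic:
        return 1.0  # unstable queue: every call waits
    top = traffic ** agents / math.factorial(agents) * agents / (agents - traffic)
    bottom = sum(traffic ** k / math.factorial(k) for k in range(agents)) + top
    return top / bottom

def agents_needed(calls_per_hour: float, aht_sec: float,
                  target_sl: float = 0.8, answer_within_sec: float = 20) -> int:
    traffic = calls_per_hour * aht_sec / 3600  # offered load in Erlangs
    n = max(1, math.ceil(traffic))
    while True:
        p_wait = erlang_c(n, traffic)
        service_level = 1 - p_wait * math.exp(-(n - traffic) * answer_within_sec / aht_sec)
        if service_level >= target_sl:
            return n
        n += 1

print(agents_needed(calls_per_hour=120, aht_sec=300))  # i.e., 10 Erlangs of load
```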

2) Customer Experience

Call center analytics give agents access to critical data and insights to work faster and smarter, improve customer relationships and drive growth. Analytics can help understand the past behavior of a customer/similar customers and recommend products or services that will be most relevant, instead of generic offers. It can also predict which customers are likely to need proactive management. Our real-time cross-selling analytics has led to a 20% increase in revenue.

3) Issue Resolution

First-call resolution refers to the percentage of cases that are resolved during the first call between the customer and the call center. Analytics can help automate the categorization of contact center data by building a predictive model. This enables a better customer servicing model by appropriately capturing the nuances of customer chats with contact centers. This metric is extremely important as it helps in reducing the customer churn rate.
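
A minimal sketch of such a categorization model follows, assuming labeled interaction notes are available; the example notes and categories here are purely illustrative.

```python
# Sketch of the contact-note categorization idea: a TF-IDF + logistic
# regression classifier that tags each interaction with an issue category.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "customer cannot log in after password reset",
    "billed twice for the same month",
    "requesting plan upgrade details",
    "refund not credited after cancellation",
]
labels = ["access", "billing", "sales", "billing"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(notes, labels)

print(clf.predict(["charged incorrectly on my invoice"]))  # -> ['billing']
```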

4) Agent Performance

Analytics on call-center agents can assist in segmenting those who had a low-resolution rate or were spending too much time on minor issues, compared with top-performing agents. This helps the call center resolve gaps or systemic issues, identify agents with leadership potential, and create a developmental plan to reduce attrition and increase productivity.

5) Call Routing

Analytics-based call routing is based on the premise that records of a customer’s call history or demographic profile can provide insight into which call center agent(s) has the right personality, conversational style, or combination of other soft skills to best meet their needs.

6) Speech Analytics

Detecting trends in customer interactions and analyzing audio patterns to read emotions and stress in a speaker’s voice can help reduce customer churn, boost contact center productivity, improve agent performance, and reduce costs by 25%. Our tools have helped clients predict member dissatisfaction, achieving a 10% reduction in first complaints and a 20% reduction in repeat complaints.

7) Chatbots and Automation

Thanks to the wonders of automation, we can now enhance the user experience and provide personalized attention to customers 24/7/365. Reduced average call duration and wage costs improve profitability. Self-service channels such as the help center, FAQ page, and customer portals empower customers to resolve simple issues on their own while deflecting more cases away from agents. Our AI-enabled chatbots have helped strengthen engagement and deliver quicker resolutions for 80% of user queries.

Lessons from The Philippines

Recently, in collaboration with Microsoft, we conducted a six-week Data & Analytics Assessment for a technology-enabled outsourcing firm in the Philippines. The client was encumbered by complex ETL processes, resource bottlenecks on legacy servers, and a lack of UI for troubleshooting leading to delays in resolution and latency issues. They engaged Tiger Analytics to assess their data landscape.

We recommended an Enterprise Data Warehouse modernization approach to deliver improved scalability & elasticity, strengthened data governance & security, and improved operational efficiency.

We did an in-depth assessment to understand the client’s ecosystem, key challenges faced, data sources, and their current state architecture. Through interactions with IT and business stakeholders, we built a roadmap for a future state data infrastructure that would enable efficiency, scalability, and modernization. We also built a strategic roadmap of 20+ analytics use cases with potential ROI across HR and contact center functions.

The New Era

Today, the Philippines is recognized as the BPO capital of the world. Competition will toughen, both from new players and existing ones. A digital transformation is underway in the BPO industry. Success in this competitive space lies with companies that can turn the huge volumes of data they hold into meaningful and actionable change.

Consulting with Integrity: ‘Responsible AI’ Principles for Consultants
https://www.tigeranalytics.com/perspectives/blog/consulting-with-integrity-responsible-ai-principles-for-consultants/
Wed, 05 Jan 2022

Third-party AI consulting firms engaged in multiple stages of AI development must point out any ethical red flags to their clients at the right time. This article delves into the importance of a structured ethical AI development process.

AI goes rogue and decimates or enslaves humanity — the internet is full of such horrifying fictional movies. The fictional AI risk may be far-fetched, but the current state of Narrow AI could soon have a profound impact on humanity. AI developers and leaders around the world have an ethical obligation toward society. They have a responsibility to create systems suited to the benefit of society and the environment surrounding them.

AI could go wrong in many ways and have unintended consequences in the shorter or longer term. In a certain case, an AI algorithm was found to unintentionally reinforce racial bias when it predicted lower health risk scores for people of color. It turned out that the algorithm was using patients’ historical healthcare spending to model future health risks. As this bias perpetuates through the algorithm in operation, it becomes like a disastrous self-fulfilling prophecy leading to healthcare disparity.

In another incident, Microsoft had to bear the brunt when Tay — its millennial chatbot — engaged in trash talk on social media and had to be taken offline within 16 hours of going live.

Only the juiciest stories make it to the front page of the news, but the ethical conundrum runs deep for any organization building AI-driven applications. Leading organizations have concurred on the core principles for the ethical development of AI — Fairness, Safety, Privacy, Security, Interpretability, and Inclusiveness. Numerous product-led companies champion the need for responsible AI with a human-centric approach. But these products are not built entirely by a single team. Many a time, multiple pre-packaged software products are used to bring the AI use case to fruition. In other cases, specialized AI consulting companies are involved to bring in bespoke solutions, capabilities, datasets, or skill sets to complement the speed and scale of AI development.

As third-party AI consulting firms are involved in the various phases of AI development — data gathering and wrangling, model training and building, and finally, model deployment and adoption — it is crucial for them to understand the reputational implications for their clients of even a mildly rogue AI. Without certain systems in place, AI development teams scramble to solve issues as they come, brewing a regulatory and humanitarian storm. In such a situation, it is imperative for these consulting or vendor organizations to follow a defined process for ethical AI development. Some of the salient points of such a process are:

1. Recognize and flag an AI ethical issue early.

We can solve ethical dilemmas only if we have the mechanisms to recognize one. A key step at the beginning of any AI ethical quandary is locating and isolating ethical aspects of the issue. This involves educating the employees and consultants alike toward AI ethics sensitivity. Experienced data modelers in the team should have the eye to identify any violations of the core ethical principles in any of their custom-made solutions.

2. Documentation helps you trace unethical behavior.

Documenting how the key AI services operate, are trained, their performance metrics, fairness, robustness, and their systemic biases goes a long way in avoiding ethical digression. The devil is in the details, and the details are captured better by documentation.

3. Work in tandem with the client’s team to understand business-specific ethical risks within AI.

Similar industries share a theme across their AI risks. A healthcare or banking company must build extra guard rails around probable violations of privacy and security. E-commerce companies, pioneers in creating state-of-the-art recommendation engines, must keep their ears and eyes open to mitigate any kind of associative bias leading to stereotypical associations within certain populations. Identifying such risks narrows the search for probable violations.

4. Use an ethical framework like the Consequentialist Framework for an objective assessment of ethical decision-making.

A consequentialist framework evaluates an AI project by looking at its outcomes. Such frameworks help teams deliberate over probable ethical implications. For example, a self-driving AI that has even a remote possibility of being unable to recognize pedestrians wearing face masks could be fatal and shouldn’t ever make it to the market.

5. Understand the trade-off between accuracy, privacy, and bias at different stages of model evaluation.

Data scientists must be cognizant of the fact that their ML models should be optimized not only for best performance and high accuracy but also for lower (unwanted) bias. Like any other non-binary decision, leaders should be aware of this trade-off too. Fairness metrics and bias mitigation toolkits like IBM’s AI Fairness 360 can be used to mitigate unwanted bias in datasets and models.
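
As a hedged illustration, the sketch below uses the open-source aif360 package to measure disparate impact on a toy dataset and apply reweighing, one of its pre-processing mitigations; exact API details should be verified against the AIF360 documentation for the installed version.

```python
# Toy fairness check with AIF360: 'group' is the protected attribute,
# 'hired' the favorable label. Data is illustrative only.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

df = pd.DataFrame({
    "group": [0, 0, 0, 0, 1, 1, 1, 1],
    "score": [0.2, 0.5, 0.6, 0.3, 0.7, 0.8, 0.4, 0.9],
    "hired": [0, 0, 1, 0, 1, 1, 0, 1],
})
data = BinaryLabelDataset(df=df, label_names=["hired"],
                          protected_attribute_names=["group"],
                          favorable_label=1, unfavorable_label=0)
priv, unpriv = [{"group": 1}], [{"group": 0}]

metric = BinaryLabelDatasetMetric(data, privileged_groups=priv,
                                  unprivileged_groups=unpriv)
print("Disparate impact:", metric.disparate_impact())  # 1.0 means parity

# Reweighing assigns instance weights that remove the measured disparity
# before any downstream model is trained.
rebalanced = Reweighing(unprivileged_groups=unpriv,
                        privileged_groups=priv).fit_transform(data)
```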

6. Incentivize open-source and white-box approaches.

An open-source and explainable AI approach is crucial in establishing trust between vendors and clients. It ensures that the system is working as expected and any anomalies can be backtracked to the precise code or data item that might have originated it. Ease of regulatory compliance with open-source approaches makes it a favorite in the financial services and healthcare sector.

7. Run organizational awareness initiatives.

An excellent data scientist may not be aware of the ethical implication of autonomous systems. Organizational awareness, adequate training, and a robust mechanism to bring forth any AI risks should be inculcated into culture and values. Employees should be incentivized to escalate the tiniest of such situations. An AI ethics committee should be formed to provide broader guidance to on-ground teams regarding grey areas.

Final Thoughts

A successful foundation for each of these steps is smooth coordination and a handshake between vendor and client teams with a shared, responsible vision. Vendors should not hesitate to bring forth any AI ethical risks that they might be running for their clients. Clients, meanwhile, should involve their strategic vendors in such discussions and training. The whistleblowers for AI ethical risks might be analysts and data scientists, yet it won’t be possible for them to flag those issues unless there is a top-down culture that encourages them to look for them.

Ringing in the Future: How Advanced Analytics is Transforming the Telecom Industry
https://www.tigeranalytics.com/perspectives/blog/advanced-analytics-ai-telecom/
Thu, 23 Dec 2021

Explore how Analytics is helping the Telecom industry uncover growth opportunities for customer acquisition, while simultaneously growing the value of existing customers.

There is rich and abundant data available in the telecom sector, and this data has been especially relevant in the last two years. Bandwidth consumption reached an all-time high amid the global health crisis, as all businesses and educational institutions moved towards a digital workspace model.

However, despite this shift to digital-first, some key challenges have led to a dip in growth in the sector. These challenges include:

  • Intense pricing competition across the sector from both legacy players and newcomers that are offering unique business models.
  • Increasing adoption of services from OTT providers (Ex: WhatsApp for voice calls, Messenger for messaging, etc.).
  • Rising capital expenditures to set up new infrastructure to provide improved connectivity and 5G services.

In this article, we will discuss the top growth opportunities for the telecom sector in acquiring new customers, while simultaneously growing the value of existing customers:

  • Customer 360-view to enable targeted growth from the existing customer base
  • Customer retention
  • Customer service experience
  • Capitalizing on the growth of the B2B segment

Customer 360-view: Why it matters

Customer 360-view, as the name suggests, is about the all-round picture. It provides a comprehensive understanding of customers by aggregating data from all touchpoints. This data traces the customer’s journey across various departments, all on one central platform.

We can further augment internal data sources with structured and unstructured external data sources, such as business profiles, demographic reports, social media data, etc. This rich external data is usually stored in silos, or, unfortunately, never used.

Companies tend to shy away from adopting the Customer 360-view because of the challenges it presents. A common one is the difference in entity names used across various internal systems and third-party data sources. Here, implementing AI-based string matching algorithms has been helpful in merging multiple disparate sources.
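
For illustration, here is a minimal string-matching sketch using Python’s standard library; production entity resolution would add normalization rules, blocking, and more robust token-based scorers. The entity names below are made up.

```python
# Minimal fuzzy entity-matching sketch: link each third-party record to its
# closest CRM name using stdlib similarity matching.
import difflib

crm_names = ["Acme Corp Pte Ltd", "Globex Holdings", "Initech LLC"]
thirdparty_names = ["ACME Corporation", "Globex Holding Co", "Umbrella Inc"]

def best_match(name: str, candidates: list[str], cutoff: float = 0.6):
    hits = difflib.get_close_matches(name.lower(),
                                     [c.lower() for c in candidates],
                                     n=1, cutoff=cutoff)
    return hits[0] if hits else None  # None = send to manual review

for name in thirdparty_names:
    print(f"{name!r} -> {best_match(name, crm_names)!r}")
```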

Similar to the example above, solutions can be found for companies struggling to implement the Customer 360-view because its advantages definitely trump the challenges. Let’s look at some of the advantages:

  • Unified view of customers across all departments — from business development to customer support
  • Scalable data that can be processed faster and at a minimal cost
  • Enabling AI and analytics use cases (not exhaustive), such as:
  [Figure: AI and analytics use cases in telecom]

  • Accurate external data augmentation has led to better features, and thus improved accuracies in predictive models and an improved understanding of customer behavior.

Customer retention through churn prediction

The cost of customer retention is much lower than the cost of new customer acquisition

The offering of voice, messaging, mobile music, and video services by OTT providers such as WhatsApp, Messenger, Netflix, and Spotify has made data the primary offering for telecom companies.

Customers are spoilt for choice due to the ongoing price wars and data-heavy plans with competitive pricing. While the basic product is the same or with very few differences, the competition is high and the options plenty. This has led to an increase in customer churn.

Hence, it is crucial for telecom companies to understand the reasons for this customer churn, and predict paths that lead to an increase in customer churn.

One way to go about this is via machine learning models that are able to predict customer churn. This can be done using customers’ past transactions, network quality, product pricing, product usage, customer feedback history, complaints log, demographics, and social media data (if any).
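
A simplified sketch of such a churn model is shown below. The feature table and column names are hypothetical stand-ins for the transaction, network-quality, and complaint histories mentioned above.

```python
# Hedged churn-classifier sketch on illustrative subscriber features.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("subscriber_features.csv")  # hypothetical feature table
features = ["tenure_months", "avg_monthly_spend", "dropped_call_rate",
            "complaints_90d", "data_usage_gb"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2,
    stratify=df["churned"], random_state=42)

model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Rank subscribers by churn risk to prioritize retention offers.
df["churn_risk"] = model.predict_proba(df[features])[:, 1]
```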

Targeting the right customers to carry out retention campaigns is key. Those picked will be directly related to the campaign budget, cost of retention of each customer, and the incremental revenue generated through each customer.

This process is especially important because even retaining a small percentage of the customers who are about to churn can lead to increased revenue impact in the long run.

Customer service transformation

If the products being offered are similar and the competition is high, how does one differentiate between telecom operators? The answer is customer service. 

In this digital-first world, there is an increasing demand for the transformation of customer experience and the adoption of new technology, such as AI-enabled chatbots and dialogue systems.

One common challenge is providing the customer with all the right information regarding the product they are about to purchase. Often, customer service officers handle a range of products, and may not be equipped to handle all the customers’ questions. This increases the time customers spend on hold or in queues, which leads to dissatisfaction.

Here is where AI-enabled intelligent customer service systems can reduce waiting time and help in providing the most relevant solutions or recommendations to customers. This can be done in one or more ways:

  • Forecasting inbound call volumes for optimizing short and long-term staffing and training (see the sketch after this list).
  • Employing virtual assistants to provide fast resolutions to all straightforward customer queries and redirect the rest to appropriate customer care agents.
  • Enabling the representative with a customer 360-view helps them understand the customer query and background without getting a lot of inputs from the customer.
  • Enabling the team with a real-time analytics engine for recommending the right offer/product to an existing customer based on their profile, demographics, and interaction with the agent.
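
As a sketch of the call-volume forecasting mentioned in the first bullet, the snippet below fits a Holt-Winters model with weekly seasonality, assuming a daily history is available; the file and column names are illustrative.

```python
# Illustrative inbound-call-volume forecast with Holt-Winters smoothing.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

calls = pd.read_csv("daily_call_volume.csv", index_col="date",
                    parse_dates=True)["calls"].asfreq("D")

model = ExponentialSmoothing(calls, trend="add", seasonal="add",
                             seasonal_periods=7).fit()
print(model.forecast(14))  # two-week staffing horizon
```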

The growth of the telecom B2B segment to be driven by digitalization and 5G

The B2B business model enjoys high margins (compared to B2C), with customers willing to pay more for different services. It is characterized by a highly diverse list of products, customized solutions, pricing, and multiple partners. On the downside, this increases the length of the sales cycle.

One common growth use case (apart from common telecom use cases discussed above), specific to the B2B segment, is reducing the sales cycle time by using AI and analytics in product solution and pricing. This leads to a better customer experience, thus increasing customer acquisition.

The following are the main differences in characteristics of the two segments:

Differences in characteristics of the two segments: B2B and B2C

Historically, most telecom providers have prioritized analytics use cases to capture growth in the B2C segment¹. However, with the advent of digitalization, all businesses are relying on the telecom industry for reliable high-speed 5G data and corporate mobile plans. It is estimated that by 2035, sales amounting to USD 13.2 trillion will be enabled by the 5G ecosystem².

As a result, the next decade will likely see the B2B segment growing much faster than the B2C segment. Concentrating on B2B use cases will help telecom companies grab a bigger share of the growing market.

Benefits of implementing AI and Advanced Analytics (examples)

To really understand how AI and analytics are helping transform this booming sector, let’s look at some real-world examples.

Customer 360: Data governance system for an Asian OTT video service provider

Problem: The client was looking forward to developing a comprehensive understanding of the user’s program viewing behavior for smarter programming and advertising decisions.

Solution: The solution was to build a data lake to process internal and third-party data from structured and unstructured data sources. Some key challenges included creating a data governance process, handling inconsistencies across multiple sources, and building a flexible system that allows new data sources.

Value delivered: The outcome of the exercise was a data lake that could process 100 GB of data volume daily with varying velocities, ensuring data availability for data analytics projects across multiple themes.

The following are select case studies, executed using Customer 360-view datasets:

Customer 360-view datasets Case study

Churn Prediction – User Behavior Prediction Model driving USD 4.5 MM annual revenue

Problem: The client, a telecom giant, wanted to identify customers most likely to churn in their video-on-demand (VOD) business.

Solution: The key challenges were huge data volume, limited metadata on VOD content, constantly changing user base, and limited subscriber demographic information. The solution involved building a random forest-based churn classification model based on the features extracted from past customer RFM behavior, rate of change in purchases month-on-month, demographics, and content metadata.

Value delivered: A total of 73.4 % of potential churners were captured by the top 50% of the population flagged off by the model, leading to revenue retention of up to USD 4.5 MM per annum.

Customer service transformation case studies

Customer service transformation case study

Telecom B2B – Pricing system for a leading Asian telecom company

Problem: The client was looking to shorten their B2B product sales cycle, which took up to four weeks or more to produce an initial quotation.

Solution: The bottleneck in the process was identified as the involvement of third-party costs and the delay in receiving them. The solution involved building ML models to predict third-party expenses, reducing the waiting time to provide customers with an initial quote.

Value delivered: The business impact was reduced turnaround time for an initial quote from four weeks to a maximum of one day.
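
A hedged sketch of the underlying idea: regress historical third-party cost components on deal attributes so that an initial quote can be generated without waiting on vendor responses. The feature names below are illustrative, not the client’s actual schema.

```python
# Sketch of the quotation-acceleration idea: predict third-party costs
# from deal attributes using a gradient-boosted regressor.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

deals = pd.read_csv("historical_quotes.csv")  # hypothetical quote history
features = ["bandwidth_mbps", "contract_months", "site_count",
            "region_code", "service_tier"]

model = GradientBoostingRegressor(random_state=42)
scores = cross_val_score(model, pd.get_dummies(deals[features]),
                         deals["third_party_cost"],
                         scoring="neg_mean_absolute_percentage_error", cv=5)
print("Cross-validated MAPE:", -scores.mean())
```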

The future is brighter, smarter, quicker

The applications of AI and predictive analytics in the telecom sector are endless. With digital transformation being the key goal for any company today, combining AI and analytics can not only help in delivering superior performance but also give a company that touch of uniqueness needed to survive in a cut-throat market.

For more information on services and use cases, please get in touch with us at https://www.tigeranalytics.com/
References:
1. https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-b2b-analytics-playbook-capturing-unrealized-potential-in-telcos
2. https://www.qualcomm.com/media/documents/files/ihs-5g-economic-impact-study-2019.pdf
The article was first published in Analytics India Magazine- https://analyticsindiamag.com/advanced-analytics-and-ai-in-telecom-notes-from-tiger-analytics/

Data Science Strategies for Effective Process System Maintenance
https://www.tigeranalytics.com/perspectives/blog/harness-power-data-science-maintenance-process-systems/
Mon, 20 Dec 2021

Industry understanding of managing planned maintenance is fairly mature. This article focuses on how Data Science can impact unplanned maintenance, which demands a differentiated approach to build insight and understanding around the process and subsystems.

Data Science applications are gaining significant traction in the preventive and predictive maintenance of process systems across industries. A clear mindset shift has made it possible to steer maintenance from a ‘reactive’, run-to-failure approach to one that is proactive and preventive in nature.

Planned or scheduled maintenance uses data and experiential knowledge to determine the periodicity of servicing required to maintain the plant components’ good health. These are typically driven by plant maintenance teams or OEMs through maintenance rosters and AMCs. Unplanned maintenance, on the other hand, occurs at random and impacts downtime/production, safety, inventory, and customer sentiment, besides adding to the cost of maintenance (including labor and material).

Interestingly, statistics reveal that almost 50% of the scheduled maintenance projects are unnecessary and almost a third of them are improperly carried out. Poor maintenance strategies are known to cost organizations as much as 20% of their production capacity – shaving off the benefits that a move from reactive to preventive maintenance approach would provide. Despite years of expertise available in managing maintenance activities, unplanned downtime impacts almost 82% of businesses at least once every three years. Given the significant impact on production capacity, aggregated annual downtime costs for the manufacturing sector are upwards of $50 billion (WSJ) with average hourly costs of unplanned maintenance in the range of $250K.

It is against this backdrop that data-driven solutions need to be developed and deployed. Can Data Science solutions bring about significant improvement in the maintenance domain and prevent any or all of the above costs? Are the solutions scalable? Do they provide an understanding of what went wrong? Can they provide insights into alternative and improved ways to manage planned maintenance activities? Does Data Science help reduce all types of unplanned events or just a select few? These are questions that manufacturers need answered, and it is for the experts from both the maintenance and data science domains to address them.

Industry understanding of managing planned maintenance is fairly mature. The highlight of this article is therefore focused on unplanned maintenance, which demands a differentiated approach to build insight and understanding around the process and subsystems.

Data Science solutions are accelerating the industry’s move towards ‘on-demand’ maintenance wherein interventions are made only if and when required. Rather than follow a fixed maintenance schedule, data science tools can now aid plants to increase run lengths between maintenance cycles in addition to improving plant safety and reliability. Besides the direct benefits that result in reduced unplanned downtime and cost of maintenance, operating equipment at higher levels of efficiency improves the overall economics of operation.

The success of this approach was demonstrated in refinery CDU preheat trains that use soft sensing triggers to decide when to process ‘clean crude’ (to mitigate the fouling impact) or schedule maintenance of fouled exchangers. Other successes were in the deployment of plant-wide maintenance of control valves, multiple-effect evaporators in plugging service, compressors in petrochemical service, and a geo-wide network of HVAC systems.

Instead of using a fixed roster for maintenance of PID control valves, plants can now detect and diagnose malfunctioning control valves. In combination with domain and operations information, this capability can also suggest prescriptive actions, such as auto-tuning the valves, that improve maintenance and operations metrics.

Reducing unplanned and unavoidable events

It is important to bear in mind that not all unplanned events are avoidable. Events may be unavoidable either because they are not detectable enough or because they are not actionable. The latter occurs either because the available response time is too short or because the knowledge to return a system to its normal state does not exist. A large number of unplanned events, however, are avoidable, and data science tools improve their detection and prevention with greater accuracy.

The focus of experts working in this domain is to reduce unplanned events and to transition events from unavoidable to avoidable. Using advanced tools for detection and diagnosis, and enabling timely action, companies have managed to reduce their downtime costs significantly. The diversity of solutions available in the maintenance area covers both plant and process subsystems.

Some of the data science techniques deployed in the maintenance domain are briefly described below:

Condition Monitoring
This has been used to monitor and analyze process systems over time and predict the occurrence of an anomaly. These anomalies could have long or short propagation times, such as fouling in exchangers or cavitation in pumps, respectively. The spectrum of solutions in this area includes real-time/offline modes of analysis, edge/IoT devices, open/closed-loop prescriptions, and more. In some cases, monitoring also involves the use of soft sensors to detect fouling, surface roughness, or hardness – parameters that cannot be measured directly using a sensor and therefore need surrogate measuring techniques.
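
To make this concrete, below is a minimal sketch of single-sensor condition monitoring against an exponentially weighted baseline, assuming a pandas series of (say) hourly exchanger outlet temperatures; the span and threshold are illustrative and would be tuned per sensor:

```python
import pandas as pd

def ewma_drift_alerts(series: pd.Series, span: int = 48, z_thresh: float = 3.0) -> pd.Series:
    """Flag slow drifts (e.g., exchanger fouling) by comparing each reading
    against an exponentially weighted baseline of its own recent history."""
    baseline = series.ewm(span=span).mean()
    spread = series.ewm(span=span).std().fillna(series.std())
    z = (series - baseline) / spread
    return z.abs() > z_thresh  # True wherever the sensor departs from its baseline
```

The same pattern runs offline against historian extracts or in near-real-time per tag, and soft-sensor outputs can be monitored in exactly the same way.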

Perhaps one of the most distinctive challenges of working in the manufacturing domain is data reconciliation. Sensor data tend to be spurious and prone to operational fluctuations, drift, biases, and other errors. Raw sensor information is unlikely to satisfy the material and energy balances of process units. Data reconciliation uses a first-principles understanding of the process system to assign a 'true value' to each sensor. These revised sensor values allow a more rigorous approach to condition monitoring, which would otherwise expose process systems to greater risk from raw sensor errors. Sensor validation, a technique to analyze individual sensors in tandem with data reconciliation, is critical to setting a strong foundation for any analytics model to be deployed. These elaborate areas of work ensure a greater degree of success when deploying any solution that involves sensor data.
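
As a hedged illustration of the idea, the classic linear formulation adjusts measurements just enough to close the balances, trusting low-variance sensors more; the mixer flows and variances below are illustrative:

```python
import numpy as np

def reconcile(y: np.ndarray, A: np.ndarray, variances: np.ndarray) -> np.ndarray:
    """Weighted least-squares reconciliation: adjust raw measurements y so the
    linear balance constraints A @ x = 0 hold exactly."""
    Sigma = np.diag(variances)
    correction = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, A @ y)
    return y - correction

# Example: a mixing node where stream 1 + stream 2 should equal stream 3.
A = np.array([[1.0, 1.0, -1.0]])      # mass balance: f1 + f2 - f3 = 0
y = np.array([100.3, 49.2, 151.8])    # raw readings (t/h)
var = np.array([1.0, 0.5, 2.0])       # sensor variances (illustrative)
print(reconcile(y, A, var))           # reconciled flows now close the balance
```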

Fault Detection
This is a mature area of work, with solutions ranging from those driven entirely by domain knowledge, such as pump curves and the detection of anomalies against them, to those that rely only on historical sensor/maintenance/operations data. An anomaly or fault is defined as a deviation from 'acceptable' operation, but the context and definitions need to be clearly understood when working with different clients. Faults may relate to equipment, quality, plant systems, or operability. Good business context and an understanding of client requirements are necessary to design and deploy the right techniques. From basic tools that use sensor thresholds and run charts to more advanced techniques such as classification, pattern analysis, and regression, a wide range of solutions can be successfully deployed.
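
As a minimal sketch of the domain-knowledge end of that spectrum, the check below compares operating points against a quadratic head-flow pump curve of the kind found on an OEM datasheet; the coefficients and tolerance are illustrative:

```python
import numpy as np

A_COEF, B_COEF = 80.0, 0.002   # expected curve H(Q) = a - b*Q^2 (head in m, flow in m3/h)

def pump_fault(flow: np.ndarray, head: np.ndarray, tol: float = 0.10) -> np.ndarray:
    """Flag points whose measured head deviates from the expected curve by more
    than `tol` (fractional) - a typical signature of wear or cavitation."""
    expected = A_COEF - B_COEF * flow ** 2
    return np.abs(head - expected) / expected > tol

flows = np.array([50.0, 120.0, 150.0])
heads = np.array([75.2, 43.0, 34.0])
print(pump_fault(flows, heads))        # [False  True False]
```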

Early Warning Systems
The detection of process anomalies in advance helps in the proactive management of abnormal events. Improving actionability or response time allows faults to be addressed before setpoints/interlocks are triggered. The methodology varies across projects and there is no 'one-size-fits-all' approach. Problem complexity ranges from using single-sensor lead indicators (such as a sustained pressure loss in a vessel pointing to a faulty gasket that might rupture) to far more complex methods of analysis.

A typical challenge in developing early warning systems is achieving close to 100% detectability of anomalies; an even larger challenge is filtering out false indications. High detection rates and robust false-alarm filtering are the critical factors for successful deployment.
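
A minimal sketch of that trade-off, using the sustained-pressure-loss example above: a persistence filter gives up a little response time in exchange for far fewer false alarms (the threshold and window are illustrative):

```python
import pandas as pd

def early_warning(pressure: pd.Series, drop_thresh: float, persist: int = 6) -> pd.Series:
    """Warn only when pressure stays below `drop_thresh` for `persist`
    consecutive samples, filtering the one-off dips that would otherwise
    flood operators with false alarms."""
    below = (pressure < drop_thresh).astype(int)
    return below.rolling(window=persist).sum() >= persist
```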

Enhanced Insights for Fault Identification
The importance of detection and response time in preventing an event cannot be overstated. But what if an incident is not easy to detect, or the fault propagates too rapidly to allow any time for action? The first level involves machine-driven detection, such as computer vision models, which are rapidly changing the landscape. Using these models, it is now possible to improve prediction accuracies for processes that were previously unmonitored or manually monitored. The second is to integrate the combined expertise of personnel from various job functions, such as technologists, operators, maintenance engineers, and supervisors. At this level of maturity, the solution is able to baseline against the best that current operations aim to achieve. The third, and by far the most complex, is to move more faults into the detectable and actionable realm. One such case was witnessed in a complex process in the metal smelting industry. Advanced data science techniques using a digital twin amplified signal responses and analyzed multiple process parameters to predict the occurrence of an incident ahead of time. By gaining an order-of-magnitude improvement in response time, it was possible to move the process fault from the unavoidable to the avoidable and actionable category.

With the context provided above, it is possible to choose a modeling approach and customize the solutions to suit the problem landscape:

[Image: Data analytics in process system maintenance]

Different approaches to Data Analytics

Domain-driven solution
First-principles and rule-based approaches are examples of domain-driven solutions. Traditional ways of delivering solutions for manufacturing often involve computationally intensive methods (such as process simulation, modeling, and optimization). In one difficult-to-model plant, deployment was done using rule engines that let domain knowledge and experience determine patterns and cause-effect relationships. Alarms were triggered and advisories/recommendations were sent to the concerned stakeholders, specifying what actions to take each time the model identified an impending event.

Domain-driven approaches also come in handy in 'cold start' situations, where solutions need to be deployed with little or no data available. In some deployments in the mechanical domain, the first-principles approach helped identify >85% of process faults even at the start of operations.
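
The sketch below illustrates such a rule engine, with hypothetical tag names, thresholds, and advisories; real deployments encode far more rules and route the advisories to operators or the DCS:

```python
# Each rule pairs a condition over the latest sensor snapshot with an advisory.
RULES = [
    ("filter_clogging",
     lambda s: s["dP_filter"] > 2.5,                       # bar, illustrative limit
     "Filter differential pressure high: schedule cleaning within 24 h."),
    ("valve_stiction",
     lambda s: abs(s["valve_op"] - s["valve_pos"]) > 5.0,  # % mismatch, illustrative
     "Control valve not tracking its output signal: inspect the positioner."),
]

def evaluate(snapshot: dict) -> list:
    """Return advisories for every rule whose condition fires on this snapshot."""
    return [advice for name, cond, advice in RULES if cond(snapshot)]

print(evaluate({"dP_filter": 3.1, "valve_op": 62.0, "valve_pos": 54.5}))
# Both rules fire on this illustrative snapshot.
```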

Pure data-driven solutions
A recent trend in the process industry is the move away from domain-driven solutions, due to challenges in finding the right skills to deploy them, their computational infrastructure requirements, the need for customized maintenance solutions, and the requirement to provide real-time recommendations. Complex systems such as naphtha cracking and alumina smelting, which are hard to model from first principles, have harnessed the power of data science not just to diagnose process faults but also to improve response time and bring more finesse to the solutions.

In some cases, data-driven tools have provided high levels of accuracy in diagnosing faults. One such case involved compressor faults, where historical fault data was used to classify them as arising from a loose bearing, a defective blade, or a polymer deposit in the turbine subsystems. Each of these faults was identified using the sensor signatures and patterns associated with it. Besides getting to the root cause, this also helped prescribe actions to move the compressor system away from anomalous operation.
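
The sketch below shows this kind of multiclass classifier, assuming a hypothetical labeled table (`compressor_events.csv`) of signature features with made-up column names:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("compressor_events.csv")        # hypothetical historical log
X = df[["vib_rms", "vib_kurtosis", "bearing_temp", "axial_disp"]]
y = df["fault"]                                  # loose_bearing / blade / deposit / normal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))                     # held-out accuracy
```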

These solutions rest on the assumption that the operating envelope and available data cover all possible scenarios. Where deployments using this approach fare poorly, it is largely because the data do not sufficiently cover plant operations and maintenance. Nevertheless, the number of players offering purely data-driven solutions is large and growing, and such solutions are steadily replacing what was traditionally part of a domain engineer's playbook.

Blended solutions
Blended solutions for the maintenance of process systems combine the understanding of both data science and the domain. One such project involved real-time monitoring and preventive maintenance of >1200 HVAC units across a large geographic area. Domain rules were used to detect and diagnose faults and to identify operating scenarios, improving the reliability of the solution. A good understanding of the domain helps in isolating multiple anomalies, reducing false positives, suggesting the right prescriptions, and, more importantly, in the interpretability of the data-driven solutions.

The differentiation comes from combining intelligence: AI/ML models, domain knowledge, and lessons from successful deployments are all integrated into the model framework.

Customizing the toolkit and determining the appropriate modeling approach are critical to delivery. The uniqueness of each plant and problem, and the requirement for a high degree of customization, make deploying solutions in a manufacturing environment fairly challenging – a fact validated by the limited number of solution providers serving this space. The complexity and nature of the landscape need to be well understood by both the client and the service provider. It is important to note that not all problems in the maintenance space are 'big data' problems requiring real-time analysis of high-frequency data. Some faults with long propagation times can be tracked with values averaged over a period, while systems with short response-time requirements may need real-time data. Where maintenance logs and annotations for each event (and corrective action) are recorded, a supervised learning approach works well, but such labels are not always available. Where data on faults and anomalies are missing, a one-class approach that classifies operation into normal/abnormal modes has been used instead. Solution maturity improves as more data and failure modes are identified over time.
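
A minimal one-class sketch using scikit-learn's IsolationForest, assuming hypothetical archives of feature windows; the model is trained only on periods verified as normal:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

X_normal = np.load("normal_ops.npy")    # hypothetical: features from normal operation
model = IsolationForest(random_state=0).fit(X_normal)

X_live = np.load("live_window.npy")     # hypothetical: the latest feature window
labels = model.predict(X_live)          # +1 = normal, -1 = abnormal
print((labels == -1).mean(), "fraction of live samples flagged abnormal")
```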

A staged solution approach brings in the right level of complexity and delivers solutions that evolve over time. Needless to say, it takes a lot of experience and prowess to marry generalized understanding with the customization that each solution demands.

Edge/IoT

A fair amount of investment needs to be made at the beginning of a project to understand the hardware and solution architecture required for successful deployment. While data security is a primary consideration, factors such as computational power, cost, time, response time, and open/closed-loop architecture also shape the solution framework. Experience and knowledge help in determining additional sensing requirements and sensor placement, performance enhancement through edge/cloud-based solutions, data privacy, synchronicity with other process systems, and much more.

By far the largest challenge is on the data front (sparse, scattered, unclean, disorganized, unstructured, not digitized, and so on), which prevents businesses from seeing quick success. Digitization and the creation of data repositories, which set the foundation for model development, take a lot of time.

There is also a multitude of control systems, specialized infrastructure, and legacy systems within the same manufacturing complex that one may need to work through. End-to-end delivery, with its front-end complexity in data management, creates a significant entry barrier for service providers in the maintenance space.

Maintenance cuts across multiple layers of a process system, and solutions vary as one moves from a sensor to a control loop, to equipment with multiple control valves, all the way up to the flowsheet/enterprise layer. Maintenance across these layers requires a deep understanding of both hardware and process aspects, a combination that is often hard to put together. Sensors and control valves are typically maintained by people with an instrumentation background, while equipment maintenance may fall in a mechanical or chemical engineer's domain. Process anomalies with plant-level impact, on the other hand, are often in the domain of operations/technology experts or process engineers.

Data Science facilitates the development of insights and the generalization needed to build understanding around a complex topic like maintenance. It helps translate learnings across the layers of a process system, from sensors all the way to the enterprise, and across industry domains as well. It is a matter of time before analytics-driven solutions that help maintain safe and reliable operations become an integral part of plant operations and maintenance systems. We should aim for the successes witnessed in the medical diagnostics domain, where intelligent machines detect and diagnose anomalies; similar analytics solutions can go a long way in keeping plants safe, reducing downtime, and delivering the operational efficiencies that a sustainable world demands.

Today, the barriers to success lie in the ability to develop a clear understanding of the problem landscape, to plan end-to-end, and to deliver customized solutions that take into account business priorities and ROI. Achieving success at scale will demand reducing the level of customization required in each deployment – a constraint that only a few subject matter experts in the area overcome today.

The post Data Science Strategies for Effective Process System Maintenance appeared first on Tiger Analytics.

]]>
https://www.tigeranalytics.com/perspectives/blog/harness-power-data-science-maintenance-process-systems/feed/ 1
Defining Financial Ethics: Transparency and Fairness in Financial Institutions’ use of AI and ML https://www.tigeranalytics.com/perspectives/blog/transparency-financial-institutions-use-artificial-intelligence-machine-learning/ Fri, 10 Dec 2021 19:35:26 +0000 https://www.tigeranalytics.com/?p=6785 While time, cost, and efficiency have seen drastic improvement thanks to AI/ML, concerns over transparency, accountability, and inclusivity prevail. This article provides important insight into how financial institutions can maintain a sense of clarity and inclusiveness.

The post Defining Financial Ethics: Transparency and Fairness in Financial Institutions’ use of AI and ML appeared first on Tiger Analytics.

]]>
The last few years have seen a rapid acceleration in the use of disruptive technologies such as Machine Learning and Artificial Intelligence in financial institutions (FIs). Improved software and hardware, coupled with a digital-first outlook, have led to a steep rise in the use of such applications to advance outcomes for consumers and businesses alike.

By embracing AI/ML, early adopters in the industry have been able to streamline decision processes involving large amounts of data, avoid bias, and reduce the chances of error and fraud. Even the more traditional banks are investing in AI systems and using state-of-the-art ML and deep learning algorithms, paving the way for quicker and better reactions to changing consumer needs and market dynamics.

The Covid-19 pandemic has only aided in making the use of AI/ML-based tools more widespread and easily scalable across sectors. At Tiger Analytics, we have been at the heart of the action and have assisted several clients to reap the benefits of AI/ML across the value chain.
Pilot use cases where FIs have seen success with AI/ML-based solutions include:

  • Smarter risk management
  • Real-time investment advice
  • Enhanced access to credit
  • Automated underwriting
  • Intelligent customer service and chatbots

The challenges

While time, cost, and efficiency have seen drastic improvement thanks to AI/ML, concerns over transparency, accountability, and inclusivity prevail. Given how highly regulated and impactful the industry is, it becomes pertinent to maintain a sense of clarity and inclusiveness.
Problems in governance of AI/ML:

  • Transparency
  • Fairness
  • Bias
  • Reliability/soundness
  • Accountability

How can we achieve this? By, first and foremost, finding and evaluating safe and responsible ways to integrate AI/ML into everyday processes to better suit the needs of clients and customers.

By making certain guidelines uniform and standardized, we can set the tone for successful AI/ML implementation. This involves robust internal governance processes and frameworks, as well as timely interventions and checks, as outlined in Tiger’s response document and comments to the regulatory agencies in the US.

These checks become even more relevant where regulatory standards or guidance on the use of AI in FIs are inadequate. Efforts are, however, being made to hold FIs to a common standard.

The table below illustrates the issuance of AI guidelines across different countries:

[Image: Issuance of AI guidelines across different countries]

Source: FSI Insights on Policy Implementation No. 35, By Jeremy Prenio & Jeffrey Yong, August 2021

Supervisory guidelines and regulations must be understood and customized to suit the needs of the various sectors.

Creating uniform guidance through the regulatory agencies is essential to overcoming these challenges: it opens up a dialogue on the usage of AI/ML-based solutions and brings diverse voices from the industry together to share their triumphs and concerns.

Putting it out there

As a global analytics firm that specializes in creating bespoke AI and ML-based solutions for a host of clients, at Tiger, we recognize the relevance of a framework of guidelines that enable feelings of trust and responsibility.

It was this intention of bringing in more transparency that led us to put forward our response to the Request for Information and Comment on Financial Institutions’ Use of Artificial Intelligence, including Machine Learning (RFI) by the following agencies:

  • Board of Governors of the Federal Reserve System (FRB)
  • Bureau of Consumer Financial Protection (CFPB)
  • Federal Deposit Insurance Corporation (FDIC)
  • National Credit Union Administration (NCUA) and,
  • Office of the Comptroller of the Currency (OCC)

Our response to the RFI is structured to be accessible even to those without academic or technical knowledge of AI/ML. We have kept the conversation general, steering away from deep technical jargon in our views.

Ultimately, we recognize that the role of regulations around models involving AI and ML is to create fairness and transparency for everyone involved.

Transparency and accountability are foundation stones at Tiger too, applied while developing powerful AI and ML-based solutions for our clients – be they large or community banks, credit unions, fintechs, or other financial services firms.

We are eager to see the outcome of this exercise and hope that it will result in consensus and uniformity of definitions, help in distinguishing facts from myth, and allow for a gradation of actual and perceived risks arising from the use of AI and ML models.

We hope that our response not only highlights our commitment to creating global standards in AI/ML regulation, but also echoes Tiger’s own work culture and belief system of fairness, inclusivity, and equality.

Want to learn more about our response? Refer to our recent interagency submission.

You can download Tiger’s full response here.

The post Defining Financial Ethics: Transparency and Fairness in Financial Institutions’ use of AI and ML appeared first on Tiger Analytics.

]]>
Revolutionizing SMB Insurance with AI-led Underwriting Data Prefill Solutions https://www.tigeranalytics.com/perspectives/blog/data-prefill-enables-insurers-accelerate-commercial-underwriting/ Wed, 29 Sep 2021 17:10:55 +0000 https://www.tigeranalytics.com/?p=5766 US SMBs often struggle with complex and time-consuming insurance processes, leading to underinsurance. Tiger Analytics’ AWS-powered prefill solution offers a customizable, accurate, and cost-saving approach. With 95% data accuracy, a 90% fill rate, and potential $10M annual savings, insurers can streamline underwriting, boost risk assessment, and gain a competitive edge.

The post Revolutionizing SMB Insurance with AI-led Underwriting Data Prefill Solutions appeared first on Tiger Analytics.

]]>
Small- and medium-sized businesses often embark on unrewarding insurance journeys. There are about 28 million such businesses in the US, each requiring at least 4-5 types of insurance. Over 70% of them are either underinsured or have no insurance at all. One reason is that their road to insurance coverage can be long, complex, and unpredictable. While filling out commercial insurance applications, SMB owners face several complicated questions for which crucial information is either not readily available or poorly understood. Underwriters, however, need this information promptly to estimate the risks of extending coverage. This makes the overall commercial underwriting process extremely iterative, time-consuming, and labor-intensive.

For instance, business owners need to answer over 40 different questions when applying for workers' compensation insurance. Submission can then be followed by weeks of constant emailing between insurance companies and businesses. Such bottlenecks lead to poor customer experiences while significantly impacting the quote-to-bind ratio for insurers. Furthermore, over 20% of the information captured from businesses and agents is inaccurate – resulting in premium leakage and a poor claims experience.

The emergence of data prefill – and the challenges ahead

Today, more insurers are eager to pre-populate their commercial underwriting applications by using public and proprietary data sources. The data captured from external sources help them precisely assess risks across insurance coverages, including Workers Compensation, General Liability, Business Property, and Commercial Auto. For example, insurers can explore company websites and external data sources like Google Maps, OpenCorporates, Yelp, Zomato, Trip Advisor, Instagram, Foursquare, Kompass, etc. These sources provide accurate details, such as year of establishment, industry class, hours of operation, workforce, physical equipment, construction quality, safety standards, and more.

However, despite the availability of several products that claim to prefill underwriting data successfully, insurance providers continue to grapple with challenges such as evolving business needs and risks, constant changes in public data formats, ground-truth validation, and legal intricacies. Sources keep evolving in both structure and data availability, and some come with specific legal constraints – scraping, for instance, is prohibited by many external websites. Moreover, a data prefill platform needs to fetch data from multiple sources, which requires proper source prioritization and validation.

Insurers have thus started to consider building custom white-box solutions that are configurable, scalable, efficient, and compliant.

Creating accurate, effortless, and fast commercial underwriting journeys

Modern data prefill platforms can empower business insurance providers to prefill underwriting information effortlessly and accurately. These custom-made platforms are powered by state-of-the-art data matching and extraction frameworks, a suite of advanced data science techniques, triangulation algorithms, and scalable architecture blueprints. They allow underwriters to extract data directly from external sources with a high fill rate and at great speed. Where data is not directly available, ML classifiers help predict the answers to underwriting questions with high accuracy.
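
A minimal sketch of the triangulation idea, assuming each source adapter returns candidate field values; the priority ordering, matching cutoff, and records are illustrative:

```python
from difflib import SequenceMatcher

PRIORITY = {"opencorporates": 0, "google_maps": 1, "yelp": 2}  # illustrative ranking

def name_match(a: str, b: str, cutoff: float = 0.85) -> bool:
    """Fuzzy-match business names so records from different sources can be
    linked to the same applicant."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= cutoff

def triangulate(candidates: list) -> dict:
    """Pick one value per field, preferring higher-priority sources."""
    best = {}
    for c in sorted(candidates, key=lambda c: PRIORITY.get(c["source"], 99)):
        for field, value in c["fields"].items():
            best.setdefault(field, value)   # first (highest-priority) source wins
    return best

records = [
    {"source": "yelp", "fields": {"hours": "9-5", "year_established": 2004}},
    {"source": "opencorporates", "fields": {"year_established": 2003}},
]
print(triangulate(records))  # {'year_established': 2003, 'hours': '9-5'}
```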

Tiger Analytics has custom-built such AI-led underwriting data prefill solutions to support commercial underwriting decisions for leading US-based workers' compensation insurance providers. Our data prefill solution uses AWS services such as AWS Lambda, S3, EC2, Elasticsearch, SageMaker, Glue, CloudWatch, RDS, and API Gateway, ensuring increased speed-to-market and scalability – with improvements gained through the incremental addition of each source. It is a highly customizable white-box solution built on Tiger's philosophy of Open IP. Using AWS services allows the solution to be quickly and cost-effectively tweaked to cater to changes in external source formats. Delivered as an AWS cloud-hosted solution, it uses a Lambda-based architecture to enable scale and a state-of-the-art application orchestration engine to prefill data for commercial underwriting purposes.
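
Purely as an illustration of the serverless wiring (not the actual solution), a single prefill step exposed through API Gateway might look like the sketch below; the bucket name and the `fetch_from_sources` adapter are hypothetical:

```python
import json
import boto3

s3 = boto3.client("s3")

def fetch_from_sources(business_name: str, address: str) -> dict:
    """Hypothetical adapter: queries prioritized external sources and
    triangulates one prefill record (see the sketch above)."""
    return {"business_name": business_name, "address": address}  # stub

def handler(event, context):
    """Lambda entry point behind API Gateway: prefill an application stub
    and archive the raw result to S3 for auditability."""
    body = json.loads(event["body"])
    record = fetch_from_sources(body["business_name"], body["address"])
    s3.put_object(Bucket="prefill-archive",                  # hypothetical bucket
                  Key=f"raw/{body['application_id']}.json",
                  Body=json.dumps(record).encode("utf-8"))
    return {"statusCode": 200, "body": json.dumps(record)}
```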

Key benefits

  • Unparalleled accuracy of 95% on all the data provided by the platform
  • Over 90% fill rate
  • Significant cost savings of up to $10 million annually
  • Accelerated value creation by enabling insurers to start realizing value within 3-6 months

Insurers must focus on leveraging external data sources and state-of-the-art AI frameworks, data science models, and data engineering components to prefill applications. With the right data prefill platform, insurers can improve their overall quote-to-bind ratio, assess risks accurately, and stay ahead of the competition.

The post Revolutionizing SMB Insurance with AI-led Underwriting Data Prefill Solutions appeared first on Tiger Analytics.

]]>
Waste No More: Making a Difference with Tiger Analytics’ Data-Driven Solution for a Greener Tomorrow https://www.tigeranalytics.com/perspectives/blog/advanced-analytics-commercial-waste-management-system/ Mon, 20 Sep 2021 23:55:07 +0000 https://www.tigeranalytics.com/?p=5731 Improper commercial waste management devastates the environment, necessitating adherence to waste management protocols. Tiger Analytics’ solution for a waste management firm enhanced accuracy, efficiency, and compliance, promoting sustainable practices.

The post Waste No More: Making a Difference with Tiger Analytics’ Data-Driven Solution for a Greener Tomorrow appeared first on Tiger Analytics.

]]>
Improper commercial waste management has a devastating impact on the environment. The realization may not be sudden, but it is certainly gathering momentum – considering that more companies are now looking to minimize their impact on the environment. Of course, it’s easier said than done. Since the dawn of the 21st century, the sheer volume and pace of commercial growth have been unprecedented. But the fact remains that smart waste management is both a business and a social responsibility.

Importance of commercial waste management

The commercial waste management lifecycle comprises collection, transportation, and disposal. Ensuring that all the waste materials are properly handled throughout the process is a matter of compliance. After all, multiple environmental regulations dictate how waste management protocols should be implemented and monitored. Instituting the right waste management guidelines also helps companies fulfill their ethical and legal responsibility of maintaining proper health and safety standards at the workplace.

For instance, all the waste materials generated in commercial buildings are stored in bins placed at strategic locations. If companies do not manage these bins effectively, overflows will follow, causing severe financial, reputational, and legal repercussions.

Impact of data analytics on commercial waste management

Data analytics eliminates compliance issues stemming from overflowing bins by bridging operational gaps, and provides the precise know-how for creating intelligent waste management workflows. With high-quality video cameras integrated into the chassis of waste collection trucks, image-based analytics can be captured and shared through a cloud-hosted platform for real-time visual detection. From these feeds, insights can be extracted to evaluate when trash bins are filling up and to schedule collection so as to minimize transportation, fuel, and labor expenses. They can also determine the right collection frequency, streamline collection routes, and optimize vehicle loads.

By monitoring real-time data from the video cameras, the flow of waste in each bin can be managed promptly to avoid compliance-related repercussions. The trucks also receive real-time data on the location of empty bins, which helps them chart optimal routes and be more fuel-conscious.
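
As a hedged sketch of the scheduling logic, the snippet below visits only bins predicted to exceed a fill threshold, ordered by a greedy nearest-neighbor hop; the coordinates and threshold are illustrative, and a production router would respect road networks and load limits:

```python
import math

bins = [  # (bin_id, lat, lon, predicted_fill_fraction) - illustrative data
    ("B1", 40.71, -74.00, 0.92),
    ("B2", 40.72, -74.01, 0.35),
    ("B3", 40.73, -73.99, 0.81),
]

def plan_route(depot=(40.70, -74.00), threshold=0.75):
    """Visit only bins above the fill threshold, hopping greedily to the
    nearest unvisited bin from the current position."""
    todo = [b for b in bins if b[3] >= threshold]
    route, here = [], depot
    while todo:
        nxt = min(todo, key=lambda b: math.dist(here, (b[1], b[2])))
        route.append(nxt[0])
        here = (nxt[1], nxt[2])
        todo.remove(nxt)
    return route

print(plan_route())  # ['B1', 'B3']
```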

Ultimately, leveraging sophisticated data analytics helps build a leaner and greener waste management system. In addition, it can improve operational efficiency while taking an uncompromising stance on environmental and health sustainability.

Tiger Analytics’ waste management modeling use case for a large manufacturer

Overflowing bins are a severe impediment in the waste management process, as they increase the time required to process the waste. Waste collection trucks have to spend more time than budgeted to handle overflowing bins effectively – without any spillage in and around the premises – and find it difficult to complete their trips on time. When dealing with commercial bins, the situation is even more complicated: the size and contents of commercial bins vary with the unique waste disposal requirements of each business.

Recently, Tiger Analytics worked with a leading waste management company to harness advanced data analytics to improve compliance concerning commercial waste management.

Previously, the client had to record videos of the waste pickup process and send them for manual review. The videos were used to identify commercial establishments that did not follow the prescribed norms on how much waste could be stored in a bin. However, this video review process was inefficient and tedious.

During each pickup, the manual reviewer was expected to watch hours of video clips and images captured by each truck to identify violators. There was thus an uncompromising need for accuracy, since overflowing bins led to compliance violations and potential penalties.

Tiger Analytics developed a solution that leveraged video analytics to determine whether a particular bin in an image was overflowing. Using deep learning algorithms, the solution delivered a high level of accuracy and eliminated the manual video review along with its associated costs.
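
As a hedged illustration of the general technique (transfer learning on labeled frames) rather than the client's actual model, a binary overflow classifier could be fine-tuned as below; the folder layout and class names are hypothetical:

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
# Expects frames/<class>/*.jpg with classes 'ok' and 'overflow' (hypothetical layout).
data = datasets.ImageFolder("frames", transform=tfm)
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)            # binary head: ok vs overflow

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)   # fine-tune the head only
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:                            # one illustrative epoch
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```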

Tiger Analytics’ solution was based on a new data classification algorithm that increased the efficiency of the waste collection trucks. Based on sensor data collected from the chassis, we enabled the client to predict the collection event when the truck was five seconds away from a bin. Furthermore, with advanced monitoring analytics, we reduced the duration of the review process from 10 hours to 1.5 hours, boosting workforce efficiency too.

As a result, the client could effortlessly de-risk their waste management approach and prevent overflow in commercial bins. Some of the business benefits of our solution were:

  • More operational efficiency by streamlining how pickups are scheduled
  • Smarter asset management through increased fuel efficiency and reduced vehicle running costs
  • Improved workforce productivity – with accelerated critical processes like reviewing videos to confirm the pickup
  • Quick risk mitigation of any overflow negligence that leads to compliance violations

Conclusion

New avenues of leveraging advanced analytics continue to pave the way for eco-conscious and sustainable business practices. Especially in a highly regulated sector like commercial waste management, it provides the much-needed accuracy, convenience, and speed to strengthen day-to-day operations and prevent compliance issues.

Day by day, commercial waste management is becoming a more significant catalyst for societal progress. As mentioned earlier, more companies are becoming mindful of their impact on the environment. At the same time, the scale of infrastructure development has taken its toll, exponentially increasing the need to optimize waste disposal and collection methods. Either way, data provides a valuable understanding of how it should be done.

This article was first published in Analytics India Magazine.

The post Waste No More: Making a Difference with Tiger Analytics’ Data-Driven Solution for a Greener Tomorrow appeared first on Tiger Analytics.

]]>