IoT and Analytics in Auto Insurance

The Internet of Things (IoT) is a network of connected physical objects embedded with sensors. IoT allows these devices to communicate, analyze, and share data about the physical world around us via networks and cloud-based software platforms.

IoT is currently one of the most important phenomena reshaping the technological and business spheres. Industries such as agriculture, healthcare, retail, transportation, energy, and manufacturing are leveraging IoT to solve long-standing industry-wide challenges, transforming the way they function. In manufacturing, for example, sensors placed in equipment collect data about its performance, enabling pre-emptive maintenance and providing insights to improve overall efficiency. In retail, “things” such as RFID inventory-tracking chips, in-store infrared foot-traffic counters, digital signage, kiosks, or even a customer’s mobile device enable retailers to provide location-specific customer engagement, in-store energy optimization, real-time inventory replenishment, and more.

The insurance industry, on the other hand, has been rather sluggish by virtue of its size and inherently traditional nature. Insurers cannot, however, afford a wait-and-watch attitude towards IoT. Insurance is, interestingly, one of the industries bound to be most affected by the technological leaps being made. IoT, blockchain, and Big Data are all expected to push insurance to evolve into a different beast altogether, including a shift from restitution to prevention.

Primary IoT Use-cases That Insurers Have Adopted

  1. Connected cars: Many auto insurers collect and analyze data from sensors in cars to track driver behavior in real time, enabling usage-based insurance (UBI).
  2. Connected homes: Sensors that detect smoke and water levels can lower the frequency and severity of damages by automatically alerting homeowners, the fire department, or other maintenance service providers whenever an event requires attention. Certain connected doorbells can deter burglaries, while other devices provide remote home surveillance.
  3. Connected people: Wearable fitness trackers provide data that helps insurers underwrite health policies better and advise preventive care. These trackers also encourage wearers to lead a healthier lifestyle, reducing premiums for the insured and claim frequency and severity for the insurers.

Though some of this content is applicable to other lines of business, in this article I shall focus on leveraging IoT in auto insurance. Please note that the steps and assumptions described are based on a specific case study; the specifics may vary for other insurance providers depending on existing policies, location, technological and data maturity, etc. The intention is to provide a detailed example that can be replicated for other providers, with recommendations made accordingly.

IoT has the potential to impact almost every facet of auto insurance. The prevention and underwriting areas have already received substantial focus. Data from in-car sensors helps analyze driver behavior and thus profile risky driving. This has enabled a much-appreciated shift from traditional demographic-based underwriting to usage-based underwriting. It is important to point out that driving-behavior metrics alone, such as speed and the number of sudden brakes, are not sufficient to assign a risk profile to a driver. These metrics should be analyzed in the context of the location, the usual routes taken, the average driving behavior in the area, etc. to truly judge one's driving. This requires assimilating multiple data sources.
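To make this concrete, below is a minimal sketch of what such contextual analysis might look like: raw trip metrics are scored against the driving norms of the area rather than in absolute terms. The field names, weights, and thresholds are illustrative assumptions, not a production design.

```python
# Illustrative sketch: contextualizing raw telematics metrics against
# local driving norms before assigning a risk score. Field names and
# weights are assumptions, not a production design.
from dataclasses import dataclass

@dataclass
class TripSummary:
    avg_speed_kmh: float          # mean speed over the trip
    hard_brakes_per_100km: float
    night_fraction: float         # share of driving done at night

@dataclass
class AreaNorms:
    avg_speed_kmh: float          # typical speed on the routes driven
    hard_brakes_per_100km: float

def contextual_risk_score(trip: TripSummary, norms: AreaNorms) -> float:
    """Score a trip relative to local norms; higher means riskier."""
    # Compare against the area's baseline rather than absolute values,
    # so dense urban traffic is not penalized for frequent braking.
    speed_ratio = trip.avg_speed_kmh / norms.avg_speed_kmh
    brake_ratio = trip.hard_brakes_per_100km / max(norms.hard_brakes_per_100km, 1e-6)
    # Illustrative weights; a real insurer would calibrate these on claims data.
    return (0.5 * max(speed_ratio - 1.0, 0.0)
            + 0.4 * max(brake_ratio - 1.0, 0.0)
            + 0.1 * trip.night_fraction)

score = contextual_risk_score(
    TripSummary(avg_speed_kmh=72, hard_brakes_per_100km=9, night_fraction=0.3),
    AreaNorms(avg_speed_kmh=60, hard_brakes_per_100km=6),
)
print(f"risk score: {score:.2f}")
```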


Claims Management

The insurance-buyer demographic has shifted to one that prefers everything here and now. Buyers prefer dealing with things remotely from the comfort of their homes or offices and consider heavy paperwork and mandatory human presence primitive. The application of IoT to claims management is at a very nascent stage but could have a tremendous impact on claim-handling turnaround time, accuracy of investigation, and customer satisfaction. The main touchpoints are listed below.


  1. Accident/Event and FNOL (see the sketch after this list)
  2. Workshop Assignment
  3. Investigation and Fraud Detection
  4. Miscellaneous
  • Document maintenance: Since many hard-copy documents are still used in insurance, RFID can be used for document tagging and tracking.
  • Post-repair self-check: Once a repair has been completed, the car's sensors can run a self-check to confirm that the parts they are connected to are in good working condition.
  • Feedback: A mobile app is a much more effective channel for collecting customer feedback than paper forms and telephone calls.
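As an illustration of the first item, here is a hypothetical sketch of IoT-triggered FNOL: an onboard impact sensor reading above an assumed g-force threshold automatically opens a First Notice of Loss record. The threshold and record layout are invented for illustration.

```python
# Hypothetical sketch of IoT-triggered FNOL: when an onboard impact
# sensor reports a reading above a threshold, a First Notice of Loss
# is opened automatically. Threshold and record layout are assumptions.
import datetime
from typing import Optional

IMPACT_G_THRESHOLD = 4.0  # assumed g-force reading suggesting a collision

def handle_sensor_event(vehicle_id: str, impact_g: float,
                        lat: float, lon: float) -> Optional[dict]:
    """Open an FNOL record if the impact looks like a collision."""
    if impact_g < IMPACT_G_THRESHOLD:
        return None  # minor jolt (pothole, curb strike); no claim event
    fnol = {
        "vehicle_id": vehicle_id,
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
        "location": (lat, lon),
        "impact_g": impact_g,
        "status": "OPEN",
    }
    # A real system would also notify emergency services, the
    # policyholder's app, and a claims-adjuster queue at this point.
    return fnol

print(handle_sensor_event("VIN123", impact_g=5.2, lat=12.97, lon=77.59))
```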


The Wealth of Data Generated

  • As discussed above, underwriting of policies will improve drastically, even for first-time buyers.
  • Efficient visualization and automated insight generation that give drivers reliable, concise information about their own driving behavior will help them become safer drivers.
  • IoT-based analytics can be used to predict future events such as:
    1. Major weather patterns: based on these, insurance companies can prepare for catastrophes and improve locality-based underwriting.
    2. Accident-prone combinations of weather, roads, and driving behavior. The insurer can then advise the driver accordingly; for example, it may warn the driver to avoid a particular route in a particular kind of weather because the accident probability of that combination is 30%.
  • With all this additional data, important profiles and segments may emerge that will form the foundation for propensity estimation and effective targeting strategies.

Criticality Of Analytics In Utilizing IoT Data

Analytics is a critical component of extracting maximum benefit from IoT data:

  • Policy-buyer-level information can be used to evaluate the risk associated with the buyer and the legitimacy of claims.
  • Analyzing the insured population as a whole can identify customer segments and determine their needs (see the clustering sketch after this list). Coupled with efficient visualization and automated insight generation, insurers will be able to promptly spot any concern and its cause.
  • Analysts can identify significant trends and patterns from data accumulated over time. These can be incorporated into statistical models that predict future outcomes for insurers.
    1. Based on expected weather patterns and catastrophes, insurance companies can prepare accordingly and improve locality-based underwriting.
  • Policy changes and market tests run by insurance companies will also be reflected in the IoT data. This information can be used to determine the optimal action when an immediate or expected issue needs to be mitigated.
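As a sketch of the segmentation idea above, the snippet below clusters drivers on a few telematics features with k-means; the features, sample values, and choice of three clusters are assumptions for illustration.

```python
# A minimal sketch of segmenting drivers from telematics features with
# k-means (scikit-learn). Features, values, and k=3 are assumptions; a
# real study would select features and cluster count empirically.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows: drivers. Columns: avg speed, hard brakes/100km, night fraction.
X = np.array([
    [58,  3, 0.05],
    [61,  4, 0.10],
    [75, 11, 0.40],
    [78, 12, 0.35],
    [66,  6, 0.15],
    [80, 14, 0.50],
])

# Standardize so no single feature dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print(labels)  # cluster id per driver -> candidate risk segments
```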

IoT Implementation For Insurance Companies

  • In an ideal world, any transformation would be a series of steps with minimal overlap. Reality is different. Insurance companies have assumed that they cannot move on to integrating IoT in claims management until all current data and processes have been fully digitized. One would imagine that, given the time and money spent on digitization, every organization above mid-size would have its data digitized and well synchronized; in reality, most run a mix of traditional and digitized systems. While a complete online data mart would be the ideal foundation for IoT integration, IoT can also be integrated into such hybrid systems and still add substantial value by syncing with whatever data is online and clean.
  • An analytically sound database structure and ease of analysis are critical when setting up an IoT system. The database should be designed so that all required information is stored without error and all current and future analyses can be carried out with relative ease; this calls for supervision by able and experienced analytics practitioners (see the schema sketch after this list).
  • IoT for usage-based insurance is no longer optional for providers. Safer drivers will gravitate to insurers that reward good driving with lower premiums, so providers that do not implement it soon will be left with a policy portfolio skewed towards higher-risk drivers.
  • Managing voluminous multi-source data and organizing the technological resources to process it remains a significant implementation challenge.
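To illustrate the database-design point, here is one possible layout for raw telematics events, sketched with Python's built-in sqlite3; the table, columns, and indexes are assumptions meant to keep per-driver, per-trip, and per-area analyses cheap.

```python
# One possible "analytically sound" layout for raw telematics events,
# sketched with Python's built-in sqlite3. Table and column names are
# assumptions; the point is an append-only event table indexed so that
# per-driver and per-trip analyses stay cheap.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE telemetry_events (
    event_id    INTEGER PRIMARY KEY,
    vehicle_id  TEXT NOT NULL,
    trip_id     TEXT NOT NULL,
    ts_utc      TEXT NOT NULL,          -- ISO-8601 timestamp
    lat         REAL, lon REAL,
    speed_kmh   REAL,
    accel_g     REAL                    -- signed; hard braking is negative
);
-- Indexes chosen for the analyses discussed above.
CREATE INDEX idx_events_vehicle_ts ON telemetry_events (vehicle_id, ts_utc);
CREATE INDEX idx_events_trip       ON telemetry_events (trip_id);
""")
conn.execute(
    "INSERT INTO telemetry_events "
    "(vehicle_id, trip_id, ts_utc, lat, lon, speed_kmh, accel_g) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("VIN123", "trip-001", "2018-06-01T08:15:30Z", 12.97, 77.59, 64.0, -0.4),
)
print(conn.execute("SELECT COUNT(*) FROM telemetry_events").fetchone())
```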



Senjuti Bhattacharyya, Consultant at Affine

Statistical Model Lifecycle Management

Organizations have realized quantum jumps in business outcomes through the institutionalization of data-driven decision making. Predictive Analytics, powered by the robustness of statistical techniques, is one of the key tools leveraged by data scientists to gain insight into probabilistic future trends. Various mathematical models form the DNA of Predictive Analytics.

A typical model development process includes identifying factors/drivers, data hunting, cleaning and transformation, development, validation (business and statistical), and finally productionalization. In the production phase, as actual data flows into the model environment, the true accuracy of the model is measured. Quite often there are gaps (errors) between predicted and actual numbers. Business teams have their own heuristic definitions and benchmarks for this gap, and any deviation triggers a hunt for additional features/variables and data sources, ultimately resulting in a rebuild of the model.
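A minimal sketch of that production-phase gap check might look like the following; the metric (MAPE) and the 10% benchmark are illustrative stand-ins for a business team's own definitions.

```python
# Minimal sketch of the production-phase gap check described above:
# compare predictions with actuals and flag the model for rebuild when
# the error breaches a business-defined benchmark. The metric (MAPE)
# and the 10% threshold are illustrative assumptions.
import numpy as np

def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error between actuals and predictions."""
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)

BENCHMARK_MAPE = 10.0  # assumed business tolerance for the gap

actual = np.array([120.0, 135.0, 150.0, 160.0])
predicted = np.array([118.0, 140.0, 128.0, 170.0])

gap = mape(actual, predicted)
if gap > BENCHMARK_MAPE:
    print(f"MAPE {gap:.1f}% exceeds benchmark -> hunt for new features, rebuild")
else:
    print(f"MAPE {gap:.1f}% within benchmark -> keep model in production")
```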

Sourav Mazumdar, Manager Client Delivery at Affine Analytics

Analytics For Non-Profit Organisations

Analytics has been growing at a rapid pace across the world. Well-established companies have realized the importance of analytics in the crucial, revenue-driving decisions of their business. But why should only well-established corporates leverage this statistical and computational modus operandi when it can also be applied in a much-needed arena?

The idea is to apply analytics to non-profit social organizations and provide a breakthrough. These are organizations that strive for the upliftment of society by taking on social responsibilities, covering a wide variety of causes such as education, health, food, and shelter.

Rajasekaran Badrinarayanan, Business Analyst at Affine Analytics

Changing Business Requirements In Demand Forecasting

Affine recently completed six years, and I have been a part of it for about three of those years. As an analytics firm, the most common business problem we have come across is forecasting consumer demand. This is particularly true for Retail and CPG clients.

Over the last few years we have dealt with simple forecasting problems for which we could use simple time-series techniques like ARIMA and ARIMAX, or even linear regression; these are forecasts at the organization level or for specific business divisions. Over the years, however, we have seen a distinct shift: clients now want forecasts at a more granular level, sometimes down to specific items. Such forecasts are difficult to attain using simple techniques, and this is where more sophisticated machine-learning methods such as Random Forest and XGBoost come into play.
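For the simple end of that spectrum, a sketch of an aggregate-level ARIMA forecast with statsmodels is shown below; the demand series and the (1, 1, 1) order are made up for illustration.

```python
# A small sketch of the "simple" approach mentioned above: fitting an
# ARIMA model to an aggregate monthly demand series with statsmodels.
# The series and the (1, 1, 1) order are illustrative assumptions.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

demand = pd.Series(
    [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118,
     115, 126, 141, 135, 125, 149, 170, 170, 158, 133, 114, 140],
    index=pd.date_range("2016-01-01", periods=24, freq="MS"),
)

model = ARIMA(demand, order=(1, 1, 1)).fit()
print(model.forecast(steps=6))  # next six months of aggregate demand
```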

Priyankar Sengupta, Manager at Affine Analytics

The Evolution of Data analytics – Then, Now and Later

The loose definition of data science is analyzing a business's data to produce actionable insights and recommendations. The simplicity or complexity of the analysis, aka the level of “data science sophistication”, also impacts the quality and accuracy of the results. Sophistication is essentially a function of three main components: technological skills, math/stats skills, and the business acumen needed to define and deliver a relevant solution. These three pillars have been the mainstay of data science ever since businesses started embracing it two decades ago, and they should remain so in the future. What has changed, and will keep changing, is the underlying R&D in technology and statistical techniques. I have not witnessed many other industries where skills become obsolete at such a fast rate; data science is unique in requiring data scientists and consulting firms to constantly update their skills and be futuristic in adopting new ones. This article looks at how the tool/tech aspects of data science have evolved over the past few decades and, more importantly, what the future holds for this fascinating, tech- and innovation-driven field.

Vineet Kumar, Co-founder at Affine

Demand Forecasting – Doing It The Right Way

Importance Of Predicting Demand For Organizations

In a world of extreme competition, with expense reduction being the mantra for most organizations, companies, primarily in the retail and CPG industries, focus on cutting costs and maintaining optimum inventory levels to gain a competitive edge. To accomplish this, forecasting demand is of utmost importance, and a macro-level sales forecast for the entire organization is not enough.

An efficient and accurate demand forecast enables organizations to anticipate demand and consequently allocate the optimal amount of resources to minimize stagnant inventory. This results in negligible wastage of resources and reduces costs such as storage and transportation. Another benefit of accurate demand prediction is the prevention of shrinkage, so that firms do not have to give huge discounts to clear stock.

This excerpt touches upon the steps in demand forecasting, briefly discusses the different forecasting methods, and ends with some of the challenges involved.

Shuddhashil Mullick, Delivery Manager at Affine Analytics

Deep Learning Demystified 2: Dive deep into Convolutional Neural Networks

The above photo was not created by a specialized app or Photoshop. It was generated by a deep learning algorithm that uses convolutional networks to learn artistic features from various paintings and transforms any photo to depict how an artist would have painted it.

Convolutional Neural Networks have become part of every state-of-the-art solution in areas like:

  1. Image recognition
  2. Object recognition
  3. Self-driving cars (identifying pedestrians and objects)
  4. Emotion recognition
  5. Natural Language Processing

A few days back Google surprised me with a video called Smiles 2016, which stitched together my 2016 photos of partying with family, friends, and colleagues: a collection of photos in which everyone was smiling. That is emotion recognition at work. In this blog we will discuss a couple of the deep learning architectures that power such applications.

Before we dive into CNNs, let's understand why a feed-forward neural network is not enough. According to the universality theorem discussed in the previous blog, any network can approximate a function just by adding neurons (functions), but there is no guarantee on how long it will take to reach an optimal solution. Feed-forward networks also flatten images into a single vector, losing all the spatial information that comes with an image. So for problems where spatial features matter, CNNs achieve higher accuracy in much less time than feed-forward networks.

Before we dive into what a Convolutional Neural Network is, let's get comfortable with the nuts and bolts that form it.

Images

Let's first take a look at how a computer sees an image.

[Images: what we see vs. what a computer sees: a photo and its matrix of pixel values]

Wow, it's great to know that a computer sees images and videos as matrices of numbers. A common way of representing an image in computer vision is a matrix of dimensions Width x Height x Channels, where the channels are red, green, and blue (and sometimes an alpha channel).
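A quick way to verify this yourself, assuming Pillow and NumPy are available (the filename is a placeholder):

```python
# A quick check of the claim above: load an image and inspect the
# matrix a computer actually sees (Pillow + NumPy). Note NumPy returns
# Height x Width x Channels; "cat.jpg" is a placeholder filename.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("cat.jpg").convert("RGB"))
print(img.shape)   # e.g. (480, 640, 3): height, width, RGB channels
print(img.dtype)   # uint8: each pixel channel is a number 0-255
print(img[0, 0])   # the top-left pixel as [R, G, B]
```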

Filters

Filters are small matrices of numbers, usually of size 3x3x3 (width, height, channels) or 7x7x3. Filters perform operations like blurring, sharpening, or outlining on a given image. Historically, these filters were carefully hand-picked to extract various features of an image. In a CNN, the filters are created automatically using a combination of techniques like gradient descent and backpropagation. Filters are moved across an image from the top left to the bottom right to capture all the essential features. They are also called kernels in neural networks.

Convolution

In a convolution layer, we convolve the filter with patches across an image. For example, on the left-hand side of the image below is a matrix representation of a dummy image, the middle is the filter or kernel, and the right side shows the output of the convolution layer. Look at the formula in the image to understand how the kernel and a patch of the image are combined to form a new pixel.

[Image: the first pixel of the output being calculated]
Let's see how the next pixel in the output is generated.
[Image: the second pixel of the output being calculated]
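The same calculation can be written out in a few lines of NumPy; this bare-bones version assumes a single channel, no padding, and stride 1, with made-up image and kernel values.

```python
# A bare-bones (no padding, stride 1) 2-D convolution on one channel,
# mirroring the pixel-by-pixel calculation in the figures above. The
# image and kernel values are made up for illustration.
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel is the elementwise product of the kernel
            # with one image patch, summed up.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.array([
    [1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 1, 1, 0, 0],
])
kernel = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 0, 1]])
print(conv2d(image, kernel))  # a 3x3 feature map
```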

Max-Pooling

Max-pooling is used to reduce dimensionality and down-sample an input. The best way to understand max-pooling is through an example: the image below shows what a 2x2 max-pooling layer does.

[Image: 2x2 max-pooling example]

In both the convolution and max-pooling examples, the images show only two output pixels being computed, but in reality the same operation is applied across the entire image.
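For reference, here is what 2x2 max-pooling with stride 2 looks like in NumPy on a made-up single-channel feature map.

```python
# 2x2 max-pooling with stride 2 on a single channel: keep the largest
# value in each non-overlapping 2x2 block, halving each dimension.
import numpy as np

def max_pool_2x2(x: np.ndarray) -> np.ndarray:
    h, w = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2   # trim odd edges
    x = x[:h, :w]
    # Reshape into 2x2 blocks, then take the max within each block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

feature_map = np.array([
    [1, 3, 2, 1],
    [4, 6, 5, 2],
    [3, 1, 1, 0],
    [1, 2, 2, 4],
])
print(max_pool_2x2(feature_map))  # [[6, 5], [3, 4]]
```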

Now, with an understanding of all the important components, let's take a look at what a Convolutional Neural Network looks like.

[Image: example CNN architecture from Stanford's CNN course]

As you can see from the above image, a CNN is a combination of layers stacked together. The above architecture can be depicted simply as (CONV-RELU-CONV-RELU-POOL) x 3 + a fully connected layer.
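As one possible reading of that stack, here is the same shape of architecture written in PyTorch; the channel sizes, input resolution, and ten-class head are assumptions.

```python
# The stack described above, (CONV-RELU-CONV-RELU-POOL) x 3 followed by
# a fully connected layer, written out in PyTorch as one possible
# reading of that architecture; channel sizes are assumptions.
import torch
import torch.nn as nn

def conv_block(c_in: int, c_out: int) -> nn.Sequential:
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(c_out, c_out, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),  # halves height and width
    )

model = nn.Sequential(
    conv_block(3, 32),    # 32x32 RGB input -> 16x16
    conv_block(32, 64),   # -> 8x8
    conv_block(64, 128),  # -> 4x4
    nn.Flatten(),
    nn.Linear(128 * 4 * 4, 10),  # fully connected layer, 10 classes
)

x = torch.randn(1, 3, 32, 32)   # one dummy 32x32 RGB image
print(model(x).shape)           # torch.Size([1, 10])
```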

Challenges

Convolutional Neural Networks need huge amounts of labeled data and lots of computation power to train. Architectures like AlexNet, ZF Net, VGG Net, GoogLeNet, and Microsoft ResNet typically take weeks to train to state-of-the-art performance. Does that mean an organization without huge volumes of data and computation power cannot take advantage of them? The answer is no.

Transfer learning to the rescue

Most of the winners of the ILSVRC (ImageNet Large-Scale Visual Recognition Challenge) competition have open-sourced their architectures and the associated weights. It turns out that most of the weights, particularly those of the filters, can be reused after fine-tuning them to domain-specific problems. So to take advantage of these convolutional neural networks, all we need to do is retrain the last few layers of the network, which in general takes very little data and computation power. For several of our scenarios, we were able to train models with state-of-the-art performance on GPU machines in a few minutes to hours.
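A sketch of that recipe with torchvision, assuming a pretrained ResNet-18 and a hypothetical five-class problem: freeze the pretrained filters and retrain only a new final layer.

```python
# Sketch of the transfer-learning recipe described above using a
# pretrained ResNet-18 from torchvision: freeze the convolutional
# filters and retrain only a new final layer on domain-specific
# classes. The weights choice and 5-class head are assumptions.
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")  # ImageNet-pretrained

# Freeze every pretrained parameter; the learned filters are reused as-is.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with one sized for our problem;
# only this layer's weights will be trained.
model.fc = nn.Linear(model.fc.in_features, 5)

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['fc.weight', 'fc.bias']
```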

Conclusion

Apart from use cases like image recognition, CNNs are widely used in various network topologies for object recognition (locating objects in images), GANs (a recent breakthrough in helping computers create realistic images), converting low-resolution images to high-resolution ones, and revolutionizing the health sector in various forms of cancer detection, among others. In recent months, architectures built for NLP have also achieved state-of-the-art results.

Vishnu Subramanian, Practice Head - BigData & Advanced Analytics at Affine

HowStat – Application of Data Science in Cricket

Data science helps us extract knowledge or insights from data, either structured or unstructured, using scientific methods such as mathematical and statistical models. In the last two decades it has become one of the most popular fields, riding the rise of big data technologies. Companies such as Amazon, Netflix, and Google Play use recommendation engines to promote products and suggestions in line with users' interests. Many other applications, such as image recognition, gaming, and airline route planning, also involve big data and data science.

Sports is another field that uses data science extensively to improve strategies and predict match outcomes. Cricket is a sport where machine learning has scope to dive into quite a large outfield: it can go a long way towards suggesting optimal strategies for a team to win a match, or for a franchise to bid for a valuable player.

Suvajit Sen, Senior Business Analyst at Affine Analytics

Intrusion/Threat Detection Systems

Recently, while reading up on cyber security and threat detection, I came across Verizon's annual Data Breach Report. The report analyzed thousands of incidents reported over the last couple of years by companies and public and private organizations, breaking breaches down by firmographics, geographies, industries, etc., and found that cyber intrusion is a growing threat to every industry in every country of the world. The report proves time and again that no single industry or organization in the world is safe from cyber threats. This piqued my curiosity, and I felt we could use all the goodness of data science to tackle this problem effectively. I designed a threat/intrusion detection system that could detect such data leaks/breaches and take preventive action to contain, if not stop, the damage.
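The author's actual system is not described here, but as a flavor of one common data-science approach to the problem, the sketch below scores connection records with an Isolation Forest from scikit-learn; the features and values are made up.

```python
# Not the author's actual system, just a minimal illustration of one
# common approach to intrusion detection: unsupervised anomaly scoring
# of connection records with an Isolation Forest. Features and values
# are made up for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

# Columns: bytes transferred, connection duration (s), failed logins.
normal_traffic = np.array([
    [500, 2, 0], [650, 3, 0], [480, 2, 1], [700, 4, 0],
    [520, 2, 0], [610, 3, 0], [590, 3, 1], [640, 2, 0],
])
detector = IsolationForest(contamination=0.05, random_state=0).fit(normal_traffic)

new_events = np.array([
    [600, 3, 0],      # looks like routine traffic
    [90000, 1, 25],   # huge transfer plus many failed logins: suspicious
])
print(detector.predict(new_events))  # 1 = normal, -1 = flagged as anomaly
```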

Aayush Agrawal, data science enthusiast and entrepreneur working on Marketing ROI Measurement using IoT & Big Data solutions

Deep Learning Demystified

What is Deep Learning?

Traditional machine learning used hand-crafted features and modality-specific models to classify images and text or to recognize voices. Deep learning / neural networks identify features and find patterns automatically. Thanks to advances in deep learning, the time to build such complex systems has dropped drastically while accuracy has increased exponentially. Neural networks were partly inspired by how the roughly 86 billion neurons in a human brain work, but they have become more of a mathematical and computational construct. By the end of this blog, we will see how neural networks can be intuitively understood and implemented as a set of matrix multiplications, a cost function, and optimization algorithms.
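As a taste of that matrix-multiplication view, here is a forward pass through a tiny two-layer network in plain NumPy; the layer sizes and sigmoid activation are illustrative choices.

```python
# The "set of matrix multiplications" view from the paragraph above: a
# forward pass through a tiny two-layer network in plain NumPy. Sizes
# and the sigmoid activation are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # layer 1: 3 inputs -> 4 hidden
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # layer 2: 4 hidden -> 1 output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x: np.ndarray) -> np.ndarray:
    h = sigmoid(W1 @ x + b1)      # matrix multiply, then nonlinearity
    return sigmoid(W2 @ h + b2)   # the same pattern at every layer

print(forward(np.array([0.5, -1.2, 3.0])))  # network's output for one input
```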

[Image: biological analogy of a neural network]

Vishnu Subramanian, Practice Head - BigData & Advanced Analytics at Affine