Large Scale Model Development and Maintenance

The Need For Processes

Across industries, an increasing number of organizations are moving to data-driven decision making. Some of these decisions are contingent on historical data and trends, whereas others, often the more critical ones, are made on the fly using point-in-time data. Some judgments must contend with the ever-changing nature of data, whereas others are repetitive and rely on data that does not change all that often.

Our focus in this article is on decisions made on a recurring basis, driven by processes that are repetitive in nature: specifically, processes that can be automated and re-run with minimal manual intervention. Data-driven automated decision-making or insight-generating engines are typically referred to as 'models', and they can be statistical or heuristic in nature, a distinction loosely illustrated in the sketch below. We will discuss the differences in later sections.
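As a loose illustration of the distinction (the numbers and the reorder rule are placeholders invented for this sketch, not a production model): a heuristic model encodes a hand-set business rule, while a statistical model learns its parameters from historical data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Heuristic model: a fixed, hand-picked business rule.
def heuristic_reorder(stock_level, threshold=20):
    """Reorder whenever stock falls below a hand-set threshold."""
    return stock_level < threshold

# Statistical model: parameters are learned from historical data.
weeks = np.arange(12).reshape(-1, 1)  # 12 weeks of dummy history
demand = np.array([18, 20, 19, 23, 24, 22, 27, 26, 30, 29, 33, 32])
next_week_demand = LinearRegression().fit(weeks, demand).predict([[12]])
```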

Shuddhashil Mullick
Senior Delivery Manager at Affine Analytics

IoT and Analytics in Auto Insurance

The Internet of Things (IoT) is a network of connected physical objects embedded with sensors. IoT allows these devices to communicate, analyze, and share data about the physical world around us via networks and cloud-based software platforms.

IoT is currently one of the most important phenomena revolutionizing the technological and business spheres. Industries such as Agriculture, Healthcare, Retail, Transportation, Energy, and Manufacturing are leveraging IoT to solve long-standing industry-wide challenges, transforming the way they function. In Manufacturing, for example, sensors placed in equipment collect data about its performance, enabling pre-emptive maintenance and providing insights to improve overall efficiency. In Retail, "things" such as RFID inventory-tracking chips, in-store infrared foot-traffic counters, digital signage, kiosks, and even customers' mobile devices are enabling retailers to provide location-specific customer engagement, in-store energy optimization, real-time inventory replenishment, and more.

The Insurance industry, on the other hand, has been rather sluggish by virtue of its size and inherently traditional nature. It cannot, however, afford to maintain a wait-and-watch attitude towards IoT. Insurance is, interestingly, one of the industries bound to be most impacted by the various technological leaps being made: IoT, blockchain, and Big Data are all expected to push insurance to evolve into a different beast altogether, including a shift from restitution to prevention.

Senjuti Bhattacharyya
Consultant at Affine

Statistical Model Lifecycle Management

Organizations have realized quantum leaps in business outcomes through the institutionalization of data-driven decision making. Predictive Analytics, powered by the robustness of statistical techniques, is one of the key tools data scientists leverage to gain insight into probabilistic future trends. Various mathematical models form the DNA of Predictive Analytics.

A typical model development process includes identifying factors/drivers; hunting, cleaning, and transforming data; development; validation (both business and statistical); and finally productionization. In the production phase, as actual data enters the model environment, the true accuracy of the model is measured. Quite often there are gaps (errors) between predicted and actual numbers. Business teams have their own heuristic definitions of and benchmarks for this gap, and any deviation triggers a forage for additional features/variables and data sources, ultimately resulting in rebuilding the model.
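To make this concrete, a minimal monitoring sketch might compute a gap metric such as MAPE and compare it against a business-defined benchmark. The 15% threshold and the numbers below are assumptions for illustration, not a universal standard.

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error between actual and predicted values."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs((actual - predicted) / actual)) * 100

BENCHMARK_MAPE = 15.0  # hypothetical business benchmark for the acceptable gap

actuals = [120, 98, 143, 110]      # actuals observed in production (dummy values)
forecasts = [115, 105, 150, 100]   # model predictions for the same periods

gap = mape(actuals, forecasts)
if gap > BENCHMARK_MAPE:
    print(f"MAPE {gap:.1f}% exceeds benchmark; flag the model for a rebuild.")
else:
    print(f"MAPE {gap:.1f}% is within tolerance; keep the model in production.")
```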

Sourav Mazumdar
Manager Client Delivery at Affine Analytics

Analytics For Non-Profit Organisations

Analytics has been growing at a rapid pace across the world. Well-established companies have realized the importance of analytics in their business, where it informs the crucial decisions that drive their revenue. But why should only well-established corporates leverage this statistical and computational modus operandi when it can also be applied in an arena where it is much needed?

The idea is to put analytics to use for non-profit social organizations and provide a breakthrough. These are organizations that strive for the upliftment of society by taking on social responsibilities, covering a wide variety of causes such as education, health, food, and shelter.

Rajasekaran Badrinarayanan
Business Analyst at Affine Analytics

Changing Business Requirements In Demand Forecasting

Affine recently completed six years, and I have been a part of it for about three of those. As an analytics firm, the most common business problem we have come across is forecasting consumer demand. This is particularly true for Retail and CPG clients.

Over the last few years, we have dealt with simple forecasting problems for which we could use basic time-series techniques like ARIMA and ARIMAX, or even linear regression; these were forecasts at the organization level or for specific business divisions. But over the years we have seen a distinct shift in focus across our clients towards forecasts at a more granular level, sometimes for specific items. These forecasts are difficult to attain using simple techniques. This is where more sophisticated, complex machine learning techniques such as Random Forest and XGBoost come into play.
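As an illustrative sketch (not client code), the contrast might look like the following: a simple ARIMA fit for an aggregate series versus a gradient-boosted model on engineered lag features for item-level forecasts. The data and feature names are placeholders.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from xgboost import XGBRegressor

# Aggregate-level demand: a simple ARIMA often suffices (dummy weekly series).
sales = pd.Series(100 + 10 * np.sin(np.arange(104) / 8) + np.random.randn(104))
arima_forecast = ARIMA(sales, order=(1, 1, 1)).fit().forecast(steps=4)

# Granular demand: engineered features feed a gradient-boosted model.
df = pd.DataFrame({
    "lag_1": sales.shift(1),          # last week's demand
    "lag_52": sales.shift(52),        # same week last year
    "week_of_year": np.arange(104) % 52,
}).dropna()
X, y = df, sales.loc[df.index]
model = XGBRegressor(n_estimators=200, max_depth=4).fit(X, y)
item_forecast = model.predict(X.tail(4))
```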

Priyankar Sengupta
Manager at Affine Analytics

The Evolution of Data Analytics – Then, Now and Later

Loosely defined, data science is the analysis of a business's data to produce actionable insights and recommendations for that business. The simplicity or complexity of the analysis, aka the level of "Data Science Sophistication", also affects the quality and accuracy of the results. This sophistication is essentially a function of three main components: technological skills, math/stats skills, and the business acumen needed to define and deliver a relevant business solution. These three pillars have been the mainstay of data science ever since businesses began embracing it over the past two decades, and they should remain so in the future. What has changed, and will keep changing, is the underlying R&D in technology and statistical techniques. I have not witnessed many other industries where skills become obsolete at such a fast rate. Data science is unique in requiring data scientists and consulting firms to constantly update their skills and be forward-looking in adopting new and upcoming ones. This article is an attempt to look at how the tools and technology of data science have evolved over the past few decades, and more importantly, what the future holds for this fascinating, tech- and innovation-driven field.

Vineet Kumar
Co-founder at Affine

Demand Forecasting – Doing It The Right Way

Importance Of Predicting Demand For Organizations

In a world of extreme competition, with expense reduction the mantra for most organizations, companies, primarily in the Retail and CPG industries, focus on cutting costs and maintaining optimum inventory levels to gain a competitive edge. To accomplish this, forecasting demand is of utmost importance, and a macro-level sales forecast for the entire organization is not enough.

An efficient and accurate demand forecast enables organizations to anticipate demand and allocate the optimal amount of resources to minimize stagnant inventory. This results in negligible wastage of resources as well as reductions in costs such as storage and transportation. Another benefit of accurate demand prediction is the prevention of shrinkage, so that firms don't have to give huge discounts to clear stock.

This excerpt will touch upon the steps in demand forecasting and briefly discuss the different demand forecasting methods, ending with some of the challenges involved.

Shuddhashil Mullick
Senior Delivery Manager at Affine Analytics

Deep Learning Demystified 2: Dive deep into Convolutional Neural Networks

The photo above was not created by a specialized app or Photoshop. It was generated by a deep learning algorithm that uses convolutional networks to learn artistic features from various paintings and transforms any photo to depict how an artist might have painted it.

Convolutional Neural Networks have become part of state-of-the-art solutions in areas like:

  1. Image recognition
  2. Object recognition
  3. Self-driving cars (identifying pedestrians and objects)
  4. Emotion recognition
  5. Natural Language Processing

A few days back, Google surprised me with a video called Smiles 2016, which stitched together all my photos from 2016 with family, friends, and colleagues in which everyone was smiling. Emotion recognition at work. In this blog, we will discuss a couple of deep learning architectures that power these applications.

Before we dive into CNNs, let's try to understand why a feed-forward neural network is not the right tool. According to the universality theorem, which we discussed in the previous blog, any network can approximate a function just by adding neurons (functions), but there are no guarantees about when it will reach the optimal solution. Feed-forward neural networks flatten images into a single vector, losing all the spatial information that comes with an image. So for problems where spatial features matter, CNNs tend to achieve higher accuracy in far less time than feed-forward neural networks.

Before we dive into what a Convolutional Neural Network is, let's get comfortable with the nuts and bolts that form it.

Images

Let's take a look at how a computer sees an image.

[Figure: what we see versus what a computer sees: a photo alongside its matrix of pixel values]

A computer sees images and videos as matrices of numbers. A common way of representing an image in computer vision is as a matrix of dimensions Width * Height * Channels, where the channels are Red, Green, and Blue (sometimes an alpha channel is included as well).
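You can see this for yourself in a couple of lines, assuming Pillow and NumPy are installed; "photo.jpg" is a placeholder path.

```python
import numpy as np
from PIL import Image

img = np.asarray(Image.open("photo.jpg"))
print(img.shape)  # e.g. (height, width, 3): one number per Red, Green, Blue channel
print(img[0, 0])  # the top-left pixel as three numbers, e.g. [147 213 255]
```

Note that many libraries store the height dimension first, but the idea is the same: the image is just a grid of channel values.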

Filters

Filters are small matrices of numbers, usually of size 3*3*3 (width, height, channels) or 7*7*3. Filters perform operations such as blurring, sharpening, or outlining on a given image. Historically, these filters were carefully hand-picked to extract various features of an image. In a CNN, the filters are created automatically using a combination of techniques like gradient descent and backpropagation. Filters are moved across an image from the top left to the bottom right to capture all the essential features. They are also called kernels in neural networks.
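For instance, here is a classic hand-picked sharpen kernel applied to a dummy grayscale image, exactly the kind of filter a CNN now learns on its own. This sketch assumes SciPy is available.

```python
import numpy as np
from scipy.ndimage import convolve

# A classic hand-picked 3*3 sharpen kernel: boosts the center pixel
# relative to its neighbors.
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]])

gray = np.random.randint(0, 256, (8, 8)).astype(float)  # dummy grayscale image
sharpened = convolve(gray, sharpen)  # slide the kernel over every pixel
```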

Convolution

In a convolution layer, we convolve a filter with patches across an image. For example, the left-hand side of the image below is a matrix representation of a dummy image, the middle is the filter or kernel, and the right side shows the output of the convolution layer. Look at the formula in the image to understand how the kernel and a patch of the image are combined to form a new pixel.

[Figure: the first pixel of the output image being calculated]

Let's see another example of how the next pixel in the output is generated.

[Figure: the second pixel of the output image being calculated]
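To make the arithmetic concrete, here is a minimal sketch of a "valid" convolution with explicit loops. (Deep learning libraries actually compute cross-correlation, skipping the kernel flip; we do the same here, and in practice this is vectorized.)

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image; each output pixel is the sum of
    elementwise products of the kernel with the patch beneath it."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.arange(16).reshape(4, 4).astype(float)  # dummy 4x4 image
kernel = np.array([[1., 0.], [0., -1.]])
print(conv2d_valid(image, kernel))  # 3x3 output: (4-2+1) by (4-2+1)
```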

Max-Pooling

Max pooling is used to reduce dimensionality by down-sampling an input. The best way to understand max pooling is through an example; the image below shows what a 2*2 max-pooling layer does.

[Figure: a 2*2 max-pooling operation]

In the examples for both convolution and max-pooling, the images show only a couple of pixels, but in reality the same operation is applied across the entire image.
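A minimal sketch of 2*2 max pooling with stride 2, assuming a single-channel input:

```python
import numpy as np

def max_pool_2x2(x):
    """Keep the maximum of each non-overlapping 2x2 block, halving each dimension."""
    h, w = x.shape
    trimmed = x[:h - h % 2, :w - w % 2]  # drop a ragged edge row/column if any
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.array([[1, 3, 2, 4],
              [5, 6, 1, 0],
              [7, 2, 9, 8],
              [3, 4, 6, 5]])
print(max_pool_2x2(x))  # [[6 4], [7 9]]
```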

Now, with an understanding of all the important components, let's take a look at what a Convolutional Neural Network looks like.

[Figure: an example CNN architecture, as used in Stanford's CNN classes]

As you can see from the image above, a CNN is a stack of layers. The architecture shown can be depicted simply as [CONV-RELU-CONV-RELU-POOL] * 3 followed by a fully connected layer.
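As a hedged sketch in Keras, that stack might look like this; the filter counts, input shape, and class count are illustrative choices, not the exact network from the Stanford class.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Illustrative [CONV-RELU-CONV-RELU-POOL] * 3 + fully connected stack.
model = keras.Sequential()
model.add(keras.Input(shape=(32, 32, 3)))       # placeholder input size
for filters in (32, 64, 128):
    model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
    model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
    model.add(layers.MaxPooling2D(2))           # halve spatial dimensions
model.add(layers.Flatten())
model.add(layers.Dense(10, activation="softmax"))  # e.g. 10 output classes
```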

Challenges

Convolutional Neural Networks need huge amounts of labeled data and lots of computational power to train. Architectures like AlexNet, ZF Net, VGG Net, GoogLeNet, and Microsoft ResNet typically take weeks of training to achieve state-of-the-art performance. Does that mean an organization without huge volumes of data and computational power cannot take advantage of them? The answer is no.

Transfer learning to the rescue

Most winners of the ILSVRC (ImageNet Large Scale Visual Recognition Challenge) have open-sourced their architectures and the weights associated with these networks. It turns out that most of the weights, particularly those of the filters, can be reused after fine-tuning for domain-specific problems. So to take advantage of these convolutional neural networks, all we need to do is re-train the last few layers of the network, which in general requires very little data and computation. For several of our scenarios, we were able to train models with state-of-the-art performance on GPU machines in minutes to hours.
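A minimal transfer-learning sketch, assuming Keras with its bundled ImageNet-trained VGG16 weights; the five output classes are a placeholder for a domain-specific problem.

```python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.applications import VGG16

# Reuse ImageNet-trained convolutional filters; train only a new head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pre-trained filters

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(5, activation="softmax"),  # 5 domain-specific classes (placeholder)
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
# model.fit(train_images, train_labels, epochs=5)  # a small dataset often suffices
```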

Conclusion

Apart from use cases like image recognition, CNNs are widely used in network topologies for object recognition (locating objects within images), GANs (a recent breakthrough in helping computers create realistic images), converting low-resolution images to high-resolution ones, and revolutionizing healthcare through various forms of cancer detection, among many others. In recent months, CNN architectures built for NLP have also achieved state-of-the-art results.

Vishnu Subramanian
Practice Head - BigData & Advanced Analytics at Affine

HowStat – Application of Data Science in Cricket

Data science helps us extract knowledge and insights from data, whether structured or unstructured, using scientific methods such as mathematical and statistical models. Over the last two decades, it has become one of the most popular fields, riding the rise of big data technologies. Companies such as Amazon, Netflix, and Google Play use recommendation engines to promote products and suggestions in line with users' interests. Many other applications, such as image recognition, gaming, and airline route planning, also involve big data and data science.

Sports is another field that uses data science extensively to improve strategies and predict match outcomes. Cricket is a sport that gives machine learning quite a large outfield to dive into: it can go a long way towards suggesting optimal strategies for a team to win a match, or for a franchise to bid on a valuable player.
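As a toy illustration of the idea (the features and data are invented for this sketch, not drawn from a real model), a match-outcome predictor might start as simply as:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Dummy match features: toss won, home ground, average run-rate difference.
X = np.array([[1, 1,  0.8],
              [0, 0, -0.5],
              [1, 0,  0.2],
              [0, 1, -0.1]])
y = np.array([1, 0, 1, 0])  # 1 = match won

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([[1, 1, 0.4]])[0, 1])  # estimated win probability
```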

Suvajit Sen
Senior Business Analyst at Affine Analytics

Intrusion/Threat Detection Systems

Recently, while reading up on cyber security and threat detection, I came across the annual Data Breach Report by Verizon. The report analyzed thousands of incidents reported by companies and public and private organizations over the last couple of years, breaking breaches down by firmographics, geography, industry, and so on, and found that cyber intrusion is a growing threat to every industry in every country of the world. The report shows time and again that no single industry or organization in the world is safe from cyber threats. This piqued my curiosity, and we felt we could use all the goodness of data science to tackle this problem effectively. I designed a threat/intrusion detection system that could detect such data leaks and breaches and take preventive action to contain, if not stop, the damage from a breach.
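The post doesn't describe the system's internals, but as a hedged sketch, an anomaly-based detector over network-flow features might look like this; the features, data, and contamination rate are illustrative assumptions, not the author's design.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative flow features: bytes sent, session duration, failed logins.
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=[500, 30, 0], scale=[100, 10, 0.5],
                            size=(1000, 3))

# Fit on (mostly) normal traffic; flag events that look unlike it.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

new_events = np.array([[520, 28, 0],       # looks like normal traffic
                       [50_000, 2, 12]])   # huge transfer plus many failed logins
print(detector.predict(new_events))        # -1 marks a suspected intrusion
```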

Aayush Agrawal
A data science enthusiast and an entrepreneur working on Marketing ROI Measurement using IoT & Big Data solutions.