Dynamic Pricing: A Deep Dive into Our Model

As part of our series on conducting dynamic pricing using Sigma and Databricks, we want to give more technical readers a deeper dive into how the model is configured within Databricks. In our first blog, we discussed the pervasiveness of dynamic pricing across industries, from the fluctuating cost of airline tickets to surge pricing on rideshare apps. Dynamic pricing can boost revenue and prevent supply chain issues, ensuring products are priced appropriately to reduce holding costs and avoid stockouts.

With new features and functionality in Databricks and Sigma, exposing dynamic pricing models to business users has never been easier.

In this blog, we will cover the more technical aspects of our dynamic pricing model, including:

  1. Architecture Overview
  2. Modeling Price Elasticity: Unlocking the Power of Causal Inference
  3. Generating Sales Volume Forecasts Using Deep Learning
  4. Deploying Our Custom Model Using Databricks Serving Endpoints
  5. Creating a UDF to Call AI_QUERY

Architecture Overview

At the heart of any dynamic pricing model is clean sales data, which we have processed and stored in the table SALES_DATA. This data is fed into two models: the Causal Price Elasticity Model and the Deep Learning Forecast Model.

The Causal Price Elasticity Model estimates the price elasticity for each product and saves the results to a table in a volume called PRODUCT_ELASTICITIES. Simultaneously, the Deep Learning Forecast Model produces an unbiased seven-day baseline forecast for each product. Finally, both models are combined in our Dynamic Pricing Model Endpoint, where the unbiased forecasts are adjusted based on both the requested price discount and the estimated product price elasticity.

Figure 1: Databricks Dynamic Pricing Model Architecture

Modeling Price Elasticity: Unlocking the Power of Causal Inference

The classic method of computing the price elasticity of demand is to run a price test. But what happens when a price test is too costly or time-consuming? We can use advances in causal inference to uncover the true price elasticity of our products in the face of confounding variables.

A confounding variable is one that influences both price and demand; when ignored, it causes our models to produce biased estimates of elasticity.

Imagine this situation: You work for an ice cream company and want to isolate the effect of promotions on ice cream sales. The problem is that promotional activity increases in the summer while seasonal demand for ice cream also increases. Using causal inference models like Double Machine Learning (DML), we can leverage the natural variation in the data to find the causal impact of promotions on demand.

In our demo, our confounding variable was quality. Different products were produced at varying levels of quality, which influenced their price elasticity.

Code Block 1: Elasticity model, where log_y is the log of sales, log_T is the log of price, X are our product attributes, and W are seasonality features
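As a minimal sketch of this setup, the elasticity model might look like the following. We assume econml's LinearDML here because its y/T/X/W interface matches the variables named above; the library choice, column names, and nuisance models are illustrative, and the original code may differ.

```python
# A minimal sketch of the log-log Double ML elasticity model, assuming the
# econml library; column names here are illustrative, not from the original.
import numpy as np
from econml.dml import LinearDML
from sklearn.ensemble import GradientBoostingRegressor

# `spark` is predefined in Databricks notebooks
sales = spark.table("SALES_DATA").toPandas()

log_y = np.log(sales["units_sold"])        # log of sales volume (outcome)
log_T = np.log(sales["price"])             # log of price (treatment)
X = sales[["quality"]]                     # product attributes that modify elasticity
W = sales[["month", "day_of_week"]]        # seasonality confounders to control for

# In a log-log model, the effect of log price on log sales is the price elasticity.
est = LinearDML(
    model_y=GradientBoostingRegressor(),   # nuisance model: predicts log sales
    model_t=GradientBoostingRegressor(),   # nuisance model: predicts log price
)
est.fit(log_y, log_T, X=X, W=W)

# Per-row elasticity estimates, varying with the attributes in X
elasticities = est.effect(X)
```

Because DML first residualizes both price and sales on the confounders in W, the resulting elasticity estimates are robust to the seasonality and quality effects described above.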

Generating Sales Volume Forecasts Using Deep Learning

In time series forecasting, we seek to understand trends and seasonality in historical data to predict future values (e.g., ice cream sales or the number of airline passengers). While traditional statistical forecasting techniques like ARIMA (Autoregressive Integrated Moving Average) and regression analysis work well in some situations, deep learning techniques tend to outperform them because they adapt better to changes in the market and environment.

For the purposes of our demo, we leveraged a PyTorch neural network model called NHITS (Neural Hierarchical Interpolation for Time Series) to forecast our daily sales by product for the next seven days. The code below illustrates how to train an NHITS model. First, we initialized the NHITS model. Then we configured the NeuralForecast class with the model and fit it to the historical SALES_DATA table. In this model, we used a combination of static and historical exogenous variables, and the model makes its final product sales predictions based on the last 35 days of data.

Code Block 2: NHITS forecasting. If you are interested in using NeuralForecast, here is a link to the documentation.
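A minimal sketch of that training flow, assuming the neuralforecast library's standard long format (unique_id, ds, y); the specific exogenous column names are illustrative assumptions, not the exact features in the original notebook:

```python
# A minimal sketch of the NHITS training flow described above; exogenous
# column names are illustrative assumptions.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# neuralforecast expects long-format data: unique_id (product), ds (date), y (sales)
train_df = spark.table("SALES_DATA").toPandas()
static_df = train_df[["unique_id", "quality"]].drop_duplicates()

model = NHITS(
    h=7,                                # forecast horizon: the next 7 days
    input_size=35,                      # look back over the last 35 days
    stat_exog_list=["quality"],         # static per-product attributes
    hist_exog_list=["promotion_flag"],  # historical exogenous variables
    max_steps=500,
)

nf = NeuralForecast(models=[model], freq="D")
nf.fit(df=train_df, static_df=static_df)
forecast_df = nf.predict()              # 7-day baseline forecast per product
```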

Deploying Our Custom Model Using Databricks Serving Endpoints

By leveraging Databricks serverless compute endpoints, we can ensure that our model’s latency and availability requirements are met. Once the custom model is registered, we can simply deploy it using the model serving UI, giving us a reliable model endpoint that is up and running in less than an hour. In the section below, we will outline the steps to create your custom model endpoint.

1. Create Custom Model Class

We defined a class for our custom model that will be registered in MLflow. This class acts as a wrapper for our dynamic pricing model. In this model class, we have a post-processing method where we adjust the product sales forecast based on the discount. In addition, we defined the desired output format to be a JSON-formatted string so that it is compatible with Sigma's parsing capabilities.

Code Block 3: Pyfunc Model Class
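A minimal sketch of such a wrapper is shown below. The discount adjustment follows our reading of the log-log elasticity model above, and the field names and artifact layout are illustrative assumptions rather than the exact production implementation.

```python
# A minimal sketch of the custom pyfunc wrapper; field names and the exact
# post-processing logic are illustrative assumptions.
import json
import pandas as pd
import mlflow.pyfunc


class DynamicPricingModel(mlflow.pyfunc.PythonModel):
    def load_context(self, context):
        # Load the pre-computed elasticities and 7-day baseline forecasts
        self.elasticities = pd.read_csv(context.artifacts["elasticities"])
        self.forecasts = pd.read_csv(context.artifacts["forecasts"])

    def _adjust(self, baseline, elasticity, discount):
        # Log-log model: demand scales with the price ratio raised to the
        # elasticity; a 10% discount makes the new price 0.9x the old one.
        return baseline * (1 - discount) ** elasticity

    def predict(self, context, model_input: pd.DataFrame):
        results = []
        for _, row in model_input.iterrows():
            pid, discount = row["product_id"], row["discount"]
            elasticity = self.elasticities.loc[
                self.elasticities["product_id"] == pid, "elasticity"].iloc[0]
            baseline = self.forecasts.loc[
                self.forecasts["product_id"] == pid, "forecast"]
            adjusted = [self._adjust(b, elasticity, discount) for b in baseline]
            results.append({"product_id": pid, "adjusted_forecast": adjusted})
        # Return a JSON string so the response is easy to parse in Sigma
        return json.dumps(results)
```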

2. Register the Model Using MLflow

Once we have created the model, we register it in MLflow's Model Registry. This code also logs model artifacts, including the Python package dependencies needed to recreate the model. Since we are creating a custom model, we used the MLflow Pyfunc wrapper, which provides a standardized format for our custom data science model.

Code Block 4: Model Registration in Unity Catalog
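A minimal sketch of the registration step, assuming Unity Catalog as the registry; the catalog, schema, and artifact paths are placeholders:

```python
# A minimal sketch of logging and registering the pyfunc model; the
# three-level model name and the volume path are placeholders.
import mlflow

mlflow.set_registry_uri("databricks-uc")  # use Unity Catalog as the registry

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="dynamic_pricing_model",
        python_model=DynamicPricingModel(),
        artifacts={
            "elasticities": "/Volumes/<catalog>/<schema>/PRODUCT_ELASTICITIES/elasticities.csv",
            "forecasts": "/Volumes/<catalog>/<schema>/forecasts/baseline_forecasts.csv",
        },
        pip_requirements=["pandas", "numpy", "mlflow"],  # needed to recreate the model
        registered_model_name="<catalog>.<schema>.dynamic_pricing_model",
    )
```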

3. Deploy the Model Serving Endpoint via the UI

Using the Model Serving UI, we selected the following:

  • The model endpoint name
  • The location of our registered model in Unity Catalog
  • The model version
  • The compute type as “CPU”
  • The compute scale-out as “Small”
  • The “Scale to Zero” option

Please note that “Scale to Zero” is ideal for demo purposes so you do not incur costs while the endpoint is not in use, but it does require a warmup period for the initial requests. If you want to demo your endpoint live, either send warmup requests in advance of the demo (as sketched below) so it is up and running, or uncheck the “Scale to Zero” option.

Figure 2: Model Serving UI
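If you go the warmup route, a request like the following will spin the endpoint up. This is a minimal sketch: the workspace URL, endpoint name, token, and payload fields are placeholders.

```python
# A minimal sketch of a warmup request against the serving endpoint;
# the host, endpoint name, token, and payload fields are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
ENDPOINT = "dynamic-pricing-endpoint"
TOKEN = "<databricks-personal-access-token>"

payload = {
    "dataframe_records": [
        {"product_id": "SKU_001", "discount": 0.10}
    ]
}

resp = requests.post(
    f"{HOST}/serving-endpoints/{ENDPOINT}/invocations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
print(resp.status_code, resp.text)
```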

Creating a UDF to Call AI_QUERY

AI_QUERY is a powerful Databricks SQL function that enables us to make a request to our custom model endpoint via a SQL query. In Unity Catalog, we created a custom user-defined function (UDF) that wraps AI_QUERY, making the model endpoint easily callable from the Sigma side.

The code below demonstrates how we created a custom UDF that includes AI_QUERY. In the AI_QUERY portion of the code, we specified the model endpoint name, request schema, and response schema.

Code Block 5: Custom UDF
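A minimal sketch of what that UDF might look like, created from a notebook via spark.sql; the function name, request fields, and return type are illustrative assumptions, and the AI_QUERY argument list is worth checking against the documentation for your runtime:

```python
# A minimal sketch of the Unity Catalog UDF wrapping AI_QUERY; the function
# name, request schema, and return type are illustrative assumptions.
spark.sql("""
    CREATE OR REPLACE FUNCTION <catalog>.<schema>.get_price_adjusted_forecast(
        product_id STRING,
        discount DOUBLE
    )
    RETURNS STRING
    RETURN AI_QUERY(
        'dynamic-pricing-endpoint',          -- model endpoint name
        named_struct(
            'product_id', product_id,        -- request schema
            'discount', discount
        ),
        'STRING'                             -- response schema
    )
""")
```

From there, the endpoint can be queried with plain SQL, e.g. SELECT <catalog>.<schema>.get_price_adjusted_forecast('SKU_001', 0.10), which is what the Sigma custom function in the next step wraps.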

Moving over to Sigma, we created a custom function that calls the UDF we just created in Unity Catalog. Utilizing input tables in Sigma, we integrated this function to give end users the ability to query our custom dynamic pricing model.

Figure 3: Custom Function in Sigma

For more information on Sigma and its write-back capabilities, check out our blog post Sigma & Databricks: Connecting Dashboards and Models with User Input Data, which shows how to seamlessly integrate your Sigma dashboards and models with user input.

Supercharge Your Current Pricing Process With Databricks

Reach out to our Aimpoint Digital team, where our experts are equipped to help you navigate your most complex data challenges and build actionable solutions. We partner with your team to accelerate your strategic vision through data and analytics by rapidly developing, refining, and deploying actionable analytics applications like this one.

Authors

Edward Valentine, Lead Data Scientist
Elizabeth Khan, Principal Machine Learning Engineer
