🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 77 (from laksa150)
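The shard is presumably derived from a hash of the URL; the cluster name "laksa150" suggests a 150-shard layout, though neither the hash function nor the modulus is shown here. A minimal sketch of the usual hash-mod scheme (illustrative only; `shard_for` and its inputs are hypothetical):

```python
def shard_for(url_hash: int, shard_count: int) -> int:
    """Map a URL hash onto one of `shard_count` shards (hypothetical scheme)."""
    return url_hash % shard_count

# With a made-up hash value and an assumed 150-shard cluster:
shard = shard_for(0xDEADBEEF, 150)
```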

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

📄 INDEXABLE · CRAWLED (3 months ago)
🤖 ROBOTS ALLOWED

Page Info Filters

Filter        Status  Condition                                           Details
HTTP status   PASS    download_http_code = 200                            HTTP 200
Age cutoff    PASS    download_stamp > now() - 6 MONTH                    3.8 months ago
History drop  PASS    isNull(history_drop_reason)                         No drop reason
Spam/ban      PASS    fh_dont_index != 1 AND ml_spam_score = 0            ml_spam_score=0
Canonical     PASS    meta_canonical IS NULL OR = '' OR = src_unparsed    Not set
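The five filters above amount to a conjunction over a handful of row fields. A sketch of that check, with the field names taken from the table (the row layout and the day-based approximation of the 6 MONTH interval are assumptions):

```python
from datetime import datetime, timedelta

def is_indexable(row: dict, now: datetime) -> bool:
    """Evaluate the five page-info filters as one conjunction (sketch;
    the row structure is assumed from the column names in the table)."""
    return (
        row["download_http_code"] == 200                              # HTTP status
        and row["download_stamp"] > now - timedelta(days=6 * 30)      # age cutoff (~6 MONTH)
        and row["history_drop_reason"] is None                        # history drop
        and row["fh_dont_index"] != 1 and row["ml_spam_score"] == 0   # spam/ban
        and row["meta_canonical"] in (None, "", row["src_unparsed"])  # canonical
    )
```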

Page Details

URL: https://medium.com/@ai_academy/time-series-forecasting-lesson-1-exponential-smoothing-eb55ecbb679d
Last Crawled: 2025-12-16 18:27:24 (3 months ago)
First Indexed: not set
HTTP Status Code: 200
Meta Title: Time Series Forecasting Lesson 1: Exponential Smoothing | by Debasish Dutta | Medium
Meta Description: Time Series Forecasting Lesson 1: Exponential Smoothing Time Series Forecasting: Time Series is a time stamped data. Time-series forecasting assumes that a time series is a combination of a pattern …
Meta Canonical: null
Boilerpipe Text
Time Series Forecasting: A time series is time-stamped data. Time-series forecasting assumes that a time series is a combination of a pattern and some random error. The goal is to separate the pattern from the error by understanding the pattern's trend (its long-term increase or decrease) and its seasonality (the change caused by seasonal factors such as fluctuations in use and demand). Several methods of time series forecasting are available, such as the Moving Averages method, Linear Regression with Time, and Exponential Smoothing.

Types of Forecasting Models:

1. Explanatory Models: Explanatory models assume that the variable to be forecast exhibits an explanatory relationship with one or more other variables. For example, we may model the Policy Count (PC) of a particular agent as PC = f(Type of Industry, Geography, past experience, error). The relationship is not exact: the "error" term on the right allows for random variation and the effects of relevant variables not included in the model. Models in this class include regression models, additive models, and some kinds of neural networks. The purpose of an explanatory model is to describe the form of the relationship and use it to forecast future values of the forecast variable. Under this model, any change in inputs will affect the output of the system in a predictable way, assuming that the explanatory relationship does not change.

2. Time Series Models: In contrast, time series forecasting uses only information on the variable to be forecast and makes no attempt to discover the factors affecting its behavior. For example, PC_{t+1} = f(PC_t, PC_{t-1}, PC_{t-2}, PC_{t-3}, …, error), where t is the present week, t+1 is the next week, t-1 is the previous week, t-2 is two weeks ago, and so on. Here, prediction of the future is based on past values of the variable and/or past errors, but not on explanatory variables which may affect the system.
Time series models used for forecasting include ARIMA models, exponential smoothing, and structural models.

Components of Time Series:

1. Trend: Trend is a long-term movement in a time series. It is the underlying direction (an upward or downward tendency) and rate of change in a time series.

2. Seasonality: Seasonality is the tendency of time-series data to exhibit behavior that repeats itself every L periods. The term season represents the period of time before behavior begins to repeat itself; L is therefore the season length in periods.

3. Cyclicity: When data shows some repetition of pattern, but not at fixed intervals, it is called cyclicity. Cyclicity can be defined as long wave swings, whereas seasonality is generally defined as annual periodicity within a time series. Cycles involve deviations from trends or equilibrium levels.

Smoothing Techniques: Smoothing techniques are used to reduce irregularities (random fluctuations) in time series data. They provide a clearer view of the true underlying behavior of the series. In some time series, seasonal variation is so strong that it obscures trends or cycles which are important for understanding the process being observed. Smoothing can remove seasonality and make long-term fluctuations in the series stand out more clearly.

1. Moving Averages: A moving average is simply a numerical average of N data points. There are several types of moving average techniques:

a. Simple Moving Average: It requires an odd number of observations for the middle value to be averaged. It takes a fixed count of past observations and calculates their numerical average. Formula: MA_{t+1} = (Y_t + Y_{t-1} + … + Y_{t-n+1}) / n.

b. Double Moving Average: The simple moving average method is intended for data with no trend. If the trend in a series is significant, an SMA will be misleading, and the double moving average gives better results.
In a double moving average, we treat the averages calculated by the simple moving average as data points and average them again.

c. Centered Moving Average: As mentioned, a simple moving average needs an odd count of observations. If we need a moving average over an even number of observations at a time, we use this methodology instead; it is specified as an "m × n MA".

d. Weighted Moving Average: In weighted moving averages, we apply weights to past observations in order to smooth the data. A weighted MA(3) can be expressed as Weighted MA(3) = w1·Y_t + w2·Y_{t-1} + w3·Y_{t-2}. One way of choosing the weights is w1 = 3/(1+2+3) = 3/6, w2 = 2/6, and w3 = 1/6.

2. Exponential Smoothing Methods: Exponential smoothing is a procedure for assigning exponentially decreasing weights as the observations get older. In other words, recent observations are given relatively more weight in forecasting than older observations.

a. Simple Exponential Smoothing Method: Simple smoothing is used for short-range forecasting, usually just one month into the future. The model assumes that the data fluctuates around a reasonably stable mean (no trend or consistent pattern of growth). When applied recursively to each successive observation in the series, each new smoothed value (forecast) is computed as the weighted average of the current observation and the previous smoothed observation; the previous smoothed observation was in turn computed from the previous observed value and the smoothed value before it, and so on. Formally, the exponential smoothing equation is

F_{t+1} = a·y_t + (1 − a)·F_t

where F_{t+1} is the forecast for the next period, F_t is the old forecast for period t, y_t is the observed value of the series in period t, and a is the smoothing constant. The selection of alpha (which varies from 0 to 1) has considerable impact on the forecasts. If stable predictions with smoothed random variation are desired, then a small value of alpha is preferred.
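The smoothing recursion just described is short to implement, assuming the standard form F_{t+1} = a·y_t + (1 − a)·F_t, which is consistent with the definitions in the text (the sample data is invented):

```python
def ses(y: list[float], alpha: float) -> list[float]:
    """Simple exponential smoothing, F_{t+1} = alpha*y_t + (1 - alpha)*F_t.
    Returns the one-step-ahead forecasts, initialized with the first
    observation (one common choice of starting forecast)."""
    f = [y[0]]
    for t in range(1, len(y)):
        f.append(alpha * y[t - 1] + (1 - alpha) * f[t - 1])
    return f

# With alpha = 1 the forecast is just the previous observation (naive forecast):
ses([10.0, 20.0, 30.0], alpha=1.0)  # -> [10.0, 10.0, 20.0]
```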
If a rapid response to a real change in the pattern of observations is desired, a large value of alpha is appropriate. In general, we try several values of alpha (0.1, 0.2, 0.3, …, 0.9) and build a grid of RMSE values; the alpha corresponding to the minimum RMSE is selected as the final weight. Once alpha is selected, the forecast must be initialized, which can be done in two ways: either the first value of Y is taken as the initial value, or we take the average of the first four or five values.

b. Double Exponential Smoothing Method (Holt) / Holt's Exponential Smoothing: This method is used when the data shows a trend. Exponential smoothing with a trend works much like simple smoothing, except that two components must be updated each period: level and trend. The level is a smoothed estimate of the value of the data at the end of each period; the trend is a smoothed estimate of average growth at the end of each period.

The exponentially smoothed series, or current level estimate:
L_t = a·y_t + (1 − a)·(L_{t−1} + b_{t−1})

The trend estimate:
b_t = b·(L_t − L_{t−1}) + (1 − b)·b_{t−1}

Forecast m periods into the future:
F_{t+m} = L_t + m·b_t

where L_t is the estimate of the level of the series at time t, a is the smoothing constant for the data, y_t is the new observation or actual value of the series in period t, b is the smoothing constant for the trend estimate, b_t is the estimate of the slope of the series at time t, and m is the number of periods to be forecast into the future. The weights alpha and beta can be selected by gridding, as explained above, to minimize a measure of forecast error such as RMSE. Large weights result in more rapid changes in the component; small weights result in less rapid changes.

c. Triple Exponential Smoothing Method (Winter) / Winter's Exponential Smoothing: This method is used when the data shows both trend and seasonality. To handle seasonality, we add a third parameter and a third update equation. The resulting set of equations is called the "Holt-Winters" method, after the names of its inventors.
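Holt's two-component scheme can be sketched directly, assuming the standard updates L_t = a·y_t + (1 − a)(L_{t−1} + b_{t−1}) and b_t = b·(L_t − L_{t−1}) + (1 − b)·b_{t−1} with forecast F_{t+m} = L_t + m·b_t; the level/trend initialization below is an assumption, not taken from the text:

```python
def holt(y: list[float], alpha: float, beta: float, m: int = 1) -> float:
    """Double exponential smoothing (Holt): run the level/trend updates
    over the series and return the m-step-ahead forecast L_t + m*b_t."""
    level, trend = y[0], y[1] - y[0]  # initialize from first value and first difference
    for obs in y[1:]:
        prev_level = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + m * trend

# On perfectly linear data the method recovers the line exactly:
holt([1.0, 2.0, 3.0, 4.0, 5.0], alpha=0.5, beta=0.5, m=1)  # -> 6.0
```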
There are two main Holt-Winters models, depending on the type of seasonality: the Multiplicative Seasonal Model and the Additive Seasonal Model. There are three update equations in all (multiplicative form shown):

Level: L_t = a·(y_t / S_{t−s}) + (1 − a)·(L_{t−1} + b_{t−1})
Trend: b_t = b·(L_t − L_{t−1}) + (1 − b)·b_{t−1}
Seasonality: S_t = g·(y_t / L_t) + (1 − g)·S_{t−s}
Forecast m periods into the future: F_{t+m} = (L_t + m·b_t)·S_{t−s+m}

where L_t is the level of the series, a is the smoothing constant for the data, y_t is the new observation or actual value in period t, b is the smoothing constant for the trend estimate, b_t is the trend estimate, g is the smoothing constant for the seasonality estimate, S_t is the seasonal component estimate, m is the number of periods in the forecast lead period, s is the length of seasonality (number of periods in the season), and F_{t+m} is the forecast for m periods into the future.

Classical Decomposition of Time Series: Classical decomposition is a relatively simple procedure and forms the basis for most other methods of time series decomposition. There are two forms: an additive decomposition and a multiplicative decomposition. Both are described below for a time series with seasonal period m (e.g., m=4 for quarterly data, m=12 for monthly data, m=7 for daily data with a weekly pattern).

Additive decomposition:
1. If m is an even number, compute the trend-cycle component using a 2×m-MA to obtain T^t. If m is an odd number, compute the trend-cycle component using an m-MA to obtain T^t.
2. Calculate the detrended series: y_t − T^t.
3. To estimate the seasonal component for each month, simply average the detrended values for that month. For example, the seasonal index for March is the average of all the detrended March values in the data. These seasonal indexes are then adjusted to ensure that they add to zero. The seasonal component is obtained by stringing together the seasonal indices for each year of data. This gives S^t.
4. The remainder component is calculated by subtracting the estimated seasonal and trend-cycle components: E^t = y_t − T^t − S^t.

Multiplicative decomposition:
1. If m is an even number, compute the trend-cycle component using a 2×m-MA to obtain T^t. If m is an odd number, compute the trend-cycle component using an m-MA to obtain T^t.
2. Calculate the detrended series: y_t / T^t.
3. To estimate the seasonal component for each month, simply average the detrended values for that month. These seasonal indexes are then adjusted to ensure that they add to m. The seasonal component is obtained by stringing together the seasonal indices for each year of data. This gives S^t.
4. The remainder component is calculated by dividing out the estimated seasonal and trend-cycle components: E^t = y_t / (T^t·S^t).

While classical decomposition is still widely used, it is not recommended. Some of its problems are summarized below. The estimate of the trend is unavailable for the first few and last few observations; for example, if m=12, there is no trend estimate for the first six and last six observations, and consequently no estimate of the remainder component for those periods either. Classical decomposition methods also assume that the seasonal component repeats from year to year. For many series this is a reasonable assumption, but for some longer series it is not: electricity demand patterns, for example, have changed over time as air conditioning has become more widespread, so in many locations the seasonal usage pattern from several decades ago had maximum demand in winter (due to heating), while the current pattern has maximum demand in summer (due to air conditioning). Classical decomposition is unable to capture such seasonal changes over time, and it is not robust to outlier values.
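The triple-smoothing update described earlier can be sketched as follows, assuming the standard multiplicative Holt-Winters form (level L_t = a·y_t/S_{t−s} + (1 − a)(L_{t−1} + b_{t−1}), trend as in Holt, seasonality S_t = g·y_t/L_t + (1 − g)·S_{t−s}); the first-season initialization is an assumption, and production implementations use more careful schemes:

```python
def holt_winters_mult(y: list[float], s: int, alpha: float, beta: float,
                      gamma: float, m: int = 1) -> float:
    """Triple exponential smoothing (multiplicative Holt-Winters), a sketch.
    s is the season length; returns the forecast m periods ahead,
    F_{t+m} = (L_t + m*b_t) * S_{t-s+m}. Requires len(y) >= 2*s."""
    # Crude initialization from the first two seasons (an assumption):
    level = sum(y[:s]) / s
    trend = (sum(y[s:2 * s]) - sum(y[:s])) / s ** 2
    season = [y[i] / level for i in range(s)]
    for t in range(s, len(y)):
        prev_level = level
        level = alpha * y[t] / season[t - s] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season.append(gamma * y[t] / level + (1 - gamma) * season[t - s])
    return (level + m * trend) * season[len(y) - s + (m - 1) % s]
```

On a flat series with a clean period-2 seasonal swing, the forecast reproduces the alternating pattern.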
Markdown
# Time Series Forecasting Lesson 1: Exponential Smoothing

[Debasish Dutta](https://medium.com/@ai_academy) · 9 min read · Oct 8, 2025

Tags: Time Series Analysis, Time Series Forecasting, Exponential Smoothing, Holt Winters
Readable Markdown: null
Shard: 77 (laksa)
Root Hash: 13179037029838926277
Unparsed URL: com,medium!/@ai_academy/time-series-forecasting-lesson-1-exponential-smoothing-eb55ecbb679d s443
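The Unparsed URL looks like a reversed-host row key: host labels reversed and comma-joined, "!" before the path, and a trailing " s443" that presumably encodes HTTPS on port 443. That reading is inferred from this single example; a hypothetical conversion under those assumptions:

```python
from urllib.parse import urlsplit

def unparsed_key(url: str) -> str:
    """Build a reversed-host row key like 'com,medium!/path s443'.
    The key format is inferred from one example and is an assumption."""
    parts = urlsplit(url)
    host = ",".join(reversed(parts.hostname.split(".")))
    suffix = " s443" if parts.scheme == "https" else ""
    return f"{host}!{parts.path}{suffix}"
```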