📰 Edition #49 — PYTHON + DEEP LEARNING TIP – Forecasting the Future Price of a Product Using an LSTM Neural Network

🎯 Tip Objective

To demonstrate, in an accessible and robust way, how to use Deep Learning with LSTM neural networks to forecast future prices based on historical data — a powerful technique applicable to the financial, commercial, and strategic context of any company dealing with demand forecasting, pricing, and planning.


🛒 Fictional Scenario

Imagine you work in the finance department of a company that sells the product SuperCoffee 3000 ☕️. This product has daily price fluctuations, and management wants to know: "Based on the behavior of the last 20 days, what should the product cost tomorrow?" Instead of guessing, you decide to use Python + Artificial Intelligence to forecast this value with greater reliability. That’s where Deep Learning with LSTM comes into play.


✅ What does this mean in practice?

  • You have a time series of prices over 300 days.
  • The machine learns the previous price behavior.
  • Then it tries to predict the next day's value based on the last 20.
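The windowing idea above can be sketched in a few lines of plain NumPy — a minimal illustration with a toy 10-day series and a 3-day window (the full script below builds the same structure with its criar_sequencias function):

```python
import numpy as np

# Toy series: 10 "days" of prices, window of 3 days
precos = np.array([100.0, 101.0, 103.0, 102.0, 105.0, 107.0, 106.0, 109.0, 111.0, 110.0])
janela = 3

# Each input X[i] holds `janela` consecutive prices; y[i] is the very next price
X = np.array([precos[i:i + janela] for i in range(len(precos) - janela)])
y = np.array([precos[i + janela] for i in range(len(precos) - janela)])

print(X.shape)               # (7, 3): 7 windows of 3 days each
print(X[0], "->", y[0])      # [100. 101. 103.] -> 102.0
```

The model never sees "dates"; it only sees these windows and learns the mapping from each window to the value that followed it.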


🧠 Deep Learning Concept (For Beginners)

Deep Learning is a type of Artificial Intelligence based on deep neural networks, inspired by the functioning of the human brain. The machine learns behavioral patterns by processing data such as:

  • 📈 Price history
  • 📉 Sales trends
  • 🛍️ Demand seasonality

In this example, we use an LSTM model (Long Short-Term Memory), ideal for working with sequential and temporal data, like prices over time.


✅ Does this script use Deep Learning?

Yes. And here’s why:

  • It builds a neural network with tf.keras.Sequential, TensorFlow's Keras API.
  • Its core layer is tf.keras.layers.LSTM, a recurrent architecture with internal memory designed for sequential data.
  • The network learns by adjusting its weights over 20 training epochs, minimizing the MSE loss via backpropagation.

🧪 How the Script Works – Learning Steps

Below is the complete code with line-by-line comments, ideal for testing in VSCode and presenting to students or a technical team:

# 📦 Importing essential libraries
import numpy as np                     # For array manipulation and random number generation
import pandas as pd                    # To create and manipulate DataFrames
import matplotlib.pyplot as plt        # To generate charts
from sklearn.preprocessing import MinMaxScaler  # For data normalization
import tensorflow as tf                # Main Deep Learning library used here


# 🔧 Function that creates sequences (observation windows) for training the neural network
def criar_sequencias(dados, janela):
    X, y = [], []                      # X = inputs (windows of `janela` days), y = output (next day)
    for i in range(len(dados) - janela):
        X.append(dados[i:i+janela])    # Adds a window of `janela` values
        y.append(dados[i+janela])      # Adds the target value (the day right after the window)
    return np.array(X), np.array(y)    # Returns arrays ready for the neural network
 

# 1️⃣ Generating a synthetic time series with 300 days
np.random.seed(42)                     # Set seed for reproducible results
datas = pd.date_range(start="2022-01-01", periods=300)  # Daily dates starting 2022-01-01
precos = np.cumsum(np.random.randn(300) * 2 + 0.5) + 100  # Random walk with an upward drift
df = pd.DataFrame({'Data': datas, 'Preço': precos}).set_index('Data')  # DataFrame indexed by date


# 2️⃣ Normalizing the data between 0 and 1 (important for stable neural network training)
scaler = MinMaxScaler()
precos_scaled = scaler.fit_transform(df[['Preço']])  # Normalize the 'Preço' column


# 3️⃣ Creating input and output windows for the model (X = 20 days, y = next day)
janela = 20
X, y = criar_sequencias(precos_scaled, janela)       # Apply the window function
divisao = int(0.8 * len(X))                          # 80/20 train/test split point
X_train, X_test = X[:divisao], X[divisao:]           # Training and test inputs
y_train, y_test = y[:divisao], y[divisao:]           # Training and test outputs


# 4️⃣ Defining the LSTM model with TensorFlow/Keras
modelo = tf.keras.Sequential([
    tf.keras.Input(shape=(janela, 1)),               # Input: sequences of 20 daily prices
    tf.keras.layers.LSTM(50),                        # LSTM layer with 50 units
    tf.keras.layers.Dense(1)                         # Output: predict 1 value (the next day)
])
modelo.compile(optimizer='adam', loss='mse')         # Adam optimizer, mean squared error loss
modelo.fit(X_train, y_train, epochs=20, batch_size=16, verbose=1)  # Train for 20 epochs


# 5️⃣ Making predictions on the test data
y_pred = modelo.predict(X_test)                      # Predict using the trained model
y_pred_inv = scaler.inverse_transform(y_pred)        # Predictions back on the original price scale
y_real_inv = scaler.inverse_transform(y_test)        # Actual test values on the original scale


# 6️⃣ Preparing the series for visualization
idx = df.index[-len(y_pred_inv):]                    # Dates corresponding to the test set
serie_predita = pd.Series(y_pred_inv.flatten(), index=idx)  # Predicted values
serie_real = pd.Series(y_real_inv.flatten(), index=idx)     # Real test values


# 7️⃣ Plotting the comparison chart
plt.figure(figsize=(12, 6))
plt.plot(df.index, df['Preço'], label='Original Price')            # Full price history
plt.plot(serie_real.index, serie_real, '--', label='Real (Test)')  # Real test values
plt.plot(serie_predita.index, serie_predita, '-.', label='Predicted (LSTM)')  # LSTM predictions
plt.title("Price Forecast with LSTM")
plt.xlabel("Date")
plt.ylabel("Price")
plt.legend()
plt.grid(True)
plt.tight_layout()

# 💾 Saving the chart in high resolution
plt.savefig("grafico_previsao_lstm.png", dpi=300)
plt.show()
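The script evaluates the model on held-out test days, but the scenario's real question is tomorrow's price. The missing last step is to feed the model the most recent 20 days. A minimal sketch of that step — here `prever` is a hypothetical stand-in for the trained `modelo.predict`, and the min-max scaling is hand-rolled, so the example runs without TensorFlow:

```python
import numpy as np

np.random.seed(42)
precos = np.cumsum(np.random.randn(300) * 2 + 0.5) + 100  # Same synthetic series as the script
janela = 20

# Hand-rolled min-max scaling (the script uses sklearn's MinMaxScaler)
p_min, p_max = precos.min(), precos.max()
precos_scaled = (precos - p_min) / (p_max - p_min)

def prever(entrada):
    # Hypothetical stand-in for modelo.predict (returns the mean of the window).
    # The real LSTM also expects input of shape (1, janela, 1).
    return entrada.mean(axis=1)

# Take the LAST 20 scaled prices and reshape to the (1, 20, 1) batch the LSTM expects
ultima_janela = precos_scaled[-janela:].reshape(1, janela, 1)
amanha_scaled = prever(ultima_janela)                # Shape (1, 1), still on the [0, 1] scale
amanha = amanha_scaled * (p_max - p_min) + p_min     # Invert the scaling by hand
print(f"Forecast for tomorrow: {amanha.item():.2f}")
```

With the real model you would simply replace `prever(ultima_janela)` with `modelo.predict(ultima_janela)` and `scaler.inverse_transform` for the rescaling; the reshaping logic is the part people usually trip over.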

🖼️ Generated Images

(Chart: the full simulated price history with the real test values and the LSTM predictions overlaid — saved as grafico_previsao_lstm.png.)

📊 Chart Analysis

The chart represents the price forecast made by an LSTM model (Long Short-Term Memory), applied to a synthetic time series with an upward trend and controlled noise.


🧪 1. Technical Code Insight – How the Chart is Generated

The final chart is the result of eight steps in the Python code, combining data simulation, preprocessing, Deep Learning modeling, and visualization.


🧩 2. Step-by-Step Breakdown

The numbered comments in the code map to these stages:

  • 1️⃣ Generate a synthetic 300-day price series
  • 2️⃣ Normalize prices to the [0, 1] range
  • 3️⃣ Build 20-day input windows and split 80/20 into train/test
  • 4️⃣ Define the LSTM model (Input → LSTM(50) → Dense(1))
  • 5️⃣ Predict on the test set and invert the scaling
  • 6️⃣ Align predictions and real values with their dates
  • 7️⃣ Plot the comparison chart
  • 💾 Save the figure in high resolution

📈 3. Chart Interpretation

  • 🔵 Blue Line (Original Price): full historical series
  • 🟠 Orange Dashed Line (Real - Test): last 20% used for validation
  • 🟢 Green Dash-Dot Line (Predicted - LSTM): the model's forecast curve


✅ 4. Highlighted Technical Points

  • Prices normalized to [0, 1] with MinMaxScaler before training
  • Chronological 80/20 train/test split with 20-day input windows
  • Adam optimizer and MSE loss, trained for 20 epochs


✅ 5. Overall Trend Successfully Captured

  • The LSTM identifies the main upward trend.
  • The predicted curve follows the overall direction, especially after August 2022.


🧠 6. Natural Smoothing from LSTM

  • The predicted curve shows the smoothing typical of LSTMs: it suppresses noise and prioritizes learned patterns.
  • This makes it great for trend forecasting, but less suited to sharp peaks.


🎯 7. Strong Error Reduction

  • The final training loss (≈ 5e-04, on the normalized data) is consistent with the close visual alignment.
  • Even where the series accelerates, the prediction stays close to the real values.


📊 Learning Accuracy

How much did the machine actually learn?

We can measure it with the R² Score, which tells us how close the forecast is to reality:

from sklearn.metrics import r2_score

r2 = r2_score(y_real_inv, y_pred_inv)            # R² on the original price scale
print(f"Learning Accuracy (R²): {r2*100:.2f}%")


📊 What is “Machine Learning Accuracy”?

In the context of this script:

  • Learning is driven by the loss function: mean squared error (MSE)
  • At each epoch, the model adjusts its weights to reduce that error
  • The R² score, shown here as a percentage, measures how much of the variance in the actual prices the model explains
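Both measures are simple to compute by hand. A tiny numeric example with toy prices (hypothetical numbers, not the script's actual outputs):

```python
import numpy as np

y_real = np.array([100.0, 102.0, 104.0, 106.0])   # Actual prices
y_pred = np.array([101.0, 102.0, 103.0, 107.0])   # Hypothetical predictions

mse = np.mean((y_real - y_pred) ** 2)             # The loss the model minimizes during training

# R² = 1 - (residual variance / total variance); 1.0 would be a perfect fit
ss_res = np.sum((y_real - y_pred) ** 2)
ss_tot = np.sum((y_real - y_real.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"MSE: {mse:.2f}")   # MSE: 0.75
print(f"R²: {r2:.3f}")     # R²: 0.850
```

sklearn's r2_score computes exactly this ratio, which is why it can be read as "percentage of price variance explained".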


🔍 Where Deep Learning Operates in the Code

  • tf.keras.Sequential([...]) — defines the network architecture
  • tf.keras.layers.LSTM(50) — the deep learning core: a recurrent layer with memory cells
  • modelo.compile(...) and modelo.fit(...) — the training loop that adjusts the weights epoch by epoch

🛠️ Practical Applications

  • Dynamic pricing of fast-moving products
  • Promotion planning based on value forecasts
  • Early detection of spikes or drops in strategic items


💡 Final Recommendations

  • LSTM forecasting works well for financial series with regular patterns.
  • For volatile environments (e.g., crypto), consider: adding technical indicators, hybrid models (LSTM + Attention), and tuning the time window.
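One way to tune the time window is to score several candidate sizes on the same chronological hold-out split. A sketch of that loop — using a naive "tomorrow equals today" baseline instead of retraining the LSTM for each window, purely to keep the illustration fast; in practice you would retrain the model inside the loop:

```python
import numpy as np

np.random.seed(42)
precos = np.cumsum(np.random.randn(300) * 2 + 0.5) + 100  # Same synthetic series as the script

def criar_sequencias(dados, janela):
    # Same windowing idea as the main script, on a 1-D array
    X = np.array([dados[i:i + janela] for i in range(len(dados) - janela)])
    y = np.array([dados[i + janela] for i in range(len(dados) - janela)])
    return X, y

resultados = {}
for janela in (5, 10, 20, 40):
    X, y = criar_sequencias(precos, janela)
    divisao = int(0.8 * len(X))
    X_test, y_test = X[divisao:], y[divisao:]
    # Naive baseline: predict the last value of each window
    # (replace this line with a freshly trained LSTM in a real tuning run)
    y_pred = X_test[:, -1]
    resultados[janela] = np.mean((y_test - y_pred) ** 2)  # Test MSE per window size

melhor = min(resultados, key=resultados.get)
print(f"Best window: {melhor} days (test MSE {resultados[melhor]:.2f})")
```

The same skeleton works for tuning other knobs (number of LSTM units, epochs, batch size): keep the split chronological and compare everything on the same held-out days.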


🧩 Final Reflection

You don’t need to master calculus or linear algebra to use Deep Learning. You just need to understand the problem, choose the right approach, and let AI work for you.


📅 Call to Action

Want to turn your graphs into powerful visual tools? Explore more tips like this one:

💼 LinkedIn & Newsletters:

👉 https://www.garudax.id/in/izairton-oliveira-de-vasconcelos-a1916351/

👉 https://www.garudax.id/newsletters/scripts-em-python-produtividad-7287106727202742273

👉 https://www.garudax.id/build-relation/newsletter-follow?entityUrn=7319069038595268608

💼 Company Page:

👉 https://www.garudax.id/company/106356348/

💻 GitHub:

👉 https://github.com/IOVASCON


🏷️ Hashtags

#Python #Matplotlib #TensorFlow #DeepLearning
