📈 Building a Probabilistic Forecast with Linear Regression 👇🏼

There are many ways to model uncertainty in forecasting. Common methods include prediction intervals for a parametric model such as linear regression, and conformal prediction, which works with both parametric and non-parametric models. Those methods are powerful, but sometimes you want to estimate the probability of hitting a specific value in the future. This is where probabilistic forecasting comes in 🎯.

🔍 What is a probabilistic forecast?

A probabilistic forecast provides a distribution of possible future outcomes, not just a single predicted value. Instead of saying “sales next month will be 10,500,” we can say, for example, “there’s a 90% chance sales will fall between 9,800 and 11,300.” This gives decision-makers a clearer picture of risk and uncertainty 💡.

🛠️ How to build a probabilistic forecast with linear regression

Even a simple linear regression can generate rich uncertainty estimates. Here’s the workflow:

1️⃣ Fit a linear regression model. Estimate the relationship between your target variable and predictors. The model produces coefficients along with their standard errors.

2️⃣ Extract the coefficient distributions. Under standard assumptions, regression coefficients are approximately normally distributed around their estimated values. These distributions represent our uncertainty about the true parameter values.

3️⃣ Simulate coefficient draws. Generate many random samples from each coefficient’s distribution and compute a forecast for each draw. This gives you an ensemble of possible forecast paths, each representing one plausible future scenario.

4️⃣ Aggregate the forecast distribution. Once you have many simulated forecasts:
➡️ Take the median or mean as your point forecast
➡️ Use percentiles (e.g., 5th & 95th, or 10th & 90th) to form prediction intervals
These intervals capture both parameter uncertainty and the variability inherent in the data.

Probabilistic forecasting is essential for planning under uncertainty. And the beauty is: you don’t need a complex model. Even linear regression can provide meaningful, simulation-based uncertainty estimates that elevate the quality of your insights.

The screenshot below shows an example of creating a probabilistic forecast for US monthly natural gas demand.

#timeseries #forecasting #datascience
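The four steps above can be sketched in a few lines of numpy (a minimal illustration on simulated data; the series, horizon, and 150-unit threshold are placeholders, not the natural gas example from the screenshot):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy monthly demand series with a trend plus noise (placeholder data).
n = 60
t = np.arange(n)
y = 100 + 0.8 * t + rng.normal(0, 5, n)

# 1) Fit linear regression y = X @ beta by ordinary least squares.
X = np.column_stack([np.ones(n), t])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])      # residual variance
cov_beta = sigma2 * np.linalg.inv(X.T @ X)     # coefficient covariance

# 2-3) Draw coefficients from their approximate normal distribution and
# forecast 12 steps ahead, adding residual noise so the intervals reflect
# both parameter uncertainty and the data's inherent variability.
h, n_sims = 12, 5000
X_fut = np.column_stack([np.ones(h), np.arange(n, n + h)])
betas = rng.multivariate_normal(beta_hat, cov_beta, size=n_sims)
paths = betas @ X_fut.T + rng.normal(0, np.sqrt(sigma2), (n_sims, h))

# 4) Aggregate: median point forecast and a 90% prediction interval.
point = np.median(paths, axis=0)
lo, hi = np.percentile(paths, [5, 95], axis=0)

# Bonus: estimated probability of exceeding a threshold at the last step.
p_above = float((paths[:, -1] > 150).mean())
```

With the simulated paths in hand, any probability question ("chance we exceed X in month 12") reduces to counting paths, which is exactly what makes the approach useful for decision-makers.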
Economic Forecasting Methods
Explore top LinkedIn content from expert professionals.
-
-
Machine learning beats traditional forecasting methods in multi-series forecasting. In one of the latest M forecasting competitions, the aim was to advance what we know about time series forecasting methods and strategies. Competitors had to forecast 40k+ time series representing sales for the largest retail company in the world by revenue: Walmart. These are the main findings:

▶️ Performance of ML Methods: Machine learning (ML) models demonstrate superior accuracy compared to simple statistical methods. Hybrid approaches that combine ML techniques with statistical functionalities often yield effective results. Advanced ML methods, such as LightGBM and deep learning techniques, have shown significant forecasting potential.

▶️ Value of Combining Forecasts: Combining forecasts from various methods enhances accuracy. Even simple, equal-weighted combinations of models can outperform more complex approaches, reaffirming the effectiveness of ensemble strategies.

▶️ Cross-Learning Benefits: Utilizing cross-learning from correlated, hierarchical data improves forecasting accuracy. In short, one model to forecast thousands of time series. This approach allows for more efficient training and reduces computational costs, making it a valuable strategy.

▶️ Differences in Performance: Winning methods often outperform traditional benchmarks significantly. However, many teams may not surpass the performance of simpler methods, indicating that straightforward approaches can still be effective.

▶️ Impact of External Adjustments: Incorporating external adjustments (i.e., data-based insights) can enhance forecast accuracy.

▶️ Importance of Cross-Validation Strategies: Effective cross-validation (CV) strategies are crucial for accurately assessing forecasting methods. Many teams fail to select the best forecasts due to inadequate CV methods. Utilizing extensive validation techniques can ensure robustness.
▶️ Role of Exogenous Variables: Including exogenous/explanatory variables significantly improves forecasting accuracy. Additional data such as promotions and price changes can lead to substantial improvements over models that rely solely on historical data.

Overall, these findings emphasize the effectiveness of ML methods, the value of combining forecasts, and the importance of incorporating external factors and robust validation strategies in forecasting. If you haven’t already, try using machine learning models on your next forecasting challenge 🙂 Read the article 👉 https://buff.ly/3O95gQp
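The "equal-weighted combination" finding is easy to demonstrate. Below is a minimal sketch with three hypothetical model forecasts (the models, biases, and numbers are invented for illustration; they stand in for, say, a statistical, a gradient-boosting, and a deep-learning forecaster):

```python
import numpy as np

h = 12
rng = np.random.default_rng(0)
base = 100 + np.arange(h)            # stand-in for the "true" future path

# Hypothetical forecasts from three models, each with its own bias/noise.
f_stat = base + rng.normal(0, 4, h)  # e.g. an ETS/ARIMA-style model
f_ml = base + rng.normal(2, 3, h)    # e.g. a LightGBM model
f_dl = base + rng.normal(-2, 3, h)   # e.g. a deep-learning model

# Equal-weighted combination: simply average the forecasts per step.
combined = np.mean([f_stat, f_ml, f_dl], axis=0)

def rmse(f):
    return float(np.sqrt(np.mean((f - base) ** 2)))

scores = {"stat": rmse(f_stat), "ml": rmse(f_ml),
          "dl": rmse(f_dl), "combined": rmse(combined)}
```

A small design note: because the combined error is the average of the member errors, the triangle inequality guarantees the combination is never worse than the worst member, and when members make partly offsetting errors it often beats every one of them.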
-
𝗪𝗵𝗮𝘁 𝗶𝗳 𝘆𝗼𝘂 𝗰𝗼𝘂𝗹𝗱 𝗳𝗼𝗿𝗲𝗰𝗮𝘀𝘁 𝘁𝗵𝗲 𝗳𝘂𝘁𝘂𝗿𝗲 𝗯𝘆 𝘁𝗿𝗲𝗮𝘁𝗶𝗻𝗴 𝗻𝘂𝗺𝗯𝗲𝗿𝘀 𝗹𝗶𝗸𝗲 𝘄𝗼𝗿𝗱𝘀? That's exactly what Amazon did with Chronos. They took T5 (yes, the language model) and taught it to "read" time series data. The trick? Tokenize continuous values into ~4096 discrete bins. Suddenly, forecasting becomes next-token prediction. 𝗘𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻: 📍 Chronos (Feb 2024) — Original release 📍 Chronos-Bolt (Nov 2024) — ~250× faster inference 📍 Chronos 2.0 (Oct 2025) — Multivariate support 𝗛𝗼𝘄 𝗶𝘁 𝘄𝗼𝗿𝗸𝘀: 🔹 𝘛5 𝘌𝘯𝘤𝘰𝘥𝘦𝘳-𝘋𝘦𝘤𝘰𝘥𝘦𝘳 — Bidirectional encoder captures dependencies; autoregressive decoder generates multi-step forecasts 🔹 𝘛𝘰𝘬𝘦𝘯𝘪𝘻𝘢𝘵𝘪𝘰𝘯 — Mean-scale values → quantize into bins → regression becomes classification. Now you can use all the LLM tricks. 🔹 𝘗𝘳𝘰𝘣𝘢𝘣𝘪𝘭𝘪𝘴𝘵𝘪𝘤 𝘖𝘶𝘵𝘱𝘶𝘵 — Outputs a distribution over bins per timestep. Sample → get prediction intervals with calibrated uncertainty. 🔹 𝘊𝘩𝘳𝘰𝘯𝘰𝘴-𝘉𝘰𝘭𝘵 — One-shot decoding (all future timesteps in one forward pass). ~250× speedup + ~5% accuracy gain via knowledge distillation. 𝗣𝗿𝗲𝘁𝗿𝗮𝗶𝗻𝗶𝗻𝗴: • Large corpus: energy, traffic, economics, weather, web traffic • Heavy augmentation: scaling, jittering, warping 𝗪𝗵𝘆 𝗶𝘁 𝗺𝗮𝘁𝘁𝗲𝗿𝘀: ✅ Bridges time series & NLP—use mature LLM infrastructure ✅ Native probabilistic forecasting ✅ Chronos 2.0: multivariate + cross-variable learning ✅ Multiple sizes (Mini → Large) ✅ Apache-2.0 license 𝗣𝗲𝗿𝘀𝗼𝗻𝗮𝗹 𝗻𝗼𝘁𝗲: Syama Sundar Rangapuram is one of the co-authors on this work. He taught me ML during my grad school days and helped me out more than I can say. Seeing his work shape the field like this — super proud. 🙌 The LLM playbook works for time series. Who knew? #TimeSeries #MachineLearning #Forecasting #AI #FoundationModels
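The mean-scale-then-quantize step can be sketched in plain numpy (a simplified illustration of the idea only; the bin count, value range, and uniform bin edges here are assumptions for the sketch, not the exact Chronos settings):

```python
import numpy as np

def tokenize(context, n_bins=4094, low=-15.0, high=15.0):
    """Mean-scale a series, then map each value to a discrete bin token,
    turning regression into next-token classification (sketch)."""
    scale = float(np.mean(np.abs(context))) or 1.0
    scaled = context / scale
    edges = np.linspace(low, high, n_bins - 1)   # uniform bin edges
    tokens = np.digitize(scaled, edges)          # ids in [0, n_bins - 1]
    return tokens, scale, edges

def detokenize(tokens, scale, edges):
    """Map tokens back to values via bin centers, undoing the scaling."""
    centers = np.concatenate(
        [[edges[0]], (edges[:-1] + edges[1:]) / 2, [edges[-1]]])
    return centers[tokens] * scale

series = np.array([10.0, 12.0, 11.5, 13.0, 12.5])
tokens, scale, edges = tokenize(series)
recovered = detokenize(tokens, scale, edges)
# Round-trip error is bounded by half a bin width times the scale.
```

Once values are tokens, a language model's softmax over the vocabulary becomes a distribution over value bins per timestep, which is where the calibrated prediction intervals come from.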
-
Here are 5 machine learning algorithms used for FP&A and #finance time series analysis:

✅ ARIMA/SARIMA: Forecast future revenues and expenses by identifying trends and seasonality.
✅ LSTM: Analyze complex patterns in cash flow or sales data to improve financial planning.
✅ Prophet: Handle unpredictable markets and still make reliable forecasts.
✅ GARCH: Assess and predict market volatility to make more informed investment or budgeting decisions.

More detail below ↓

1. ARIMA (Auto-Regressive Integrated Moving Average)
ARIMA helps predict future values by analyzing past data to identify patterns like trends or seasonality. For example, you can use ARIMA to forecast next year’s monthly revenue by recognizing historical trends and seasonal variations, such as higher sales during holiday seasons.

2. LSTM (Long Short-Term Memory) Networks
LSTM is an artificial intelligence technique that learns from past data and remembers long-term patterns. It can be used in FP&A to forecast cash flow by identifying recurring inflows and outflows over time, like specific project payments or seasonal cash patterns.

3. SARIMA (Seasonal ARIMA)
SARIMA extends ARIMA by incorporating seasonality, making it ideal for forecasting data with regular patterns. For example, you can predict quarterly expenses more accurately if certain quarters have consistently higher costs due to contracts or seasonal demand.

4. Prophet
Prophet, developed by Facebook, handles missing data and outliers well, making it useful for complex datasets. To get the code and an example of implementing it, go here: https://lnkd.in/eJKcHzqU
You could use Prophet to forecast annual sales even when your data is incomplete or affected by irregular events like economic shifts.

5. GARCH (Generalized Autoregressive Conditional Heteroskedasticity)
GARCH models volatility and is great for predicting how much financial data varies over time.
You can apply it in FP&A to assess and predict the volatility of stock prices in your investment portfolio, helping in risk management and budgeting.
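The autoregressive idea at the core of ARIMA/SARIMA can be illustrated with a minimal AR(1) fit in plain numpy (a sketch on simulated revenue data only; in practice you would use a library such as statsmodels or Prophet for full estimation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated monthly revenue: AR(1) dynamics around a mean level.
n, phi_true, mu = 120, 0.7, 100.0
y = np.empty(n)
y[0] = mu
for t in range(1, n):
    y[t] = mu + phi_true * (y[t - 1] - mu) + rng.normal(0, 2.0)

# Fit AR(1) by least squares: regress y_t on y_{t-1}.
X = np.column_stack([np.ones(n - 1), y[:-1]])
(c, phi), *_ = np.linalg.lstsq(X, y[1:], rcond=None)

# Iterate the fitted recursion to forecast 12 months ahead.
h, last = 12, y[-1]
forecast = []
for _ in range(h):
    last = c + phi * last
    forecast.append(last)
forecast = np.array(forecast)
# Forecasts revert toward the implied long-run mean c / (1 - phi).
```

ARIMA adds differencing (the "I") and moving-average error terms to this recursion, and SARIMA adds the same structure at the seasonal lag, but the mechanics of "regress on the past, iterate forward" are the same.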
-
Stock price forecasting is difficult because prices are driven by many external forces, like macroeconomics, policies, company fundamentals, and investor sentiment, which make the series noisy, unstable, and hard to model. Machine learning and deep learning (ML/DL) are widely used for stock price analysis due to their higher predictive accuracy compared to traditional statistical and econometric methods. Although DL can capture nonlinear, high-dimensional market patterns, its effectiveness depends on having large datasets. In practice, daily stock data are limited, especially for IPOs, and standard data-augmentation techniques used in computer vision cannot be applied because they break temporal order. This creates a data-scarcity problem that weakens model performance.

Beyond limited data, stock prices also contain intertwined components such as trends, cycles, and randomness. Single- or multi-scale decomposition methods break a signal (a time series, stock price data, a sound wave, etc.) into components, each representing the signal's behavior at a different scale or level of detail. In practice, however, these subseries remain correlated. Existing models treat them independently and ignore these cross-relationships, losing valuable predictive information.

To address both data limitations and structural complexity, the authors of [1] propose a combined TimeGAN + decomposition learning framework ('TimeGAN + SSA + LSTM'). Multi-view market data (open, high, low, close, volume) are first used to train a TimeGAN model, which generates realistic synthetic sequences to expand the dataset. The closing-price series is then decomposed using SSA (singular spectrum analysis) into smoother subseries, and an LSTM extracts temporal features from each. A self-attention mechanism captures the interactions among correlated subseries, and the fused representation is further enhanced by modelling its dependencies with other market-feature series.
A final LSTM produces the closing-price prediction. Experiments were conducted on nearly a decade of data from multiple international stock indices: the U.S. S&P 500 (SP500), China’s CSI 300 (CSI300), Japan’s Nikkei 225 (N225), and the U.K.'s FTSE 100 (FTSE100). The results demonstrate that the proposed 'TimeGAN + SSA + LSTM' integrated approach, combining data augmentation, decomposition, and inter-series attention, achieves better prediction accuracy (lower RMSE) and a higher Sharpe ratio (SR) than other advanced baselines (BP, LSTM, VMD-LSTM, N-Beats, SCINet, DLinear, MLSF and MASTER). #QuantFinance The link to the paper is available in the comments.
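The SSA decomposition step described above can be sketched as a Hankel (trajectory-matrix) embedding followed by an SVD and diagonal averaging (a minimal numpy illustration on synthetic data, not the authors' implementation; the window length and component count are arbitrary choices for the sketch):

```python
import numpy as np

def ssa_decompose(y, window, n_components):
    """Singular spectrum analysis sketch: embed the series in a
    trajectory matrix, take the SVD, and reconstruct one smooth
    subseries per leading component via anti-diagonal averaging."""
    n = len(y)
    k = n - window + 1
    # Trajectory matrix: column j is the length-`window` slice at j.
    traj = np.column_stack([y[j:j + window] for j in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    components = []
    for i in range(n_components):
        elem = s[i] * np.outer(u[:, i], vt[i])  # rank-1 elementary matrix
        # Diagonal averaging (Hankelization) back to a length-n series.
        comp = np.array([np.mean(elem[::-1].diagonal(off))
                         for off in range(-(window - 1), k)])
        components.append(comp)
    return np.array(components)

t = np.linspace(0, 4 * np.pi, 200)
price = np.sin(t) + 0.05 * t + np.random.default_rng(2).normal(0, 0.1, 200)
subseries = ssa_decompose(price, window=40, n_components=3)
# Summing the leading subseries approximates the denoised price path;
# the paper's framework then feeds each subseries to its own LSTM.
```

Each row of `subseries` is one smoothed component (here roughly the trend and the oscillation), which is exactly the kind of correlated subseries the paper's self-attention stage is designed to relate.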
-
What if your 12-month forecast is 80% wrong? That’s not bad forecasting. It’s bad strategy. Too many forecasts are treated like crystal balls: 🔮 Rigid. 🔮 Unrealistic. 🔮 Quickly irrelevant. The truth? Forecasts don’t need to be perfect. They need to be adaptable. Here’s how I build forecasts that empower instead of paralyze: 📊 Rolling forecasts that shift with reality—not once a year, but monthly or quarterly. 🧭 Scenario planning that prepares leaders for the unexpected—not just the most likely case. 📉 Sensitivity analysis that highlights which assumptions actually move the needle. 🛠️ Driver-based models so every number ties to real-world levers. 🧠 And above all: decision-useful insight > spreadsheet perfection. In today’s volatile environment, static forecasts break. Agile finance wins. 💬 How are you building flexibility into your forecasts this year? #CFOInsights #Forecasting #AgileStrategy #ScenarioPlanning #OperationalFinance #FinancialLeadership #StrategyExecution
-
Everyone’s finally saying the quiet part out loud: ✨ We’re at the beginning of a copper supercycle. Now the part that isn’t being said: This *is not* just about AI and data centers. Copper is a socioeconomic material. Nearly 70% of global copper demand already goes to electrical applications. Not GPUs. Not hype. Wires, motors, transformers, transmission lines. And demand is accelerating fast. Most credible forecasts show 40–50% growth in copper demand by 2040, driven by the backbone of what we use every day: grid expansion and replacement, electrification of transport and industry, EVs that use 3–4x more copper than combustion vehicles, and transmission buildout to connect renewables. AI didn’t create the copper problem. Our friendly LLMs just made it impossible to ignore. Here’s the scientific reality: Copper has the highest electrical conductivity of any non-precious metal. It’s durable, corrosion-resistant, thermally efficient, and infinitely recyclable. There is no substitute that works at grid scale without major tradeoffs. Physics wins, as it’s been known to do again and again. The uncomfortable part? We’re not *actually* running out of copper. We’re running out of economically recoverable copper. Ore grades have fallen roughly 40% since the 1990s. What’s left is lower grade, more complex, more impure, and often written off as waste under traditional processing models. And yes, life finds a way, but this has real consequences: higher electricity costs, delayed electrification in rural and underserved communities, more geopolitical concentration of supply, and much more energy, water, and emissions per ton produced. This is why the next copper cycle looks different. It has to. The winners won’t just be the companies that find new copper. They’ll be the ones that expand supply from what already exists. Because when you unlock stranded copper, you don’t just change unit economics. 
You change the outcomes that matter to everyone: faster grid buildout, more stable power prices, new high-skill jobs in mining regions, and lower environmental and social tradeoffs. Copper isn’t just an input to AI. It’s an input to equity, resilience, and economic mobility. This cycle isn’t a moment; I believe it’s the beginning of a structural reset, and we’re still early. #ChaoticGood #MicrobesAreHow #RealTalkTimebyLiz #WhatKeepsMeUpAtNight #Copper #Supplychain #Supercycle
-
Copper isn’t just the metal to watch. It IS the metal of 2025. Gold’s up. Lithium’s still hot. But if you’re paying attention to what’s actually powering the future, copper is at the center of it all: • Electric vehicles use 3x more copper than gas cars. EV copper demand is projected to jump 555% by 2035. • Clean energy infrastructure is copper-intensive. Solar, wind, grid storage—all of it depends on copper. Battery storage alone? 557% demand growth by 2035. • Digital infrastructure (think AI, data centers, 5G) is the next unexpected driver. By 2050, tech will make up 6% of copper demand—up from 1% today. Meanwhile, supply can’t keep up: • New discoveries are deeper, rarer, and more expensive to develop. • Project approvals now stretch 8–10 years. • Tariff threats drove a premium for copper in the U.S. versus London—an unprecedented shift redirecting global supply flows. To meet net-zero goals, the world needs 61 new copper mines—and we’re nowhere close. That requires $2.1 trillion in capex BEFORE 2050. No wonder copper recently surged ~15% to $10,000/ton (as of April 2025, pre-tariffs). Bank of America sees $10,750 in 2025. And they’re not alone... At Power Metallic, we’re already seeing the shift. The Lion Zone discovery itself came from a spare 200 meters of drilling budget—what started as a “last hole” gamble turned into a major breakthrough once we confirmed the results eight months later. The Lion Zone isn’t just another discovery—it’s part of the solution. If you want the real upside, follow the fundamentals, not the hype. Copper’s the signal. Not the noise.
-
⚡ Is copper the new oil? Global copper demand will surge 66% by 2040 from 23.5M tonnes in 2020 to 39.1M tonnes. If the world sticks to net-zero emissions targets, copper won’t just be a commodity — it’ll be critical infrastructure. At the same time, oil demand could fall 28%, dropping from 91.2M barrels/day in 2020 to 66M b/d by 2040, under the IEA’s net-zero scenario. 🔧 Why copper? EVs, wind turbines, solar grids, transmission lines — all copper-intensive. Energy transition tech is wiring the world, and copper is the backbone. Supply constraints, permitting delays, and underinvestment pose real risks. Meanwhile, the oil world isn’t disappearing — but it is changing: Peak demand may hit sooner than many producers planned for. Hydrocarbon producers are hedging bets with investments in metals and mining (think: Saudi Arabia’s mining pivot or BHP’s green metals focus). 🌍 Strategic implication: We’re entering a commodities supercycle — but this time, the volume leader may be different from the value leader. Oil drove the 20th century. Copper may drive the energy map of the 21st. Are markets ready for that pivot? 📩 DM me if you want to unlock your competitive edge with our tailored commodity insights, personalized reports, and on-demand strategic consulting #Energy #Copper #Oil #Commodities #EVs #CriticalMinerals #Mining