Simulation is better than the real thing

Annually I visit Chalmers University of Technology to give a guest lecture on Automotive Modelling and Simulation. Ever since my studies in vehicle dynamics in the very early nineties, I have fostered a passion for simulation. There are several reasons for conducting systems simulation: marginal cost compared to physical prototypes, repeatability of tests, the ability to measure the immeasurable, and safety through avoiding potentially dangerous maneuvers.

Looking back, it strikes me that simulation methods were often deployed only once quality issues arose. Mathematical modelling then created an improved understanding of the issue at hand and became a great contributor to the solution.

This year, the students were asked to fill out a questionnaire prior to the lecture. One open-question answer hurt: “Prototype vehicles produce more accurate results than simulation.”

Please read that again. All of us who have experience with prototype vehicles know how many unknowns there are to deal with. Knowing the status of all the constituent parts and sub-systems is a huge administrative task. On top of that, measurement equipment configuration, tolerances, signal noise levels, filter characteristics and sample frequencies must be managed. Indeed, you have a (single) sample, but you do not know where in the spectrum it lies.

In contrast, simulation models can be brought to exactly the fidelity level required by the questions to investigate and the requirements to verify. Simulation engineers are in full control of the actual test cases, the systems' configuration and the results. They do know which sample in the spectrum is used.

Fidelity levels can be distributed: a single sub-system can be brought to a high fidelity level while the surrounding systems remain at a lower but adequate fidelity level. In short: use purpose-driven fidelity of models and purpose-driven integration platforms. The latter is deliberately plural; one single integration platform across the board will eventually become an extinct dinosaur. Supporting these ecosystems of models requires the use of standardized APIs, like FMI.
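The distributed-fidelity idea maps naturally onto co-simulation. As a minimal sketch in plain Python (all model names and parameters below are illustrative assumptions, not FMI code or real vehicle data), two models of different fidelity can exchange signals at fixed macro steps, much as an FMI co-simulation master orchestrates its FMUs:

```python
from dataclasses import dataclass

@dataclass
class PointMassVehicle:
    """Low-fidelity surrounding model: the vehicle as a point mass.
    Parameters are illustrative placeholders."""
    mass: float = 1500.0   # kg
    v: float = 0.0         # m/s

    def step(self, drive_force: float, dt: float) -> float:
        # Simple aerodynamic drag: 0.5 * rho * cd * A * v^2 (assumed values)
        drag = 0.5 * 1.2 * 0.3 * 2.2 * self.v ** 2
        self.v += (drive_force - drag) / self.mass * dt
        return self.v  # output exchanged at the macro step

@dataclass
class LaggedDriveline:
    """Higher-fidelity sub-system model: driveline with actuation lag.
    In a real setup this would be the one sub-system modelled in detail."""
    tau: float = 0.2       # s, first-order time constant (assumed)
    force: float = 0.0     # N

    def step(self, force_request: float, dt: float) -> float:
        self.force += (force_request - self.force) / self.tau * dt
        return self.force  # output exchanged at the macro step

def co_simulate(t_end: float = 10.0, macro_dt: float = 0.01,
                force_request: float = 3000.0) -> float:
    """Fixed-step, non-iterative co-simulation loop: each model solves
    its own step, then outputs are exchanged, as a master algorithm would."""
    vehicle, driveline = PointMassVehicle(), LaggedDriveline()
    t = 0.0
    v = 0.0
    while t < t_end:
        f = driveline.step(force_request, macro_dt)
        v = vehicle.step(f, macro_dt)
        t += macro_dt
    return v
```

In an FMI-based workflow, each of these models would instead be exported from its native tool as an FMU and coupled through the standardized interface, so every sub-system owner can choose the fidelity that serves the question at hand.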

Of course, prototype sub-systems and vehicles are still part of the development cycle. The prototypes are used to verify and validate the simulation models and to validate the requirements. The models become excellent knowledge carriers if the laws of physics are obeyed and magic tuning parameters are omitted. Any discrepancies between simulation and measurements must therefore be physically and mathematically well understood.

If we aim to predict ‘exact’ results matching the measurements obtained, we are doomed to fail in leveraging simulation. Models are an abstraction of the actual systems and can only predict what is modeled. The models will, however, increase understanding and will predict trends as design changes are made.

Years ago, we developed a totally new system, just as fuel consumption became a big theme. Hence, our prospect made a large effort to measure the efficiency of our device. Their results, presented at a large meeting with high-level management attending, were not flattering. I had performed some simulations and looked at their data: that could not be right! The meeting took the measurements as reality and truth, but three weeks later I received an apology letter from a senior manager. They had found an unconnected wire in the measurement equipment …

I've got a better one than that: a regulator in a certain country required rollover calculations to be performed and provided the formula, for the same reason as in the story above, repeatability across the industry. In the equation the regulator forgot to include the roll centre, and the industry practice was to just make a guess for each different vehicle.

Well said. The funniest thing I have seen was a model predicting fuel consumption that was built from other simulations.

Lucas Valente, the article I told you about earlier!

Spot on, Edo! I share your concern. I've seen some scary issues related to poor measurement data over the last couple of years. None of them would have occurred if simulation had been used in the first place.
