On Complexity

One of the principles of connected architecture is to strive for ‘continuous reduction of complexity’.

I have received many remarks along the lines of “analytics and data warehouse projects are complex, so how can you claim to reduce complexity?”

Reducing complexity is a way of thinking. Simplification is the result of this thinking, not a goal. It applies to all three processes. The processes interact, so reducing complexity in one process has an influence on the other ones.


A bit on my background

I’m an economist, specialised in operations research. If I were to graduate today, I’d probably call myself a data scientist.

I’ve been drilled in methodology, of which sensitivity analysis is a part. Since modelling means abstracting from the real world, you have to know when the assumptions you make while abstracting might break the outcome of your model. The flip side of the coin is that there is opportunity to abstract even further, as long as it doesn’t break your model. Incorporating unnecessary complexity in your model hampers its transparency and the verification of your results. Reducing the complexity of the real world without omitting causality is what you strive for when modelling.
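As an illustration, the core idea of sensitivity analysis can be sketched in a few lines: perturb each assumption and observe how much the outcome moves. The toy cost model, its parameter names and the 10% perturbation below are illustrative assumptions, not taken from any real project.

```python
def cost_model(units, unit_cost, overhead):
    """Toy model: total cost as a function of volume and two assumed parameters."""
    return units * unit_cost + overhead

def sensitivity(model, base_args, delta=0.10):
    """Perturb each assumption by +delta and report the relative change in outcome."""
    base = model(**base_args)
    report = {}
    for name, value in base_args.items():
        perturbed = dict(base_args, **{name: value * (1 + delta)})
        report[name] = (model(**perturbed) - base) / base
    return report

report = sensitivity(cost_model, {"units": 1000, "unit_cost": 2.5, "overhead": 500})
# Assumptions whose perturbation barely moves the outcome are candidates for
# further abstraction; those that move it strongly must stay in the model.
```

In this sketch, a 10% change in `overhead` moves the outcome far less than a 10% change in `units` or `unit_cost`, which signals where abstraction is safe and where it would omit causality.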

This way of thinking applies to analytics and designing data warehouses. It is what has become my second nature and this is what drives my thinking in reducing complexity. But I had to take it a step further.


What do you mean when you say complexity?

To explain what I mean, I make a distinction between two kinds of complexity. This distinction is arbitrary, not scientific; it only serves the purpose of telling this story:

  • Inherent complexity - the complexity of the problem at hand, including organizational complexity, data (definition) complexity and technological complexity.
  • Augmented complexity - complexity attributed to the problem that designers feel must be addressed in the solution.

When confronted with inherently complex problems, people have a tendency to create complex solutions. I don’t know exactly what the psychology behind this is, but my guess, based on observation, is:

  1. To be able to create simpler solutions, you must be able to strip the complexity down to its core, understand its roots and then know how to address them. You cannot do this on your own. It is a difficult group process that requires a lot of time and (mental) resources, and most projects do not have this luxury under deadlines and budget restrictions.
  2. Creating something complex shows your intellectual prowess.

On the data modelling and data engineering side, designers tend to design for every possible deviation data could throw at them or for every possible future use case. It is like projecting a virtual layer of complexities that the data warehouse might need to address in the future. That’s why I call it ‘augmented complexity’.

There are different reasons why augmented complexity happens:

  • The requirements are not clear and the data warehouse designers have to fill in the blanks, or feel they should
  • The expectations of what functionality a data warehouse should deliver are inconsistent across the organisation. Most of the time, this is due to user groups’ lack of understanding of what to expect from a data warehouse. Designers try to accommodate conflicting expectations, which evolve over time.
  • Experienced data warehouse designers have been confronted with unexpected conditions that delay delivery, causing frustration and blame from the managers representing the user groups. They try to avoid this happening again.


Those without sin may throw the first stone

I’m not the one to downplay the real-life problems you have to deal with as a data solutions designer. I’m experienced enough to know that the one thing you thought unlikely and left unaddressed in your design will be the first thing to occur after go-live. Murphy likes to play practical jokes.

I must confess, I was a master of augmented complexity. It gave me professional and personal pride. I proved I could outthink the best and the brightest in my profession.

Over time, I observed that those solutions did not resonate with the intended audience. I learned an important lesson: people were overwhelmed by the complexity and were desperately seeking simple directions on how to deal with it. I experienced that augmented complexity, without respect for organisational context, is harmful.


Reducing complexity to fit the limits of human comprehension

The value of information lies in applying the insights derived from it. If you accept this to be true, the consequence is that you need to design your solutions to remove barriers to applying insights.

When you look closer at how insights arise, you see it is a collaborative process. Through communication and consensus, insights lead to decisions. What makes BI challenging is that those insights should be incorporated into the information products. This way, insights become redistributable.

This is a process requiring human interaction once again, but this time between developers or analysts and users. Solutions should be made as simple as the capacity for shared comprehension dictates.

How do you discover where the limits of shared comprehension lie? By experiencing them in an incremental collaboration process. Developers or analysts challenge user assumptions and requirements and validate them against data in prototypes or versions of an information product. Users work with these and deliver feedback on the usability and applicability of the products, measured against the goal of being able to apply insights.

By making the build process transparent, you gain willingness to accept the (designed) limitations of solutions and support for the trade-offs in functionality being made. It creates buy-in.


You can master more inherent complexity once you simplify

There is an even greater benefit. Because of real collaboration and the willingness to accept the consequences of limitations designed into solutions, the maturity of an organisation in working with data grows. In other words: the horizon of the capacity for shared comprehension expands.

Over time, organisations become capable of dealing with more inherent complexity as their maturity grows.

And that’s where the real value of ‘continuous reduction of complexity’ is to be found: you master the inherent complexity instead of being thrown around by it.

-------------

More publications can be found on preachwhatyoupractice.nl

