Development Practices I've Changed My Mind About
Over our careers, and as we develop, our views on technology and tools tend to change. This is entirely normal and to be encouraged. I thought I'd throw together a series of software development topics I've changed my mind about over the years to illustrate this process of change.
Some of my views on development come from the security consultant half of my career, where I've had the opportunity to review a lot of different people's code, learn from them, and get a glimpse of different developer communities I would never have interacted much with as a developer.
None of this should be taken as authoritative or an attempt to tell you your own views are wrong, and if you take away anything from today, please let it be to be open to new ideas. We tend to spend a lot of time in software development working with those who hold similar views, where others outside our bubble actually have pretty good patterns, ideas, and behaviours.
Also, please remember that my own development is occasional, often small scale, and many of my shifts of view have in fact occurred from reviewing large quantities of other people's code, rather than writing it myself. This may give me a bias towards reading code, as opposed to writing it (though I suspect the world could use more of that bias!).
I will definitely exhibit preferences towards safe and secure development, rather than whatever gets the job done fastest.
The right tool for the job, vs. the right tool for the company
One of the common phrases you hear in development is that we should aim to use the "right tool for the job". While it's certainly true that a particular language or framework can make certain jobs easier or harder, there are considerations beyond the technical.
When we select a language, or framework, for an application, we create a need to maintain experience in that language or framework so we can make necessary changes. If you don't plan to be in the same job forever, this means your employer or customer needs to hire someone else with that knowledge, replace the application, or accept they can never change it.
Before you select that fancy new framework which looks like the right tool for the job, consider whether you already have a good enough tool, one that draws on the same experience as everything else you maintain.
The importance of languages or frameworks, vs. the importance of communities
If you listen in on any gathering of developers, you'll hear opinions like "PHP is insecure" or "JavaScript is only good for X" or "Java applications are excessively large". Well, guess what? You can write anything with any of these. Great code, fragile code, massive codebases, tiny scripts.
When you start to look at communities, what you discover is sub-cultures. These cultures are not necessarily one-to-one with a language or even a framework. There are enterprise developer cultures writing PHP, and minimalist developer cultures using Java. They may have some quite different ideas from others using the language, but tend to have found a consistent balance.
Ideas have a tendency to circulate and survive within these communities, and they have relationships to each other. I can see the influences of Rails on other Model-View-Controller pattern frameworks. Once you know what to look for, the fingerprints of .Net and possibly Java on the Silverstripe (PHP) framework are obvious.
Some sub-cultures will consist primarily of learners, or people who are not full-time professional programmers, and guess what? They haven't learned everything yet. We can help them.
Even the most professional communities can also repeat and replay bugs. I see the same silly cryptographic bug being repeated again and again in the C# community, because an MSDN article spread it (encryption using AES-CBC with a static IV, if anyone is wondering). This article had a massive effect, because it came from a trusted authority. Other communities have similar patterns, that we can only hope to improve over time.
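To make that pitfall concrete, here's a deliberately toy sketch in Python. It uses an invented XOR "cipher" rather than real AES-CBC (the names and construction are mine, purely for illustration), but it shows exactly what a static IV gives away: identical messages encrypt to identical ciphertexts, so an observer learns when two messages are the same.

```python
import hashlib
import os

def toy_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    # Toy stand-in for a cipher: keystream = SHA-256(key || iv), XORed
    # with the plaintext. NOT real cryptography, and it only handles
    # messages up to 32 bytes -- it exists only to show the IV's role.
    keystream = hashlib.sha256(key + iv).digest()
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

key = os.urandom(16)
static_iv = b"\x00" * 16

# With a static IV, equal plaintexts produce equal ciphertexts:
a = toy_encrypt(key, static_iv, b"SECRET")
b = toy_encrypt(key, static_iv, b"SECRET")
assert a == b  # an observer can tell the two messages are identical

# With a fresh random IV per message, the ciphertexts differ:
c = toy_encrypt(key, os.urandom(16), b"SECRET")
d = toy_encrypt(key, os.urandom(16), b"SECRET")
assert c != d
```

The same equality leak applies to the first block of real AES-CBC with a static IV, which is why the fix is always a fresh random IV per message.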
Of the two, I will now always be more concerned about finding a community that is reasonably consistent with my ideas, than I will be dead set on a particular language or framework.
It's worth compromising for the sake of convention
Another aspect of communities is that they tend to form conventions on ways of doing things. Why? To make code less surprising and more readable to each other.
It turns out conventions are critically important to communication. Software is as much communication with the people who will read our code (sometimes including our future selves!) as it is communication with a machine. Even before we add comments, every line of code you write contains things that a computer doesn't care about.
When to use classes versus methods, naming conventions, formatting (within reason), the patterns you do and don't follow, the tools and frameworks everyone uses, and the general process of how we build software all have norms in a given community. We should aim to keep these reasonably consistent, both within our own teams and with the broader groups of people who could become our colleagues down the line.
An example would be that, despite the views on databases I'll discuss further down, I've worked for years in communities that held different views. Despite my own conviction that there are better ways, writing systems that are understandable to my peers was more important.
"You need to use tool X to scale"
We are frequently told that particular tools don't scale, or that a particular practice is required to future-proof our applications for "the enterprise". What I've come to realise over time is that these statements are more noise than substance.
Whichever language, framework, or database you have chosen, chances are someone else has had to scale it much further than the business application you're writing ever will, and has documented that process.
Clouds give us instances on demand. Half the intensive stuff is probably done by some SaaS service anyway. Long-running processes can be queued and scaled. Customers can be given their own instances. Databases can be sharded.
You probably don't know which of these techniques will be required until you're at that scale anyway.
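As a rough sketch of the queuing idea above (the job names and the "work" are invented for illustration), long-running tasks can be pushed onto a queue and consumed by however many workers you need; scaling is then a matter of adding workers, not rewriting the application.

```python
import queue
import threading

jobs = queue.Queue()
results = []

def worker() -> None:
    # Each worker pulls jobs off the shared queue. Scaling up means
    # starting more workers (or more machines reading the same queue).
    while True:
        job = jobs.get()
        if job is None:  # sentinel value: shut this worker down
            break
        results.append(job * job)  # stand-in for the long-running work
        jobs.task_done()

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

for n in range(5):
    jobs.put(n)
jobs.join()  # wait until every queued job has been processed

for _ in threads:
    jobs.put(None)  # one sentinel per worker
for t in threads:
    t.join()

print(sorted(results))  # [0, 1, 4, 9, 16]
```

In production the in-process queue would typically be replaced by a hosted queue service, but the shape of the code stays the same.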
Particularly in a small country like New Zealand, we very rarely require a scale of operations that a small development shop cannot technically achieve. We will almost always run out of every other limited resource we possess - capacity to on-board and train people, capacity to support our users, capacity to make the changes our customers need - well before we hit the limits of our stack.
If you are writing standard business applications, you can use well-tested technologies you're familiar with and any major language that you can continue to support, and you'll probably be fine.
Static vs. Dynamic typing
My first exposure to developing web applications from start to end was in Ruby on Rails (version 0.9). While I had some prior experience with CGI, PHP, etc, this was where a lot of my foundational ideas came from.
One of the core things I believed at the time was that static typing was a pain, and a bunch of extra syntax that accomplished very little. I "knew" what types would be passed to my methods, and how they would behave.
Having spent more of my career now focussed on reliability and security, I've come to understand that static typing is actually quite useful when used well. The major thing it does is to move errors to compile or deployment time, rather than seeing issues for the first time in running code, and that's absolute gold for reliability.
Static typing does not necessarily need to be painful. Some languages (e.g. Python, TypeScript) allow statically and dynamically typed code to be mixed. Types can be inferred in most cases without having to be explicitly declared (e.g. in Haskell), and type systems can do a bunch of useful things to ensure consistency (e.g. non-nullable types in some languages).
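For instance, in Python's gradual typing (enforced by an external checker such as mypy rather than at runtime), an Optional return type makes the "missing" case explicit. This is a small invented example; the function and lookup table are mine:

```python
from typing import Optional

def find_user(user_id: int) -> Optional[str]:
    # The Optional[str] return type tells a checker like mypy that
    # callers must handle None before treating the result as a string.
    users = {1: "alice"}  # hypothetical lookup table
    return users.get(user_id)

name = find_user(1)
# A type checker flags name.upper() here unless the None case is handled:
greeting = name.upper() if name is not None else "UNKNOWN"

missing = find_user(2)  # returns None; the type signature warned us
```

The "can this be null?" question being answered in the signature, rather than discovered in production, is exactly the reliability win described above.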
These days, I advocate for all input from untrusted sources to be validated strictly against a schema. Static types with a few annotations make a pretty good structure for a schema. Creating a data structure representing every request type is a norm in some communities (e.g. ASP.Net) and very effective at reducing unexpected input.
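Here is a minimal hand-rolled sketch of that idea in Python. The request shape and field names are invented, and in practice a schema-validation library would do this work, but the principle is the same: untrusted input is checked field by field before it becomes a typed record.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CreateUserRequest:
    # One typed, immutable record per request type.
    username: str
    age: int

def parse_request(raw: dict) -> CreateUserRequest:
    # Validate every field's type before constructing the typed record,
    # so nothing downstream ever sees unexpected input.
    if not isinstance(raw.get("username"), str):
        raise ValueError("username must be a string")
    # bool is a subclass of int in Python, so exclude it explicitly.
    if not isinstance(raw.get("age"), int) or isinstance(raw.get("age"), bool):
        raise ValueError("age must be an integer")
    return CreateUserRequest(username=raw["username"], age=raw["age"])

req = parse_request({"username": "alice", "age": 30})

try:
    parse_request({"username": "bob", "age": "thirty"})
except ValueError as e:
    print("rejected:", e)
```

Everything after the parse step can then trust the types it receives, which is the ASP.Net-style benefit described above.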
Databases
Another side-effect of growing up in the Rails community was that I initially absorbed the community's views on databases: That the database is just a place to put data, we should abstract away which database engine we're using, integrity should be managed in the application, and every record should be looked up as a combination of a type (class) and numerical ID (maybe a UUID, if you're feeling brave).
A university course and its very memorable professor challenged this for me - I went in to a course with one set of ideas, encountered a different set of principles, and about half-way through the course pretty suddenly changed my mind. What had seemed like a dry, academic approach to data models suddenly demonstrated a series of practical benefits.
The database engine can do a lot to protect the integrity of your data, if you let it. It can check that, when you refer to another record, it actually exists. It can ensure that, at any point in time, and for any application viewing the database, this data is consistent.
I love consistency checks in the database, where we can account for race conditions, concurrent updates, etc. I love compound keys and want every record to propagate its identifiers to dependent rows. I don't care whether entities at the root have IDs, names, codes, etc, but for example I want every record that belongs to an organisation to contain its identifier.
In my experience, this is very far from a waste of resources. These identifiers are present wherever I need to check them, the database enforces their integrity, and even complicated data schemas involving diverging and converging relationships can be validated.
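A small SQLite sketch of the compound-key idea (the table and column names are invented for illustration): because the organisation's identifier travels with every dependent row, the database itself rejects a task that points at a project in a different organisation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in
conn.executescript("""
CREATE TABLE organisation (
    org_id INTEGER PRIMARY KEY,
    name   TEXT NOT NULL
);
CREATE TABLE project (
    org_id     INTEGER NOT NULL REFERENCES organisation (org_id),
    project_id INTEGER NOT NULL,
    -- compound key: the organisation's identifier travels with the row
    PRIMARY KEY (org_id, project_id)
);
CREATE TABLE task (
    org_id     INTEGER NOT NULL,
    project_id INTEGER NOT NULL,
    task_id    INTEGER NOT NULL,
    PRIMARY KEY (org_id, project_id, task_id),
    -- the compound foreign key means a task can never point at a
    -- project belonging to a different organisation
    FOREIGN KEY (org_id, project_id) REFERENCES project (org_id, project_id)
);
""")
conn.execute("INSERT INTO organisation VALUES (1, 'Acme')")
conn.execute("INSERT INTO project VALUES (1, 10)")
conn.execute("INSERT INTO task VALUES (1, 10, 100)")  # consistent: accepted

try:
    conn.execute("INSERT INTO task VALUES (2, 10, 101)")  # wrong org
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

No application code had to check anything: the inconsistent row simply cannot exist.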
There's a bunch of myths around databases and performance. It's no longer the case that adding a column to a database necessarily locks everything in many engines. Updates to identifiers can be propagated automatically, while preserving integrity. Remember that what you hear may be outdated.
Object-Oriented Programming
If my teachers were to be believed, there's only one sane and responsible way to structure your programs, and that's Object-Oriented Programming (OOP). OOP creates little bundles of data and behaviours (code) which are associated with a class (or kind) of object.
OOP tries to provide two major guarantees - modularisation and encapsulation. Modularisation is the separation of code into modules, such that changes to one rarely break things in another. Encapsulation is the separation of an abstract interface from the implementation - it turns each piece of code into a black box, where you're not supposed to care about how it works on the inside.
We see tutorials talking about OOP structures with familiar things like cars, cats and dogs, showing how we can abstract each into a vehicle or animal, with methods like "startEngine" and "eat". These talk about how useful it is that we can deal with these things without needing to know the details of how something is done for a specific type.
In reality, the mapping from OOP classes to real-world types of things is tenuous at best. What would be the real-world equivalent of a TaskInstanceFilterFactory? Is there really much value in saying it inherits from FilterFactory? The OOP features in these cases rapidly get reduced to tools, rather than representing a real and intrinsic relationship.
It turns out that the important tools are available in almost every language, without necessarily the use of OOP structures. Every language has some sort of modularisation. Every language has some ability to abstract away implementation. Every language can represent a data record containing other data records, and somehow control access to them, if only by convention.
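As one sketch of encapsulation without classes, a Python closure can hide state behind a pair of functions. This is a deliberately tiny, invented example, not a pattern prescription:

```python
from typing import Callable, Tuple

def make_counter(start: int = 0) -> Tuple[Callable[[], int], Callable[[], int]]:
    # The count lives in a closure: callers can only reach it through
    # the two functions returned, so the "implementation" is hidden
    # without any class being involved.
    count = start

    def increment() -> int:
        nonlocal count
        count += 1
        return count

    def current() -> int:
        return count

    return increment, current

increment, current = make_counter()
increment()
increment()
print(current())  # 2
```

The interface (two functions) is separated from the implementation (a hidden variable), which is the encapsulation guarantee, delivered without objects.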
OOP does not have a monopoly on good design practices or good patterns, and there are communities out there that don't practice it, perfectly responsibly. I could argue that Haskell does modularisation and encapsulation better than any OOP language I know, albeit in a completely different way.
Summary
In summary, there's no one right way of building software. If you're even slightly receptive to change, your views will and should shift over time.
Look to your peers for norms and conventions that improve communication between you.
Look to other developer communities, and shamelessly pilfer their best ideas.
I'm not trying to start a flame war, so please don't come at me!