The myth of the full stack developer

Back in the late 90s, as mainframes gave way to what was called Client-Server architecture, developers were faced with too many choices. You could use CORBA, .NET, or just plain sockets. Your persistence layer could be based on files or databases, and once you decided to use a database you had a large number of options: Informix, Sybase, MySQL, MS SQL Server, Oracle. You could program in C, C++, or even Java if you were adventurous enough. You probably also needed a user interface, which could be a CLI, Java, X, Windows, or the Web.

Developers suddenly had to learn much more. New concepts such as Object Oriented Programming and the SQL language had to become part of every developer's repertoire. As the Web slowly became the user interface of choice, you also had to know HTML, CSS, JavaScript, CGI, Servlets, JSP, and similar technologies.

As time went on and people saw the power of elasticity, we moved into the era of the Cloud. Most people saw the cloud simply as compute infrastructure with unlimited capacity, and they took their existing applications and moved them there. A realization dawned that the cost of moving applications to the cloud was just too high, and people started looking for efficiencies in their cloud deployments. We slowly started seeing applications written specifically for the cloud. This trend grew stronger as more and more start-ups, with no baggage of existing compute and applications, built and deployed their applications straight to the cloud and were immensely successful.

Most cloud infrastructures are extremely complex systems: they involve virtualization, containerization, multiple RDBMSs, columnar databases, complex networking, and multiple languages. Suddenly developers were spoilt for choice, and they started using tools without understanding them completely. Many start-ups, while building the first version of their software, did not want to worry too much about getting the architecture right or about the performance of the product. The idea was to build something, go to market, and get the series-A funding; once you had funds, you could get everything re-architected and done right. Large companies, which were used to working with specialist staff, saw these start-ups succeed and even overtake them in valuations, and started to ape them. Thus, the myth of the full stack developer was born: an individual who knows all the pieces of the solution and could potentially, single-handedly, build and deploy it. This was indeed the case with most technical co-founders, who were doing it because they had no other option.

Fast-forward to 2014, and most of my colleagues had started writing "Full Stack Developer" on their resumes; they had to, to survive in the job market. Whenever I met such people, I asked them specific questions:

  • Do you know the difference between an RDBMS and a columnar database? When would you use a columnar database and when would you use an RDBMS? Most people had no rational answer; many even claimed that RDBMSs are obsolete and columnar databases are the new technology.
  • How would you optimize your persistence layer if it were a performance bottleneck? The only answer I got, again and again, was: create more indexes. Most people did not understand why indexes are created, or how indexes differ between an RDBMS and a columnar database.
  • What about data normalization? Most people did not understand the need to normalize data, or the cost of doing so.
  • When should one use messaging (like Kafka) vs a REST endpoint vs file uploads vs any other mechanism for client-server interaction? The answer was always "we are moving from REST to Kafka", as if Kafka were a new version of REST.
  • When would you use Linux containers (LXC, Docker), virtualization, or bare metal? What inputs go into that decision? Again, most of the time the answers were of the type "my organization has decided to move to Docker". These decisions were taken with no rhyme or reason.
  • What is the difference between White-Box testing and Black-Box testing?
  • How would you know that you have tested your system enough?
  • How do you choose your test cases? Have you heard of Equivalence Class Partitioning or Boundary Value Analysis?
  • What is the difference between a Design Language, User Experience, and a User Interface?
  • When should you use something like Cassandra vs Couchbase vs any other NoSQL database?
  • What is the difference between the STCS, LCS, and DTCS compaction strategies in Cassandra, and how does each impact your usage of it?
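For readers unfamiliar with the test-design terms above, here is a minimal sketch in Python of how Equivalence Class Partitioning and Boundary Value Analysis choose test cases. The `classify_age` function and its cut-off values are hypothetical, purely for illustration:

```python
# Hypothetical function under test: maps an age to a category.
def classify_age(age: int) -> str:
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 13:
        return "child"
    if age < 20:
        return "teen"
    return "adult"

# Equivalence Class Partitioning: pick one representative from each class
# of inputs the function should treat identically, instead of testing
# every possible value.
assert classify_age(7) == "child"    # class: 0 <= age < 13
assert classify_age(16) == "teen"    # class: 13 <= age < 20
assert classify_age(40) == "adult"   # class: age >= 20

# Boundary Value Analysis: test at and around each class boundary,
# where off-by-one mistakes typically hide.
assert classify_age(0) == "child"
assert classify_age(12) == "child"   # last value of the "child" class
assert classify_age(13) == "teen"    # first value of the "teen" class
assert classify_age(19) == "teen"
assert classify_age(20) == "adult"
```

Six well-chosen values from the partitions and boundaries give far more confidence per test than dozens of arbitrary ages from the middle of a class, which is exactly the kind of reasoning these questions were probing for.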

To most of these questions, the full stack developers had no convincing answers. Most of the time, the reason a piece of technology was being used came down to an organizational top-down directive.

A full stack developer is nothing but a jack of all trades with programming as his primary skill. Most of the staff on a cloud project may have to be full stack developers, but in my view one cannot build a world-class product without specialists in specific technologies.

For any company, the architecture of the product, the design of the schema, a data model that anticipates future usage of the data, optimization of network traffic, and so on are extremely important concerns. If a typical full stack developer cannot provide answers on these, I would bring in experts; if you cannot afford full-time specialists, consultants may be the answer.

