Dev’s Perspective: American Software Engineer Compensation.
This is the second article in my Dev's Perspective series, exploring an IT career through my personal employment experience. The first article established the scope and bearing. The discussion is limited to enterprise software - to answer the age-old question: what does it take to achieve financial independence by doing what you love (programming)? The last article in the series explains your 2017 options to fight the slashed wages.
First off, the numbers. Corporate IT (IT departments of non-tech corporations, Initech-size software sweatshops, and the members of the Great IT Consulting Food Chain: from Oracle and Deloitte down to boutique "architecture" firms) currently pays senior programmers $140-180K - essentially the same number it paid in the 1990s ($90/hr == $180K at 2,080 work hours a year).
Inflation-adjusted (see a very conservative calculator here: http://www.calculator.net/inflation-calculator.html?cstartingamount1=60000&cinyear1=1989&coutyear1=2017&calctype=1&x=102&y=25), it should have been double: $300-400K, which, not surprisingly, is Google's Senior Staff compensation in 2017. Yes, you can go to Google, for all I care. I'll cover it in another article. Read the first one I mentioned above for my reasons to limit the discussion to mission-critical enterprise software, which Google doesn't want to touch with a 10ft pole.
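If you want to check my arithmetic, here is a minimal sketch of it. The 2,080 hours/year figure is from the paragraph above; the ~2x multiplier from 1989 to 2017 is roughly what the linked inflation calculator reports - both are assumptions you can swap out, not official data.

```python
# Back-of-the-envelope sketch of the salary math above.
# Assumptions: 2,080 work hours/year and a ~2.0x CPI multiplier
# from 1989 to 2017 (roughly what the linked calculator reports).

HOURS_PER_YEAR = 2080
CPI_1989_TO_2017 = 2.0  # approximate; plug in the exact CPI ratio if you like

def annual_from_hourly(rate_per_hour: float) -> float:
    """Convert an hourly contract rate to a gross annual figure."""
    return rate_per_hour * HOURS_PER_YEAR

def adjust_for_inflation(amount_1989: float) -> float:
    """Express a 1989 dollar amount in 2017 dollars."""
    return amount_1989 * CPI_1989_TO_2017

print(annual_from_hourly(90))         # 187200 -> the "~$180K" above
print(adjust_for_inflation(180_000))  # 360000 -> inside the "$300-400K" range
```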
I'd also like to clarify that I am only talking about accomplished aka "senior" and "team lead" level engineers. Junior and mythical "mid-level" compensation is not worth discussing. If you love programming, you grow to the "senior" level (i.e. the ability to work w/o supervision, among other things) in a matter of months. If you don't, choose another profession. Yes, this is Sparta. The industry should employ significantly fewer people and pay them significantly more. Engineers used to make more than doctors. Sounds like your kind of world?
So, why and how did IT slice the wages in half (it's a bit more complex than just "outsourcing") while Google increased them to maintain the 1990s level? And how can one arrive at that upper-middle-class income - w/o working at Google?
I got my first paid programming job in 1989, during my second year in college. I remained "upwardly mobile" during the post-Soviet 100% collapse of 1991-96, when the normal economy ceased to exist. In 1992 I invited myself in and got a job at one of the three small software companies in a city of two million - w/o any connections whatsoever. And then I job-hopped my way to a $3K/mo salary by the end of 1996, when I left for the US.
I was young and naive. I hated the old Soviet system and the corrupt mafia-ruled world that replaced it with every fiber of my soul. But boy, was I wrong to idealize another system: the Western world from upbeat 90s yuppie movies, where everyone lived in a 4K sq.ft. cul-de-sac McMansion and drove a Bimmer. It wasn't bad in 1996, you know. I lived in my own house and drove an NSX in 1998. Fundamentally though, replacing the evil System with a "good" one, and hoping to achieve success by being a good (corporate) boy… that was so embarrassingly wrong. Hey, job-hopping habits (and hopes of raises) die hard.
Everyone knows the story. The "outsourcing" slashed IT wages in half in 2002. They have stayed there ever since - adjusted for inflation. Thankfully a new generation of employers emerged around the mid-2000s: Google, Facebook, and a dozen others, who offer fair inflation-adjusted wages to senior developers.
Here’s another link for you: http://www.msn.com/en-in/news/newsindia/h-1b-visa-bill-introduced-in-us-minimum-pay-more-than-doubled/ar-AAmrdMz
It wouldn't be a stretch to associate H1B, L1, and similar work visas with IT, considering the overall statistics of who's brought into the country. I don't want to talk about the new legislation. Even if it passes, H1B regulations have become a formality, circumvented by limitless L1 and other visas. Besides, the goal is to physically move the engineering to the crowded third world anyway, so no one wants programmers to stay in the US for more than six months - allegedly sufficient to learn the domain and go back to the low-wage region.
It's pointless to wonder how the minimum H1B wages stayed unchanged for almost 40 years while the alleged "people's party" did nothing. What's more important than politics is the spot-on 1989 $60-80K salaries I witnessed even in the late 90s (roughly translated to their 2017 equivalent of $120-160K). That's how CFOs and other MBA bosses view software engineers: 1.25x more valuable than Accounts Receivable clerks. It's always been like that, unfortunately.
Whether those greedy CFOs could actually realize their wet dream of reducing software engineers to accounting clerks is a different story. 20 years ago, before the mass "outsourcing", supply and demand still worked - through hourly-paid contracting: $60-80/hr. It had nothing to do with "job security": everyone can be outsourced now, but no one could back then due to the physical lack of people. The overpopulated countries full of "bodies" hadn't yet been discovered by said CFOs.
It certainly wasn't because of the pathetic benefits, worth roughly 10% in our pay range. The 2x wages ($60-80/hr x 2080) were the straightforward job market response to rigid corporate salary grids. I am leaving the 10% benefits and the 7.5% social security out of the equation for the sake of simplicity.
For some superficial HR reason, it was significantly harder to land a crappy $80K "permanent" (that is, until the next recession) job than an "18-month plus" (meaning ongoing - until the same recession) $80/hr contract ($150K adjusted for benefits). Only Milton-looking losers (the Office Space character) sought "permanent" jobs in the 90s anyway.
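For the curious, here is a rough sketch of that contract-vs-salary comparison, using the ~10% benefits figure from above (the 7.5% social security is left out, as in the text). The numbers are illustrative, not payroll advice.

```python
# Rough contract-vs-salary comparison using the figures from the text:
# ~10% benefits load on a salaried job; employer-side social security omitted.

HOURS_PER_YEAR = 2080
BENEFITS_LOAD = 0.10

def contract_gross(rate_per_hour: float) -> float:
    """Gross annual income from an hourly contract."""
    return rate_per_hour * HOURS_PER_YEAR

def salary_equivalent(rate_per_hour: float) -> float:
    """Salary that matches a contract rate once ~10% benefits are added back."""
    return contract_gross(rate_per_hour) / (1 + BENEFITS_LOAD)

print(contract_gross(80))     # 166400
print(salary_equivalent(80))  # ~151273 -> the "$150K adjusted for benefits"
```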
It's tempting to think up an adequate (low) "compensation history" and apply for a $110K job to test my theory: "hiring a genius unbelievably cheap". Though I am afraid they won't believe me, as I don't look like Milton.
The same goes for the "entrepreneurial mindset" emphasized by many JDs published on Dice and LinkedIn. If anyone found out I am a real founder, they'd run from me faster than a cute girl runs from a hooded serial killer in a cheap thriller movie. A typical employer's view of entrepreneurial drive, even and especially at startups, is limited to sacrifice and perseverance - certainly not the goal of financial independence.
One way or another, I've never looked exploitable. Once you've made it known, there is no way back to salaried "permanent" jobs, even if you start believing in phony "perks" in exchange for lower compensation. Add the HR staff's envy of a six-figure income. Add the ceiling imposed by your boss's salary: low, but fairly adequate for a mediocre, non-technical, dime-a-dozen facilitator and mitigator. Add the post-recession and post-inflation (e.g. after an oil "crisis") refusal to acknowledge recovering wages - while complaining how "hard it is to find good people". At the old recession rates, duh - after the recession has ended?
It creates a no-win environment for both sides. If anything's permanent within the current age-old IT employment arrangement, it is the exodus of talent. Whether one goes to Google or simply job-hops to a marginally better-paying project, they permanently close the door behind them. And if you stay and cope with low compensation, you'll surely find a way to do nothing by fooling your non-technical bosses with estimates. Yes, you are putting yourself at the top of the next recession layoff list. Just like you are after a life-or-death salary/rate negotiation over $5K.
Make no mistake, your MBA bosses are looking to get rid of you at the first opportunity and replace you with pliable, dirt-cheap, bodyshop-supplied code monkeys. You are going to be on that layoff list anyway - at the top like a rebel or at the bottom like a good boy, it doesn't matter. Even though you could have made or saved your employer millions in exchange for an extra $20-30K.
BTW, I guess I haven't experienced the joy of "performance bonuses" during my career. The highest I've ever received was $5K. The highest promised, contingent on the cyclical "economy" and jumping through countless HR hoops, would not have brought my annual compensation even to the $150K I could count on as an hourly-paid contractor.
Thankfully one company (Google) ended that "You don't pay, I don't work" standoff by simply acknowledging what adequate programmer compensation is: pre-outsourcing contractor rates converted into permanent salaries and adjusted for inflation. A handful of top tech employers followed its lead (to a certain degree). Wondering about the numbers? Adjusted for inflation, current expert developer wages should be $120-160/hr. Yes, per hour. With a 40+ hour work week.
They actually are - as recruiter billing rates, with the developer getting $90/hr at best. I am not going to speculate about IT recruiter markups back in the 80s and 90s. My cofounder is a recruiter, and I know that 30% is considered outstanding outside IT. I have a feeling it was significantly less than today's 50-100% in IT. I knew a girl: smart, but lacking Caltech or MIT credentials. She wasn't and still isn't well-connected. Yet she easily made $90/hr in 1991, amidst the alleged defense-cuts recession. That further hints at sane recruiter markups during that era.
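To make the markup math explicit, here is an illustrative sketch. Markup here means (bill rate - pay rate) / pay rate; the $90/hr pay rate and $120-160/hr bill rates are the figures quoted above, not survey data.

```python
# Illustrative recruiter-markup math for the rates quoted in the text.

def markup(bill_rate: float, pay_rate: float) -> float:
    """Recruiter markup as a fraction of the developer's pay rate."""
    return (bill_rate - pay_rate) / pay_rate

def developer_pay(bill_rate: float, markup_pct: float) -> float:
    """Developer rate implied by a bill rate at a given markup."""
    return bill_rate / (1 + markup_pct)

print(f"{markup(120, 90):.0%}")  # ~33% markup at the low end of the bill range
print(f"{markup(160, 90):.0%}")  # ~78% markup at the high end
print(developer_pay(160, 1.00))  # 80.0 -> what a 100% markup leaves the developer
```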
What would you expect? We are at the bottom of the corporate food chain, right below the "recruiting" middlemen (several layers of them). They squeezed us at the first chance. Outsourcing? Not really. You and I weren't needed in IT in the first place.
Look back at all your projects from the past 10 years. Were they really critical? Recall how you spend your day at the office. How many hours a day do you actually spend engineering something real, making a difference for your users? It's "TPS Reports" at best. Am I right?
Isn't it obvious? Your CFO found someone cheaper to do that meaningless crap. I remember my culture shock when I got a programmer job in my second week in the US. It was all about effort, not results: "looking busy" surrounded by hyperventilating Miltons, who had perfected the art of looking tired and concerned.
I should have left toxic IT when I first started to wonder why I was writing DAOs, DTOs, and other glue code for $150-180K. Middle-class prosperity was still going strong in 1996, so I attributed my employers' "generosity" to the country's overall wealth. I didn't have the slightest idea what difference my DAOs and DTOs made, let alone the purpose of the software we developed in relation to my employer's main business.
The banks and insurance companies I worked for sold the same overpriced thin air they sell today. I didn't care. They scammed somebody with their "product" (including myself as an insured driver and homeowner), and I scammed them with my pointless programming. The country was rich. It could have easily supported its citizens if everyone stopped working. But it was still customary to work for a living, so we pretended to, scamming each other.
I longed to do real programming. My employer wasn’t interested. I was hired to bang on the keyboard, so a couple of middlemen could charge their fees. No one wanted me to grow professionally, since I’d have asked for a raise.
Of course I never asked anyone's permission to grow professionally or waited to be sent to a paid official class to learn some technology. Your professional growth is your own business. Monetizing that growth is your business too. Unfortunately, at that time I only looked to employers to reward me for the new abbreviations in my resume.
I hate to sound like a Formula 1 mechanic who moved to a small town with a rusty gas station and a Jiffy Lube. There was no Google, Facebook, or Amazon before the mid-2000s. Not the way everyone knows them today - for their $400K+ compensation packages. And as far as the self-funded entrepreneur route goes, the late-90s technology wasn't adequate to develop robust enterprise systems on my own, w/o starting capital. I know, excuses, excuses…
I learned about the infamous 70% IT project failure rate in 2003. Of all sources, it was mentioned in an IBM WebSphere book. I googled it and found Michael Krigsman's "IT Failures" blog (now on ZDNet). Perhaps it wasn't Krigsman who came up with that infamous number, not to mention that Krigsman has since backed down and doesn't publish it anymore. One can't make a living as an industry observer while being enemies with Oracle, Deloitte, and the rest of the Great IT Consulting Food Chain. In any case, as much as I want to indulge in writing a script for Office Space 2 about those 70%, let's talk about the 30% of successful IT projects.
Capable programmers did work somewhere in the late 90s and early 2000s. Someone utilized their expert skills (to build the 30% of successful enterprise systems) in exchange for more or less adequate compensation.
I chose the contractor's path after moving to the US and changed my job (project) every other year on average. I've worked with many teams and heard even more stories about other employers, not to mention the hundreds of interviews. I consider myself a professional job seeker, so I know a thing or two about the job market, salaries, and work environments. I can attest to the 30% success rate in the late 90s, which has gradually slipped to 10% and lower over the 17 years since Y2K.
Things were alright in the late 90s. The system worked. The corporate money did trickle down to regular engineers w/o top education credentials or connections. One could make an upper-middle-class living just by using his brain… to do his job. That applied, of course, to the 30% of successful projects: always intellectually rewarding and compensated fairly enough to afford a McMansion and a Bimmer (an NSX in my case). It was easy to find those 30% of projects through standard job-seeking channels: resume-posting sites and recruiters. Even with a high recruiter markup. No mysterious "networking" was required.
I remember a state-of-the-art insurance system in 1998. Distributed N-tier architecture, C++, advanced JS with Ajax (before anyone knew that term), XML, HTTP services (years before SOAP and REST were invented). We invited a third-party "enterprise architect" from a reputable consulting firm to take a look. He was speechless. That was a regular internal IT department project - during the last years of the VB and Access era. And VB was considered the new tech, as COBOL was still going strong. Google didn't exist. Oracle kept selling its 1980s Forms - the great ancestor of MS Access.
Forget the money for a moment. Let's talk about technology. Ultimately it is your technical contribution that counts: inventing your own tech and perfecting/combining others' inventions. Technical inventions make things faster and reduce costs for your employer, allowing for greater market penetration and higher profits, which should translate into your wages.
It goes w/o saying: there is a limited number of inventors on Earth - the fundamental supply and demand rule should work in their favor, at least in theory. The skewed IT job market supply and demand deserves a separate article. In short, supply and demand stopped working in the IT job market. Like many, I used to think the problem was on the supply side. It's not.
Whether your boss's assurances that you are not "outsourceable" due to your skills turned out to be true or not, you and I are different from "discount resources". And if IT projects had remained the same, there would have been plenty of creative and well-paid jobs for us - jobs none of the rushed and frightened code monkeys could possibly learn and do. The problem is that the work itself disappeared.
The projects have been dumbed down to accommodate the cheapest, most junior workforce from you-know-where. It was only 30% of the job market to begin with. Now there is no place for us in IT anymore. Our services are no longer needed. The 70% of DAO and DTO coders became 100%, and it is perfectly adequate to pay them half of the 90s wages for writing boilerplate code.
Killing the demand has more profound consequences than inflating the supply. Forget wages. There is simply no place for you and me in IT. Curious why? Read my next article: Skills vs. Salaries: Does Supply and Demand Still Work? Or skip to the last article in the series explaining your 2017 options to make the money you deserve as an expert software engineer.