We're Witnessing the Evolution of Software Engineering

Change is inevitable. Technology advances, new tools emerge, productivity increases. The wheel keeps turning. The emergence of LLMs has had a Big Bang level impact on many industries, suddenly and almost violently changing what we know and how we think about everyday aspects of our lives. Love it or hate it, one thing's for certain. We're witnessing the evolution of software engineering in real time. And it's time to stop pretending like we aren't.

A Gem at an Estate Sale

A new hobby I've picked up: on weekends, my husband and I like to frequent estate sales. For some reason I feel I don't have enough shit in my house, so I've taken the liberty of filling it with other people's shit. This leads to finding all sorts of wild gems. Do I now have a painting of a hallucinogenic view of Austin, TX from the vantage point looking across Lady Bird Lake? You bet your ass. But every now and then you come across some fun stuff. One of my favorite things is to look at antique books, especially school books. Recently I came across a book of logarithms. Not a book teaching logarithms, a book with a table of logarithms where you could look up the answer to a specific logarithm given certain parameters.

A logarithm table from an antique logarithm book

This fascinated me.

I'd never seen anything like this. It never occurred to me that someone would calculate the logarithms of an absurdly large set of numbers and then publish them as reference material. In hindsight it makes sense, but digital calculators were commonplace by the time I was born.
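To give you an idea of how it worked: since log(a·b) = log(a) + log(b), an engineer could turn a tedious multiplication into two lookups and an addition. Here's a rough sketch of that workflow in Python (the table size and four-digit precision are my illustration, not a transcription of the actual book):

```python
import math

# A miniature stand-in for the book: base-10 logarithms precomputed to
# 4 decimal places, roughly the precision of a printed table.
log_table = {n: round(math.log10(n), 4) for n in range(1, 10_000)}

def multiply_via_table(a, b):
    # Look up both logs, add them, then take the antilog. This is how
    # the book turned multiplication into simple addition.
    return 10 ** (log_table[a] + log_table[b])

print(multiply_via_table(347, 892))  # ~309,526; exact answer is 309,524
```

Four digits of precision gets you within a rounding error, which was plenty for most slide-rule-era engineering.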

Regardless of how I felt flipping through the pages of this curious antique on a random Saturday afternoon, it got me thinking. At some point, this was useful. There were people whose professions depended on this book. There are likely countless structures, roads, bridges, dams, whose designs were worked out with a book like this open on the desk. People in this profession may have even memorized the common logarithms they worked with on a regular basis. Or they learned how to calculate them in their heads quickly. Maybe they even prided themselves on this ability. They were faster than others because they didn't have to flip through pages in a book every time they needed a calculation. And then, perhaps seemingly overnight, a device became available that could fit in your pocket and do all of this for you.

Image of the HP-35 Pocket Calculator ad from HP History website

The HP-35, named for its 35 keys, made this book obsolete. While the price tag may have dissuaded people initially, calculators eventually became cheap enough that anyone could afford one. And then we put a calculator in the pocket of every person on the planet via their phone. Being able to calculate logarithms in your head became a parlor trick, and eventually it wasn't even taught in school. With this step forward in technology, a skill set that some held, and maybe even prided themselves on, became obsolete. It was no longer needed. Sure, they could still use it, but the advantage it brought evaporated. Because now everyone was just as fast as they were. And this probably scared the shit out of people. They had been doing something their entire life, they were valued for the speed and precision they brought, and then suddenly the thing that gave them a sense of pride was vanquished by a machine.

And this isn't the only time this happened. Let's look at a story directly from the software engineering field.

Dorothy Vaughan and the IBM

If you haven't seen the movie or read the book Hidden Figures yet, I highly recommend it. It tells the story of the African American women "Computers" at NASA during the Space Race whose job it was to perform the calculations by hand.

But as the space race heated up, NASA looked for a faster way to perform these computations. This manifested in the form of the IBM 7090. The IBM could perform over 200,000 calculations per second, far more than a room full of human computers could ever hope to achieve. But Dorothy didn't balk at this. She didn't stick up her nose and say "That's slop," or point out every time it made a mistake. She recognized the potential and the reality of the situation. So she made a choice. She chose to learn how to program in FORTRAN, and then taught those skills to the other Computers she supervised. And those who were most successful at picking up this new skill weren't the ones who were quickest at arithmetic, or could use a slide rule the fastest, or had the logarithms memorized. It was the person who understood what needed to be computed and why. Skills that were paramount one day weren't so much the next.

It wasn't the implementation, the raw arithmetic, that was actually the difficult part. It was the design. The logic. The ability to see the completed picture and architect a solution. Because if that was simple, the machine would have done that too.

The Crucial Element

In both scenarios above, a crucial theme emerges: expertise and domain knowledge are what bring value, not raw implementation. The mathematician who memorized various logarithms didn't just sit around all day calculating them. They were typically doing so as part of other engineering work. The knowledge alone was not valuable; it was the engineer's wisdom in how to apply it that gave their role value. Same with the Computers who were programming the IBM. It would not have been possible for them to write the programs if they did not first understand the mathematics behind what they were doing. These folks adapted to the new tools that were introduced, and became more productive. Yes, there was some overhead in learning the new tooling, but that's life. This won't be the first new tool you'll have to learn in your career, and it sure as shit won't be the last. And let's be real, the alternative isn't great. Because the alternative is to be left behind. To become obsolete.

The Broken Abstraction Cycle

The two examples above aren't isolated incidents. They are instances of a phenomenon that has been repeating throughout human history: the abstraction of a process, leading to the simplification of work and the next step in the evolution of an industry. Let's bring this back to our field, software engineering.

The software engineering field is littered with abstractions. Programming a computer started out as physically setting the bits of the machine. Then assembly language was created to make that simpler. Next we created languages like FORTRAN and COBOL to abstract away from the assembler. Object-oriented programming, virtualization, the list goes on and on. We continually abstract away complexity to make life simpler.
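You can still peel back these layers today. As a quick sketch (my example, using Python's built-in dis module), here's the lower-level bytecode hiding underneath a one-line function:

```python
import dis

def add(a, b):
    return a + b

# Print the stack-machine bytecode this one-liner compiles to. That
# bytecode runs on a virtual machine written in C, which a compiler
# turns into machine code. Abstractions all the way down.
dis.dis(add)
```

Nobody writing a + b thinks about LOAD_FAST instructions, and that's the whole point of an abstraction.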

But then, if you look at programming languages at least, we kind of lost our way. Sure, we made more programming languages. And we have made things slightly simpler. But the major leaps became smaller. And when we do make massive leaps, such as WYSIWYG editors, no-code or low-code solutions, or anything that makes writing code more palatable to the masses, these tools are generally mocked. You'll hear things like "That's not real programming," or "You're not a real programmer if you use X language." Once we figured out that the modern form of memorizing logarithm tables, i.e., knowing how to code, was insanely profitable, we started doing it. And then we took things in the other direction! We started making our abstraction layers more complex! Don't believe me? How many load-bearing YAML files do you have in your infrastructure right now? Did replacing that monolith with 47 microservices really simplify things? If I ask you to describe your JavaScript build process, can you do so without sobbing? Instead of focusing on making things more human readable, more accessible, we optimized for scale at the cost of simplicity. We architected for cool instead of practical. Don't get me wrong, these new tools and processes did help large enterprises like Google, Amazon, and Netflix manage the massive scale they were handling. But then we pushed them on every developer who didn't need them. Small startups with three engineers and an angel investor were more worried about multi-region replication than getting a product their users could actually use. Complexity became cool.

And why did we do this?

Mr. Krabs gif saying "Hello, I like money"

Because it was lucrative to do so! Software engineering salaries are some of the highest paying you can get with an undergraduate degree. Being a programmer was marketed as an instant ticket to the middle class. We saw enrollment in Computer Science programs at universities across the world skyrocket. And when universities couldn't produce programmers at the speed the industry required, we created alternative certification programs in the form of programming bootcamps. Give me six weeks of your time and I'll get you a six-figure job! And everyone jumped at this opportunity! Who the hell wouldn't? In this economy?!? But somewhere along the way, we started justifying our paychecks with complexity. "My job should be hard! Look how much I get paid!" We had to protect ourselves. And then we started gatekeeping. We made tech interviews this unholy gauntlet of whiteboard coding and obscure syntax trivia that had absolutely nothing to do with the actual job. We created a hazing ritual and called it hiring. We got so tied up in our own hubris, we stopped our abstraction journey. We never made the next major step. Until recently.

Nature Abhors a Vacuum

So we had a simplicity vacuum. We kept abstracting things into more convoluted things. New text formats, new languages, new programming models, new paradigms. And the evidence that we were going in circles was SO OBVIOUS!

What was this evidence? The fact that everything new and shiny was just a slight rehash of something old. Serverless functions? Do you mean Perl and cgi-bin? Service discovery? You mean dynamic DNS with some extra bits? Oh look, Laravel is popular again. MVC is coming back! The list goes on and on and on, and we kept re-inventing the wheel, telling ourselves it was new.

"Well Mason if it's so simple why don't you tell us what we were supposed to do." - probably you, as you're reading this blog right now.

Isn't it obvious? Why are we still writing for computers? Why aren't we just speaking to them the same way I'm speaking to you now? Why is human language not the primary programming language? And before you take a deep breath and come at me with your big voice, stop. There's no need. Because this vacuum has recently been filled. It's over. The next major abstraction has been implemented. The industry has proceeded to its next evolution. And what was the solution? AI.

So How Is This Different from INSERT_OTHER_TOOL?

I know what you're thinking. We've seen tools that were supposed to "be the end of programming" before, right? We tolerated the CASE tools of the 1980s, we learned worse programming languages to use 4GLs, and we drag-and-dropped our share of boxes, tiles, widgets, buttons, and forms in the visual programming era. We've been sold snake oil on this time and time again, so if you're wary, I don't blame you. But trust me when I say, this time it's different.

The fundamental flaw with all of these previous attempts was that they were essentially deterministic template systems. Their functionality was bound to what they were explicitly programmed to do. There was no reasoning, no adaptation, no extension. They were static, finite, and limited.

The sheer scale of AI is mind-boggling. The amount of training data is vast. If previous tools were created by dozens of engineers, AI has "learned" from millions of developers. Users don't have to learn a new language to use it, they can just speak to it in natural language.

And best of all, it outputs the languages that developers already know and love. It doesn't try to change developers' tooling, it augments it. It can explain why it made a choice to use a function, or how to compile the code. Want it to explain the code to you like you're five, or like you're a senior engineer? It adapts to you and your needs.

Now, could this all still go belly up tomorrow? Sure. Anything is possible. But the trajectory on this feels different. How do I know? When was the last time you opened up your VSCode UML->Code generator? Never? Yeah. Me neither. But how many people are using AI to write code every day now and talking publicly about it? Now think about how many people aren't shouting about it from the rooftops. It's not vaporware. You have to look through the bullshit online from both sides. Some people are claiming that AI is the second coming of Christ and others are saying it produces more slop than a pig farm. Neither of these types of folks can be taken seriously. But look around. Ask your colleagues, your trusted friends what they think. Look for small blogs from people without a lot of followers. These folks have nothing to gain, and are just sharing their experiences. I'm not going to tell you AI is perfect. It sure as shit isn't. But this is the most exciting thing to happen in our industry since the internet. And it would be an absolute shame if you went full ostrich and buried your head in the sand during this absolutely exhilarating time.

Evolving Into a Real Engineering Discipline

For years, many have argued about whether software engineering is a real engineering vocation. There are many facets to this debate, but one of the primary ones is that software engineering focuses too much on implementation details and resembles something more like a craft. The rationale for the craft comparison is that success depends heavily on individual skill, intuition, and accumulated experience rather than on applying established formulas. A mechanical engineer, by contrast, doesn't machine every part that goes into their design for an engine.

However, with the advent of AI and the implementation of software becoming automated away, I believe software engineering is starting to look more like a traditional engineering discipline.

The "craft" layer is actively being abstracted away as we speak. Just as CAD and CNC abstracted away manual drafting and machining for mechanical engineers, AI is doing the same for code. Software engineers won't manually be writing the code for much longer.

When code generation becomes abundant, constraints become the focus. So often we put things off, make compromises on designs, and accumulate technical debt for the sake of speed. Speed is no longer an issue. This may very well be the end of technical debt. Or at least, human-created technical debt. Will there be AI technical debt? Absolutely. And we'll solve for that as well.

Verification now becomes paramount. We can't blindly trust the AI's code. We must define and understand correctness, and how to ensure it. That is very much how a traditional engineering discipline operates.
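What does that look like in practice? Here's a minimal sketch, in Python and entirely my own illustration: instead of eyeballing whatever the model produced, you write down the properties that define "correct" and then hammer the code with randomized inputs until you trust it.

```python
import random
from collections import Counter

def ai_generated_sort(xs):
    # Stand-in for whatever the model produced. Treat it as untrusted.
    return sorted(xs)

def verify_sort(sort_fn, trials=1000):
    # Correctness, defined explicitly: the output is ordered, and it is
    # a permutation of the input (nothing added, nothing dropped).
    for _ in range(trials):
        xs = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        out = sort_fn(list(xs))
        assert all(a <= b for a, b in zip(out, out[1:])), "output not ordered"
        assert Counter(out) == Counter(xs), "elements added or dropped"
    print(f"passed {trials} randomized trials")

verify_sort(ai_generated_sort)
```

The AI writes the implementation; you own the definition of correct. That division of labor is the job now.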

Lastly, the science, the theory, is no longer optional. It's essential. Algorithms matter. Designing a distributed system? You'd better understand CAP theorem, consensus protocols, various failure modes, and more. All of the theory that computer science undergrads learned and never used in their day jobs isn't just academic anymore. It's their fast pass to getting ahead with AI.

Proof It's Already Happening

And you may think I'm blowing smoke out of my ass, but I'm so happy to say that as I was writing this, I saw an example of this in the wild. I follow the development of Ghostty on Twitter because it's interesting, and its creator, Mitchell Hashimoto, made a post about someone who used AI tools to analyze Ghostty and find a bug. This user had no experience with the programming language the tool was written in, the operating system they were using it on, or how terminals are even built, BUT they knew how to drive AI and had an engineering mindset. They applied their skills and were able to use AI to understand the problem, write a script, and submit a patch that fixed 4 real crashing cases. All with zero prior knowledge of the code base.

Image of Mitchell Hashimoto's Tweet, link below

source: Mitchell Hashimoto's Twitter

This is the sign of a mature engineering discipline. Implementation was taken out of the equation. Critical thinking, an engineering mindset, and the right tool solved the problem.

Preparing for the Aftershocks

The Big Bang was arguably a good thing for humanity. Without it, we'd probably not be here. However, with any large-scale, violent, supernova-level explosion, there are going to be aftershocks. And those are coming for us all now.

AI is going to make people quicker and more efficient at their work. What took a team of 10 six months now takes a team of 4 with AI a few weeks. This is most definitely going to lead to job loss. Smart companies will just ship more, but most companies forget there's a quarter that exists after this one, and will sacrifice long-term strategy for short-term profits.

Do I think we're heading toward a post-work, all-AI task force? Nah, no one trusts it that much. And the companies that do will be out of business quick. But AI adoption will continue to increase. So be ready.

Because AI isn't going to take your job. But the person who knows how to use AI might, if you don't.


How does this affect software engineering right now? Check out Part 2 of this blog series, Code is Cheap, Don't Devalue Yourself.

Part 1 of 3 of my Brain Dump Ramblings on AI blog series

Check out the other parts of this series:

  • Part 2 - Code is Cheap, Don't Devalue Yourself discusses AI and its impact on the software engineering discipline.
  • Part 3 - Coming Soon - My journey with Claude Code and my tips and tricks for getting started.