I hope I don't come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didn't have a ton of competition. In that kind of environment, there is space for a focus on the actual fundamentals, the things-in-themselves, the theory behind the action. Most of this stuff is good and it was a beneficial situation to have that buffer space to allow it.
But ultimately business reality has changed, largely because achieving business goals is dramatically easier with AI tools. This undercuts a lot of the focus on building solid fundamentals, and in a lot of cases that'll come back to bite the business. But in many scenarios it won't, and the industry will rumble on.
Those of us working in marketing or journalism or education were forced to accept this new reality decades ago, largely because of inventions by software developers. Now devs are just late to their own party.
I've worked 10+ years as a developer in France, where salaries weren't too high to begin with, but I certainly noticed the added competition as it became harder to find a job. I stopped "fighting" for a high-paying role, so my experience didn't provide net gains, but it still protected me from inflation. The net "gains" came instead from spending less, by moving from rent to a mortgage and then making it smaller.
I'm OK with this now, it is what it is, but these years weren't smooth as there were ups and downs and a down after an up can be stressful if you're not ready for it.
In many European countries there aren't high salaries and above-average job security for developers; you are considered an office worker like everyone else. This isn't Silicon Valley over here, especially if you come from Southern European countries.
"The actual fundamentals, the things-in-themselves, the theory behind the action" don't go away, they change.
Programmers used to work with punch cards, then assembly, then low-level languages with odd quirks. Today few developers even think about first-party code size, micro-optimizations, register allocation, etc. LLMs are just another abstraction.
A developer with the ideal AI code writer (which we're not at yet) must still think about idea, design, scope, etc. like a product owner or manager. And these concepts have theory, sometimes even math (e.g. time complexity).
EDIT to comment on the article: all abstractions are leaky, but sometimes the leaks barely matter. Today we do still need to understand code quality and architecture when working with LLMs, or the software will get bad enough that it will affect the company. But maybe not next year. An analogy: stack vs heap, memory allocations, etc. still matter in high-performance software, which isn't uncommon, but programmers almost never think about register allocation.
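LLMs are not another abstraction. ALL OTHER LAYERS you named are fully deterministic, understood, debuggable, etc.
You cannot be serious.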
This isn't harsh at all. As I've commented before (but this time as well I do not have the receipts/links), it's been reported that highly paid programmers in the US also brought in a ton of profit; it was not at all the case that their employers had thin profit margins because the labor was expensive to them. We're talking one million USD of profit for a 100K USD salary.
They didn't even earn anything close to what they were worth. According to Marx's Labor Theory of Value, anyway.
However the dice fall now, one of the possible outcomes is that the tech billionaires take that 100K USD for themselves. The very deserving individuals whose job is to sit their arses on automation assets.
Meanwhile workers from other sectors can gloat about how they are now in the same boat as them. The boat of accepting your ever-meagre reality.
> that highly paid programmers in the US also brought in a ton of profit
In Germany for instance I've seen many a company that treated their programmers as a cost center and they actually were (probably a mutually reinforcing self-fulfilling prophecy).
Too many instances of programmers being deployed in such a way that I couldn't see how they would ever earn back even the meagre investment that was being made in them. Fully irrational dev teams doing useless busy work.
Most German "startups" used to be replaceable with Zapier and Pipedrive. That has probably only gotten worse with the advent of LLMs.
I'm in a similar position to the OP, unemployed for about 10 months, with tons and tons of applications sent both remote and local, and yeah not sure where this is gonna go or what I'm supposed to do. Also disabled, my eyes don't work so that automatically removes many, many non-software jobs I'd otherwise do from the equation.
Don't even really have anything else to say other than that, but maybe commenting it somewhere helps someone else realize they're not alone. I don't know how that helps you or me, but that's what I got. Maybe there's still something for us somewhere, but it is very difficult to stay motivated, and I don't have an answer.
I'm not in your situation, but I've hit the bottom of the despair and found the inner "fuck it we ball" within me. I don't know what's an option for you, but I'm learning bartending, stocking shelves, and having irresponsible sex with the young women I work with in retail.
I enjoy software development and hopefully one day I will return to it, but I am but one tiny kernel of corn in such a mighty ocean of shit, so I might as well ride the waves instead of fighting them. Maybe your calling is scamming Indians or scamming Americans or scamming Indian scammers. You aren't alone, but the attitude you have will never stop mattering. See if you want to go back to school, start a tutoring program for kids. Motivation is for morons, do something.
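It's only 11:37am where I am, but this is the sanest thing I've heard so far today.
I had to burn out to obtain the insight :-/
I agree with your message, but maybe don't have sex with young co-workers.
I'm guessing everybody in this interaction is an adult.
Why?
Only violate one proverb at a time. If you shit where you eat, don't rob the cradle. If you rob the cradle, don't shit where you eat.
...puritans will be puritans
From the well-written article: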
> I have spent months adjusting my resume, applying for all jobs where my skill set may be of use, building proof-of-concepts using Claude, and doing cold outreach to anyone who may be interested in my potential products or my services. The well has gone dry.
A major quandary companies are finding themselves in is "resume fraud", which can be defined here as being inundated with applicants only to find 99%+ have used GenAI to produce a bogus work history tuned to satisfy the job posting. To the point where many companies simply give up trying to identify "real" applicants via online submissions.
It is analogous to email spam in the 90's, before anti-spam technology was mature.
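Yeah, it's pretty bad. Can't even browse projects on Reddit because a lot of them are just slop.
Oddly enough, the solution lies in what was previously replaced: staffing firms.
Staffing companies have recruiters which vet candidates to varying degrees of success. At minimum, they establish the candidate:
- is a human
- lives where they claim to live
- has worked where they claim to have worked
- has eligibility to work for one or more of their clients
If nothing else, the above eliminates much of the "99% resume fraud" problem companies are dealing with now.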
I've been thinking this for a while now, but I feel like especially with the rise of crazy salaries in AI research, it's time for software development to have its agency moment. Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent, and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.
I've seen a setup like this in software for some specific high-demand SAP (ERP) consultancy roles. SAP migrations are per-project in nature (you wouldn't want to migrate your company's ERP all the time). The person had such a skillset that they had what is effectively an "agent" who would negotiate their next job assignment. The agent was even baked into the contract with the client as a party; I don't recall how much of the hourly rate this agent would get, but they were invoicing the company separately.
At least this is what I recall.
Meta: this is probably the first time this year where I use the word agent to refer to a human. Feels odd even.
> Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent ...
This is what recruiters in quality staffing firms do. Granted, there are many staffing firms which are worthless body-shops. But those are not reputable. :-)
> ... and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.
The equivalent has traditionally been hiring managers who work with approved staffing companies, both to ensure those companies provide value as well as to foster an understanding of the people/skills needed by the organizations.
Wise organizations use multiple staffing firms and perform internal audits in order to minimize complacency/corruption.
I feel the pain, and if I become unemployed now in my 50s, most likely I will do something else outside computing.
Everyone who praises how much more productive they have become always forgets that this means big corp now needs fewer of us.
I work in enterprise consulting, and have watched how the change to managed cloud infrastructure, followed by low-code/no-code tooling, has had an impact on team sizes, meaning fewer devs for the same outcome.
AI driven development is reducing those team sizes even further.
In many European countries, getting a job at a later age is really an almost impossible task; the easiest solutions end up being trying to get early retirement status, or going self-employed, which also isn't without its own set of complications.
More than ever, it is time to be stoic. Have things but live as having nothing. But as the author says, obvious as it was, it was predictable too.
By now... I see in my country high prices for laptops with only 4 GB of RAM and Celerons.
That hardware could do wonderful things if, in the 2000s, people hadn't bought the argument that hardware is so cheap, so let's write inefficient code. The same hardware that could play a YouTube video in the 2000s today cannot even open the website. Electron sends hugs...
Now people are mad about AI; until when? Until the oceans dry up like in the movie Oblivion?
And professionals? The generation of specialists will pass... and people will blindly depend on AI soon if the course of things isn't stopped or at least corrected.
I think the author could have brighter days in the future (and still can in the present, in some hidden niches), as knowledge will always be precious.
The main lesson I have is to buy less IT and fewer buzzy promises, and to find the place where knowledge and craft walk side by side.
But all that advice. Never worked out.
Be stoic. Be supportive. Be.. to Not to be.
Accept that being some eldritch god's lunch is your destiny.
Stoic is what we expect the cattle to be as it goes up the ramp.
Do not go quietly into the night, rage against the dying of the light.
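The great meat grinder doesn't care either way. Stoic or screaming and kicking, you're going in.
So sayeth a Head of Engineering.
Correct. Stoicism was for two audiences: those doing the killing, to be indifferent toward it, and those being killed, to be indifferent about it.
Marcus Aurelius, the historical figure, was a monster who killed a measurable portion of the humans alive at the time.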
I mean... so did a lot of other rulers. As far as emperors go, Aurelius wasn't that bad. You have to judge historical people by their peers, not by your own modern standards.
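Stoicism is for the lamb and the wolf alike; mindfulness is for the monk and the samurai killing on behalf of their lord alike.
We are not so one-dimensional that good mind habits are the one and only thing we do and act on.
But that was a stoic comment.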
I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.
More and more places just want Jira tickets done fast instead of someone who's going to push back or question whether this is the best way to build something. They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.
We've been moving to React, replacing an internal framework that's worked wonders for us and that we've been using for over a decade. The biggest part of the move is "hiring".
My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".
Everything is giant, overbuilt, and terrible because most people never bothered to learn even a single level up from where they do most of their work. The people that do become unhirable. Everything takes hundreds or thousands of times more cycles and electricity than it should because people can't be bothered to understand what they're doing.
> They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.
That's because they don't know what to build that will be a successful product, so they essentially try to brute-force this question of "what to build" by trying different ideas quickly and seeing which one will stick. And in this quick iteration loop people just throw a bunch of stuff together to make something, and once that something gains traction they will keep piling on top of that shaky foundation.
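Well, if more React devs knew how it worked under the hood, they might choose something else[1] :-)
Jokes aside, if you don't need two-way data binding, using React frameworks pulls in a lot of crap that you never need.
The majority of web apps have no need for React.
--------
[1] I always joke that the reason I am an atheist is not because I don't know much about your religion, it's because I know too much about your religion.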
Hardware is cheap; human labor is not. Companies have figured out that the best way to extract money from customers is to give them something that barely works now, rather than something that works great later.
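That's not true if you are on cloud. Clumsily written software becomes really expensive to run.
> Hardware is cheap; human labor is not.
Especially true when you're paying for neither hardware nor labor.
Writing inefficient client-side software, whether it's desktop or webshit, makes the customers / users pay for the hardware, and pay with their time.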
Is it, though? Can we really keep saying that "hardware will always be cheaper than human labour" when RAM prices are soaring, GPUs are becoming prohibitively expensive, and we're looking at a probable chip shortage?
I think the era of "poor software for fantastic hardware" is coming to an end.
RAM + GPU are getting more expensive but mostly for applications that require a lot of it like AI.
The hardware cost for regular applications has not vastly increased (especially when factoring in inflation).
Spending 2x development time on a problem often is not worth it (or only with large deployments).
UI development is an even more special case here.
The customer buys the machine which runs the code, not the company.
So sadly "good enough" is the standard.
One example for me here is the "switch product option" button on Amazon listings (e.g. switch green to blue color, smaller to larger model).
On my phone this sometimes takes >5 seconds to properly load.
Horribly optimised.
> I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.
This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.
I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass as I couldn't name anybody except the most popular artists. You see peoples' reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.
What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.
> My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".
The problem in software, it seems, is that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.
There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.
> Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn.
Design time, code time, compile time, run time. Why all that potentially wasteful upfront work?
The next step is shipped applications whose help menu is a chat interface that responds to all user questions of the form "How do I ...", with a short pause to add a new hack to the existing pile, and then some upbeat instructions.
In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.
> In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.
If I had to sum up the zeitgeist of '90s techno-optimism, it would be this persistent, confident prediction that once people just learned _how_ to use computers, and everyone was a power user, everything would be fine! Despite the mounting evidence that actually, no, like everything else in reality, the distribution of skill is a bell curve, with the median sitting uncomfortably low for those who, to quote OP, "lived on IRC or in the bash terminal".
Free universal education didn't fix this problem, LLMs won't fix this problem. Man's natural paucity is no longer in the availability or accessibility of knowledge. The liberal ideal that all we must do is empower the individual turns out to not have been the solution to everything forever.
But hey, being self-aware enough to make productive use of this new technology is probably _some_ kind of edge.
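May as many as possible survive.
I can confidently say that I know little to no people truly interested in understanding technology, except for strangers online.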
Your words resonate with me. Even before LLMs, I've been disappointed with the general direction the software industry took in the 2010s. Today's software industry is not the industry of Licklider, Engelbart, Bob Taylor, Alan Kay, Woz, Stallman, Ritchie, Thompson, Pike, Joy, and many others whom I admire, who helped establish an ethos of computing that fostered a sense of freedom, creativity, and wonder.
Instead, what we have today is a computing ecosystem dominated by powerful players who care about money and control. Speaking from the standpoint of a Bay Area resident, since roughly 2012, the field has been increasingly taken over by people who are in it for the money. Combine that with Alan Kay's observation that computer science is a "pop culture" that often lives in the moment and has little regard for the past, and also combine that with the "move fast and break things" attitude that permeates modern software development, and this has created an environment that seems hostile to the types of nerdy pursuits that the industry once encouraged. The working environments of many major software companies and the products they release are a reflection of the values of the companies' executives, managers, and shareholders.
While I'm not anti-AI, I see agentic coding as another step in the direction that the software industry was already heading towards, where it can move even faster and break even more things.
There is still wonder, joy, and freedom in computing, but I feel this is increasingly confined to the hobbyist world and certain niches in research environments.
sounds like youre working at the wrong place. detailed computing knowledge and maths is essential in some industries and like you said, scorned in others. i couldnt think of anything worse to do with my time than spend all day with mba's or webdevs (lol im sorry thats unfair, web development is complex with all the callbacks and sync issues).
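Thank you, I was starting to wonder.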
I guess because I'm in game dev maybe, but in all my jobs knowing about the underlying stack has either been necessary knowledge or highly regarded.
I can't think of any time in my career where knowing about the internals of the stack was ever frowned upon or where it's been anything other than an advantage (especially when hunting bugs). I must have been lucky.
people will accuse you of "gatekeeping" because you shouldn't need to have any knowledge or skill to do stuff. those things are unimportant, even bad, because anything requiring those is inherently exclusionary. lmao.
This has been obvious to me since I graduated with a BIT majoring in 'Software design.' I literally went to university with software design and software architecture being my core interests.
When I graduated, I was shocked to learn that no company cared about any of the architectural concepts that I had learned. UML class diagrams, sequence diagrams, ER diagrams, etc... had been on the way out. At one point, as internet companies were scaling up, there was a brief resurgence of interest in sequence diagrams... Especially as a communication method when explaining complex bugs or complex message-passing scenarios. But it didn't really last. Nowadays most software is riddled with race conditions and deep exploitable architectural flaws. Cryptocurrencies have been victims of many such attacks. Billions of dollars have been lost to race conditions... And that's just the ones which were discovered. They are notoriously difficult to find post-implementation.
The programming primitives that we're using today aren't optimized to avoid race conditions or even try to encourage good concurrency patterns; quite the opposite; they encourage convenient but disorganized parallelization and they're optimized to put the focus on type safety which is a far less concerning issue. A lot of people who were rightly alarmed by gaps in schema validation (which is critical at API boundaries) became overly obsessed with type safety (which is a broader concern). I have built some async primitives for Node.js, nobody cared! NOBODY! Other developers have had the same experience with most other languages. I think only a few niche languages like Elixir actually treated it as important. But nobody even acknowledged that the problem could be remedied in existing languages. It's so bad that it seems as though some people wanted it to be that way.
The term 'concurrency safety' doesn't even exist! Some have a vague idea about thread safety; OK, that's very specific to one particular concurrency primitive... but what about the concurrency of asynchronous logic (much more common nowadays)? I have felt thoroughly suppressed in that regard in my career.
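To make the "concurrency of asynchronous logic" point concrete, here is a minimal, hypothetical Node.js/TypeScript sketch (all names invented for illustration): two interleaved async calls both read a balance before either writes it back, and an update is silently lost without a single extra thread being involved.

```typescript
// Hypothetical sketch of a race in purely asynchronous logic: no threads,
// just two interleaved awaits. Names (readBalance, creditAccount) are invented.
let balance = 100;

// Stand-ins for a database or network round trip.
const readBalance = async (): Promise<number> => balance;
const writeBalance = async (value: number): Promise<void> => {
  balance = value;
};

async function creditAccount(amount: number): Promise<void> {
  const current = await readBalance(); // both concurrent calls read 100 here
  await writeBalance(current + amount); // the later write clobbers the earlier one
}

async function main(): Promise<void> {
  await Promise.all([creditAccount(10), creditAccount(20)]);
  console.log(balance); // prints 120, not 130: the +10 credit was silently lost
}

main();
```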
The only voice on the subject of architecture that got through to the 'mainstream' was Martin Fowler (one of the authors of the Agile Manifesto). After that, there was Dan Abramov of Redux fame. Some notable opinionated architecture books were published, but none really identified the underlying essential philosophy of good architecture.
The best, most succinct quote I ever read on the subject was from Alan Kay (inventor of OOP) who said "I'm sorry that I long ago coined the term 'objects' for this topic because it gets many people to focus on the lesser idea. The big idea is messaging."
I like that quote for many reasons; firstly because it shows wisdom, secondly, it tells you what the issue is, very simply and, thirdly, it hints at the importance of 'focus' in this discipline where we are saturated with thousands of complex overlapping and partially conflicting ideas.
I think the FP trend was somewhat of a red herring. Same with Type Safety. Yes, they were useful to some extent, there are some really good ideas in there, but people got so caught up in them that the most fundamental area of improvement was ignored entirely. To me, the core value proposition of FP can be reduced down to "pass by value is safer than pass by reference." Consider that in the context of Alan Kay's "The big idea is messaging." - Is an object reference a message? NO! A live instance is not a message! Precisely! His point supports pass-by-value, furthermore, it encourages succinct/minimal parameters.
Good architecture is rooted in two core concepts: 1. loose coupling, 2. high cohesion, and you achieve those by separating logic + structure from messaging. The biggest mistake people make is passing around structure and logic as parameters to other logic. You should avoid moving around logic and structure at runtime; only messages should move between objects; the simpler the messages, the better. And note that 'avoid' doesn't mean never, but it means you have to be extremely careful when you do violate this principle, and there should be a really good commercial reason to do so. I.e. you should exhaust other reasonable approaches first.
My journey is quite similar. My mental model got a huge boost after I read and understood Leslie Lamport's early work and the work of Edward Lee about getting deterministic results in the presence of concurrency. I even found the earliest paper with a mathematical proof that write and read must be separated in time or space (the mathematical basis of the Rust borrow checker), but I can't find it anymore.
Yeah, passing by value or "Value semantics" can prevent many programming errors. Passing references to immutable data can serve a similar purpose. In low-level languages where memory layout and calling convention map to target hardware, there are differences in performance to consider.
Pass by value would indeed make a big difference to how programs are structured and make it easier to reason about programs.
I just want to point out that "concurrency safety" is very much a term, although "thread safety" is more common. These are broadly part of memory safety, which is a topic mainly due to security concerns but also academic study.
The two perspectives are not perfectly congruent. Non-concurrency-safe languages like Go can also be considered broadly memory safe. The pragmatic rationale is that data races in GCed languages are much less exploitable. From an academic, principle-based view this is unsatisfying and unconvincing, as one would prefer safety to be a matter of semantics. See also https://www.ralfj.de/blog/2025/07/24/memory-safety.html
Rust uses "fearless concurrency" as a slogan. Rust offers more options than passing by value (Copy) while still guaranteeing safety through static type checking.
There is also research for GCed languages to establish non-interference, e.g. Scala capture checking.
Concurrency is recognized as difficult (at least by people who are knowledgeable), and programming language design usually involves pragmatic choices if you need concurrency. If the language does not provide the primitives or spec that enable safety, then you are left with patterns and architecture.
The science is still evolving; it is certainly not the case that nobody cares. Rather, progress is slow, and moving ideas from research to industry is even slower. How much value we ascribe to correctness, safety and performance in industry depends very much on the context.
The Alan Kay viewpoint (he is NOT the inventor of OOP [1]) is considered the least helpful viewpoint on OO design. The "magical" and unhelpful "it's all about messages" perspective, that helps you not at all unless one is talking about the internal implementation of a platform like Smalltalk. Consider the views of the real inventors - Nygaard and Dahl.
[1] I don't think I invented "Object-oriented" but more or less "noticed" what was really powerful about just making everything from complete computers communicating with non-command messages. This was all chronicled in the HOPL II chapter I wrote "The Early History of Smalltalk". - Alan Kay
Let's not have the dashboard access the temperature by doing `GetSurroundingCar().engine.temperature`.
If the dashboard needs to get the temperature from a sensor in the engine, it should be able to "talk" to the sensor, without going through the car object.
In ideal OOP, a "method call o.m(...)" is considered a message m being sent to o.
In practice, field access, value and "data objects" etc are useful. OOP purism isn't necessarily helping if taken to the extreme.
The pure OOP idea emphasizes that the structure of a program (how things are composed) should be based on interactions between "units of behavior".
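A tiny, hypothetical TypeScript sketch of that idea (all class and method names are invented for illustration): the dashboard depends only on a narrow sensor interface and sends it one message, instead of reaching through the car into the engine.

```typescript
// Hypothetical sketch (all names invented): the dashboard sends a message to a
// narrow sensor interface instead of reaching through car.engine.temperature.
interface TemperatureSensor {
  readCelsius(): number; // the only "message" the dashboard ever sends
}

class EngineTemperatureSensor implements TemperatureSensor {
  constructor(private readonly probe: () => number) {}
  readCelsius(): number {
    return this.probe();
  }
}

class Dashboard {
  // Depends on the small sensor interface, not on Car or Engine internals.
  constructor(private readonly sensor: TemperatureSensor) {}
  render(): string {
    return `Engine temp: ${this.sensor.readCelsius()} C`;
  }
}

// Wiring happens in one place; the dashboard never sees the car's structure.
const dashboard = new Dashboard(new EngineTemperatureSensor(() => 92));
console.log(dashboard.render()); // "Engine temp: 92 C"
```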
1. Avoid passing live instances (by reference) to other instances as much as possible, because you don't want many instance references to be scattered too widely throughout your codebase. This can cause 'spooky action at a distance', where the instance state is modified by interactions occurring in one part of the code and it unexpectedly breaks a different module which also has a reference to that same instance in a different part of the codebase. The more broadly scattered the reference is throughout the codebase, the harder it is to figure out which part of the code is responsible for the unexpected state change. These bugs are often very difficult to track down because stack traces tend to be misleading: they don't point you to the event which led to the unexpected state change that later caused the bug.
2. Avoid overly complex function parameters and return values. Stick to passing simple primitives; strings, numbers, flat objects with as few fields as necessary (by value, if possible). Otherwise, it increases the coupling of your module with dependent logic and is often a sign of low-cohesion. The relationship between cohesion and coupling tends to be inversely proportional. If you spend a lot of time thinking about cohesion of your modules (I.e. give each module a distinct, well-defined, non-overlapping purpose), the loosely-coupled function interfaces will tend to come to you naturally.
The metaphor I sometimes use to explain this is:
If you want to catch a taxi to go from point A to point B, do you bring a steering wheel and a jerry-can of petrol with you to give to the taxi driver? No, you just give them a message; information about the pick up location and destination. This is an easy to understand example. The original scenario involves improper overlapping responsibilities between you and the taxi service which add friction. Usually it's not so simple, the problem is not so familiar, and you really need to think it through.
We understand intuitively why it's a bad idea in this case because we understand very well the goal of the customer, the power dynamics (convenience of the customer has priority over that of the taxi driver), time constraints (customer may be in a hurry), the compatibility constraints (steering wheel and fuel will not suit all cars). When we don't understand a problem so well, an optimal solution can be difficult to come up with and we usually miss the optimal solution by a long shot.
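A rough, hypothetical TypeScript sketch of the two rules above (names invented): the request is a small, flat message passed by value, the reply is another message, and the caller never hands the service a live object it can mutate or rummage through.

```typescript
// Hypothetical sketch (names invented): pass a small, flat, by-value message,
// not a live instance the other side can hold onto or modify.
type RideRequest = {
  readonly pickup: string;
  readonly destination: string;
};

type RideReceipt = {
  readonly fare: number;
  readonly etaMinutes: number;
};

class TaxiService {
  // Message in, message out; no references to the customer's internals.
  requestRide(request: RideRequest): RideReceipt {
    console.log(`Dispatching a car to ${request.pickup}`);
    const distanceKm = 7; // toy value standing in for real routing
    return { fare: 3 + distanceKm * 1.5, etaMinutes: 5 + distanceKm };
  }
}

const receipt = new TaxiService().requestRide({
  pickup: "12 Main St",
  destination: "Airport Terminal 2",
});
console.log(receipt); // { fare: 13.5, etaMinutes: 12 }
```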
nice post, lately ive been dealing with concurrency, between threads and processes. trying to keep it cross platform as well, its a lot to learn. if you have large buffers and want to keep some semblance of performance, its VERY interesting understanding all the transfer mechanisms and cache levels involved. i feel these are the sorts of things my education skipped, it was all very focused on the static structure of objects not the dynamics of data transfer.
> More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing.
That's one thing I never care to do unless I'm the one making the technical decisions. What I do is build the thing, but with defensive programming in place. I take care of making sure that my code is good, then harden any interface so that I can demonstrate that I'm not the cause of new bugs. People will be careless, so make sure that you have blast doors between your work and theirs.
And I do take time to learn about the abstractions of the new shiny tools, even when it's overengineered. Going blind and making mistakes is not my cup of tea.
"Any problem in computer science can be solved with another layer of indirection, except of course for the problem of too many layers of indirection." Bjarne Stroustrup
That's why you see hundred-level call stacks, polymorphism with a single implementation, and errors still hidden or root causes buried behind "exception caught".
> In the world of computing, we tend to abstract away complexity. Doing so seems liberating. It enables us to focus on the bigger picture. Unfortunately, in doing so, the fidelity of our understanding often decreases. We sometimes end up blinding ourselves.
Some "Java in the 90s" understanding of abstraction. Proper abstractions break complexity into composable elements. Hence, the fidelity of our understanding increases.
just share the damn thing, someone may have something for you ;)
...I've kind of rarely seen these ppl complaining about work actually sharing their resume or a condensed description of their skills, knowledge and experience
ok, f googled it and found it: ~"entry-level/junior sysadmin and cyber"
so, a path could be picked from what you know:
1. devops/sre - really hard to get above entry-level without real experience and you _will_ be competing head on with AI ...ouch
2. cyber - same with whitehat as with devops/sre ...basically go full red-team / blackhat / offensive for a while, then get certs and a portfolio, then a job in "real cyber" ...BUT ppl that do this tend to have a "very specially broken brain", so if you haven't done this already you're probably not one of them [probably for the best]
...but they're probably all bad, so better DO SOMETHING ELSE ENTIRELY:
...gtfo of software, you're likely not gonna become an "agent herder" with your skillset, mentality and experience - in the US, probably go all in on agriculture [recent US protectionism and isolationism will give you decent levels and a shield from globalized markets], learn some minimal hardware tinkering to automate drones and later manage android workers, use software for planning farming automation etc... hire hands for physical labour and BUILD AND MANAGE A FARM or something like that (maybe farm + restaurant or smth else from tourism / hospitality)
All three of the sectors you've mentioned are not in a good place right now. It's probably much less stressful to be an unemployed programmer than to try to make a hobby-scale farm profitable with soaring fuel and fertilizer prices, along with a labor force that is fleeing.
E: Farm automation probably has some juice though, regardless of how close the androids I keep seeing in demos actually are.
It's not just tech, other industries are experiencing the same hiring woes. I think the economy is deeply broken, it shouldn't work like this and it doesn't seem like there is any hope in fixing it - governments just continue to run up debt as if they can just keep kicking the can down the road indefinitely. eventually the can becomes a brick and you break your foot.
there will be a reset at some point, and software developers will be needed. especially when every piece of software stops working. idk if that will happen before or after an economic collapse tho.
i have no idea where things will go in the future, but i doubt it will be much fun
Oof. There are two pieces to this story. One is great and one is heartbreaking.
The fact that modern tech has disintermediated people with problems to solve from the need for a "priest class" to commune with the machine to solve the problem is a great thing. It's the goal. The more we do it the better we are making the world for humans.
... the fact that people need to work to eat or provide anything above a subsistence quality of life is not only tragic, it's increasingly abhorrent in a world where automation and simplification via machines has freed up this much raw resource and free time.
If we're pitting LLMs against people's ability to provide for their families, we have lost the thread on why we're doing any of this.
Those resources are being redirected to create entertainment areas for the rich like golf courses, 7 star luxury hotels and villas. This is the modern predicament.
Not the automation, but the way... we have gone farther since agricultural and energy domestication... but profit as the main director is less than suboptimal, it is tragic. Having known about many accidents in complex systems, it is madness to see things at this point in the most complex of systems, which is society.
I may be missing something, but this doesn't read to me like an abstraction or AI-related problem.
It sounds more like a packaging issue. I know he's attempted to edit his resume, but there's missing information here that OP may not even be aware of.
For instance, I recently became the last of two candidates interviewing for a great opportunity that I sadly lost. When I received feedback, it turned out the hiring committee had a completely different sense of one aspect of my work than I had attempted to convey. I'm glad I got the feedback, but it was frustrating to lose after so many interviews.
Then just recently, I interviewed a candidate at my current company who reminded me of OP. Laid off worker, very nice guy, but he had no idea how to portray himself as a dev at the level he was applying for.
I wanted to call him up and coach him, but it didn't seem appropriate, especially since he didn't ask for feedback.
If you are in this position, find a free coaching program that can help you revamp and resell what you have to offer.
It's not fair to have to do that just to get a chance to be paid a fair wage. But companies get thousands of resumes a month and do dozens of interviews.
We try to give candidates a chance to show us who they are, but if what they are showing us doesn't line up with the role, or their strengths are buried, there's only so much we can infer. It sucks, because the resume and interview are not the job. But they are the gate you have to get through before anyone sees the work.
I hope I donāt come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didnāt have a ton of competition. In that kind of environment, there is space for a focus on the actual fundamentals, the things-in-themselves, the theory behind the action. Most of this stuff is good and it was a beneficial situation to have that buffer space to allow it.
But ultimately business reality has changed, largely because achieving business goals is dramatically easier with AI tools. This undercuts a lot of the focus on building solid fundamentals, and in a lot of cases thatāll come back to bite the business. But in many scenarios it wonāt, and the industry will rumble on.
Those of us working in marketing or journalism or education have already been forced to accept this new reality decades ago, largely because of inventions by software developers. Now devs are just late to their own party.
I've worked 10+ years as a developer in France where salaries weren't too high to begin with, but I certainly noticed the added competition as it was harder to find a job. I stopped "fighting" for a high-paying role so my experience didn't provide net gains but it still protected me from inflation. The net "gains" rather came from spending less by moving from a rent to a mortage and then making it smaller.
I'm OK with this now, it is what it is, but these years weren't smooth as there were ups and downs and a down after an up can be stressful if you're not ready for it.
In many European countries there aren't high salaries and above-average job security for developers, you are considered an office worker like everyone else, this isn't Silicon Valey over here, especially if you come from the Southern Europe countries.
āThe actual fundamentals, the things-in-themselves, the theory behind the actionā donāt go away, they change.
Programmers used to work with punch cards, then assembly, then low-level languages with odd quirks. Today few developers even think about first-party code size, micro-optimizations, register allocation, etc. LLMs are just another abstraction.
A developer with the ideal AI code writer (which weāre not at yet) must still think about idea, design, scope, etc. like a product owner or manager. And these concepts have theory, sometimes even math (e.g. time complexity).
EDIT to comment on the article: all abstractions are leaky, but sometimes it rarely matters. Today we do still need to understand code quality and architecture when working with LLMs, or the software will get bad enough that it will affect the company. But maybe not next year. An analogy: stack vs heap, memory allocations, etc. still matter in high-performance software, which isnāt uncommon, but programmers almost never think about register allocation.
LLMs are not another abstraction. ALL OTHER LAYERS you named are fully deterministic, understood, debuggable, etc.
You cannot be serious.
This isnāt harsh at all. As Iāve commented before (but this time as well I do no have the receipts/links), itās been reported that highly paid programmers in the US also brought in a ton of profit; it was not at all the case that their employers had some thin profit margins because the labor was expensive to them. We talking one million USD profit for a 100K USD salary.
They didnāt even earn anything close to what they were worth. According to Marxā Labor Theory of Value anyway.
However the dice fall now, one of the possible outcomes is that the tech billionaires take that 100K USD for themselves. The very deserving individuals whose job is to sit their arses on automation assets.
Meanwhile workers from other sectors can gloat about how they are now in the same boat as them. The boat of accepting your ever-meagre reality.
> that highly paid programmers in the US also brought in a ton of profit
In Germany for instance I've seen many a company that treated their programmers as a cost center and they actually were (probably a mutually reinforcing self-fulfilling prophecy).
Too many instances of programmers being deployed in such a way that I couldn't possibly see a way that they would get back even that meagre investment that was being made. Fully irrational dev teams doing useless busy work.
Most German "startups" used to be replaceable with Zapier and Pipedrive. That has probably only gotten worse with the advent of LLMs.
Or the margins shrink significantly as the space becomes more mature and competitive, and that surplus mostly goes away.
I'm in a similar position to the OP, unemployed for about 10 months, with tons and tons of applications sent both remote and local, and yeah not sure where this is gonna go or what I'm supposed to do. Also disabled, my eyes don't work so that automatically removes many, many non-software jobs I'd otherwise do from the equation.
Don't even really have anything else to say other than that, but maybe commenting it somewhere helps someone else realize they're not alone. I don't know how that helps you or me, but that's what I got. Maybe there's still something for us somewhere, but it is very difficult to stay motivated, and I don't have an answer.
I'm not in your situation, but I've hit the bottom of the despair and found the inner "fuck it we ball" within me. I don't know what's an option for you, but I'm learning bartending, stocking shelves, and having irresponsible sex with the young women I work with in retail.
I enjoy software development and hopefully one day I will return to it, but I am but one tiny kernel of corn in such a mighty ocean of shit so I might as well right the waves instead of fighting them. Maybe your calling is scamming Indians or scamming Americans or scamming Indian scammers. You aren't alone but the attitude you have will never stop mattering. See if you want to go back to school, start a tutoring program for kids. Motivation is for morons, do something.
Its only 11:37am where I am, but this is the sanest thing I've heard so far today.
I had to burn out to obtain the insight :-/
I agree with your message but maybe donāt have sex with young co-workers.
I'm guessing everybody in this interaction is an adult.
Why?
Only violate one proverb at a time. If you shit where you eat, don't rob the cradle. If you rob the cradle, don't shit where you eat.
...puritans will be puritans
From the well-written article:
A major quandary companies are finding themselves in is "resume fraud", which can be defined here as being inundated with applicants only to find 99%+ have used GenAI to produce a bogus work history tuned to satisfy the job posting. To the point where many companies simply give up trying to identify "real" applicants via online submissions.It is analogous to email spam in the 90's, before anti-spam technology was mature.
Yeah it's pretty bad. Can't even browse projects on Reddit because a lot of them are just slop
Oddly enough, the solution lies in what was previously replaced; staffing firms.
Staffing companies have recruiters which vet candidates to varying degrees of success. At minimum, they establish the candidate:
- is a human
- lives where they claim to live
- has worked where they claim to have worked
- has eligibility to work for one or more of their clients
If nothing else, the above eliminates much of the "99% resume fraud" problem companies are dealing with now.
I've been thinking this for a while now, but I feel like especially with the rise of crazy salaries in AI research, it's time for software development to have its agency moment. Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent, and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.
Iāve seen a setup like this in software for some specific high-demand SAP (ERP) consultancy roles. The nature of SAP migrations are per-project in nature (you wouldnāt want to migrate your companyās ERP all the time). The person had such a skillset that they had what is effectively an āagentā who would negotiate their next job assignment. The agent was even baked into the contract with the client as a party, I donāt recall how much of the hourly this agent would get, but they were invoicing the company separately.
At least this is what I recall.
Meta: this is probably the first time this year where I use the word agent to refer to a human. Feels odd even.
> Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent ...
This is what recruiters in quality staffing firms do. Granted, there are many staffing firms which are worthless body-shops. But those are not reputable. :-)
> ... and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.
The equivalent has traditionally been hiring managers who work with approved staffing companies, both to ensure those companies provide value as well as to foster an understanding of the people/skills needed by the organizations.
Wise organizations use multiple staffing firms and perform internal audits in order to minimize complacency/corruption.
I feel the pain, and if I get unemployed now on my 50's, most likely I will do something else outside computing.
Everyone that praises how they get more productive always forgets that means big corp now needs less of us.
I work on enterprise consulting, and have watched how the change into managed cloud infrastructure, followed by low-code/no-code tooling, has had an impact on team sizes, meaning less devs for the same outcome.
AI driven development is reducing those team sizes even further.
In many European countries, gettting jobs at a later age is really an almost impossible task, the easiest solutions end up trying to get early retirement status, or go self employed, which also isn't without its own set of complications.
More than ever is time to be stoic. Have things but live as having nothing. But as obvious as the author says it was predictable too.
By now... I see in my country high prices for laptops with only 4Gb of Ram and Celerons.
It could do wonderful things if in 2000s people didn't buy the argument that hardware is so cheap so lets write unefficient code. Same hardware that could play an Youtube video in 2000s today cannot even open the website. Electron send hugs...
Now people are mad about AI until when? Oceans be drought like in Oblivion movie?
And professionals? The generation of specialists will pass... and people will blindly depend on Ai soon if the course of things doesn't stop or at least be corrected.
I think the author could have brighter days in future (and still thing in present in some hidden niches) as knowledge will always precious.
The main lesson I have is buy less TI and every buzz promises and find the place where knoledge and craft walk side by side.
But all that advice. Never worked out. Be stoic. Be supportive. Be.. to Not to be. Accept that beeing some eldritch gods lunch is your destiny. Stoic is what we expect the cattle to be as it goes up the ramp. Do not go quietly into the night, rage against the dying of the light.
The great meat grinder doesn't care either way. Stoic or screaming and kicking, you're going in.
So sayeth a Head of Engineering.
Correct. Stoicism was for two audiences. Those doing the killing to be indifferent toward it and for those being killed to be indifferent about it.
Marcus Aurelius the historical figure was a monster who killed a measurable portion of humans alive at the time
I mean... so did a lot of other rulers. As far as emperors go, Aurelius wasn't that bad. You have to judge historical people by their peers, not by your own modern standards.
Stoicness is for the lamb and the wolf alike; mindfulness is for the monk and the samurai killing on behalf of their lord alike.
We are not so one-dimensional that good mind habits are the one and only thing we do and act on.
But that was a stoic comment.
I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.
More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing. They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.
We've been moving to React, replacing an internal framework that's worked wonders for us we've been using for over a decade. The biggest part of the move is "hiring".
My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".
Everything is giant overbuilt and terrible because most people never bothered to learn even a single level up from where they do most of their work. The people that do become unhirable. Everything takes hundreds or thousands more cycles and electricity it should because people can't be bothered to understand what they're doing.
> They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.
That's because they don't know what to build that will be a successful product, so they essentially try to brute force this question of "what to build" by trying different ideas quickly and see which one will stick. And in this quick iteration loop people just throw bunch of stuff together to make something and once that something gains traction they will keep piling on top of that shaky foundation.
Well, if more react devs knew how it worked under the hood they might choose something else[1] :-)
Jokes aside, if you don't need two-way data binding, using react frameworks pulls in a lot of crap that you never need.
The majority of web apps have no need for react
ā--------
[1] I always joke that the reason I am atheist is not because I don't know much about your religion, it's because I know too much about your religion.
Hardware is cheap ; human labor is not. Companies have figured out that the best way to extract monye from customers is to give them something that barely works now, rather then something that works great later.
That's not true if you are on cloud. Clumsily written software becomes really expensive to run.
> Hardware is cheap ; human labor is not.
Especially true when you're paying for neither hardware nor labor.
Writing inefficient client-side software, whether it's desktop or webshit, makes the customers / users pay for the hardware, and pay with their time.
Is it, though? Can we really keep saying that "hardware will always be cheaper than human labour" when RAM prices are soaring, GPUs are becoming prohibitively expensive, and we're looking at a probably chip shortage?
I think the era of "poor software for fantastic hardware" is coming to an end.
RAM + GPU are getting more expensive but mostly for applications that require a lot of it like AI. The hardware cost for regular applications has not vastly increased (especially when factoring in inflation). Spending 2x development time on a problem often is not worth it (or only with large deployments).
UI development is an even more special case here. The customer buys the machine which runs the code, not the company. So sadly "good enough" is the standard.
One example for me here is the "switch product option" button on Amazon listings (e.g. switch green to blue color, smaller to larger model). On my phone this sometimes takes >5 seconds to properly load. Horribly optimised.
> I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies is more of a liability than a virtue.
This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.
I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass as I couldn't name anybody except the most popular artists. You see peoples' reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.
What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.
> My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".
The problem in software is it seems that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.
There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.
> Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn.
Design time, code time, compile time, run time. Why all that potentially wasteful upfront work?
The next step are shipped applications whose help menu is a chat interface that responds to all user questions of the form "How do I ...", with a short pause to add a new hack to the existing pile, and then some upbeat instructions.
In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.
> In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.
If I had to sum up the zeitgeist of the '90s techno-optimism it would be this persistent, confident prediction that once people just learned _how_ to use computers, and everyone is a power user everything will be fine! Despite the mounting evidence that actually, no, like everything else in reality the distribution of skill is a bell-curve with the median sitting uncomfortably low for those who, to quote OP, "lived on IRC or in the bash terminal".
Free universal education didn't fix this problem, LLMs won't fix this problem. Man's natural paucity is no longer in the availability or accessibility of knowledge. The liberal ideal that all we must do is empower the individual turns out to not have been the solution to everything forever.
But hey, being self-aware enough to make productive use of this new technology is probably _some_ kind of edge.
May as many as possible survive.
I can confidently say that I know few, if any, people truly interested in understanding technology, except for strangers online.
Your words resonate with me. Even before LLMs, I've been disappointed with the general direction the software industry took in the 2010s. Today's software industry is not the industry of Licklider, Engelbart, Bob Taylor, Alan Kay, Woz, Stallman, Ritchie, Thompson, Pike, Joy, and many others whom I admire, who helped establish an ethos of computing that fostered a sense of freedom, creativity, and wonder.
Instead, what we have today is a computing ecosystem dominated by powerful players who care about money and control. Speaking from the standpoint of a Bay Area resident, since roughly 2012 the field has been increasingly taken over by people who are in it for the money. Combine that with Alan Kay's observation that computer science is a "pop culture" that often lives in the moment and has little regard for the past, and also with the "move fast and break things" attitude that permeates modern software development, and this has created an environment that seems hostile to the types of nerdy pursuits the industry once encouraged. The working environments of many major software companies and the products they release are a reflection of the values of the companies' executives, managers, and shareholders.
While I'm not anti-AI, I see agentic coding as another step in the direction the software industry was already heading, where it can move even faster and break even more things.
There is still wonder, joy, and freedom in computing, but I feel this is increasingly confined to the hobbyist world and certain niches in research environments.
sounds like you're working at the wrong place. detailed computing knowledge and maths is essential in some industries and, like you said, scorned in others. i couldn't think of anything worse to do with my time than spend all day with MBAs or webdevs (lol, i'm sorry, that's unfair, web development is complex with all the callbacks and sync issues).
Thank you, I was starting to wonder.
I guess because I'm in game dev maybe, but in all my jobs knowing about the underlying stack has either been necessary knowledge or highly regarded.
I can't think of any time in my career where knowing about the internals of the stack was ever frowned upon or where it's been anything other than an advantage (especially when hunting bugs). I must have been lucky.
people will accuse you of "gatekeeping" because you shouldn't need to have any knowledge or skill to do stuff. those things are unimportant, even bad, because anything requiring those is inherently exclusionary. lmao.
This has been obvious to me since I graduated with a BIT majoring in 'Software design.' I literally went to university with software design and software architecture being my core interests.
When I graduated, I was shocked to learn that no company cared about any of the architectural concepts that I had learned. UML class diagrams, sequence diagrams, ER diagrams, etc... had been on the way out. At one point, as internet companies were scaling up, there was a brief resurgence of interest in sequence diagrams... Especially as a communication method when explaining complex bugs or complex message-passing scenarios. But it didn't really last. Nowadays most software is riddled with race conditions and deep, exploitable architectural flaws. Cryptocurrencies have been victims of many such attacks. Billions of dollars have been lost to race conditions... And those are just the ones which were discovered. They are notoriously difficult to find post-implementation.
The programming primitives that we're using today aren't optimized to avoid race conditions or even to encourage good concurrency patterns; quite the opposite: they encourage convenient but disorganized parallelization, and they're optimized to put the focus on type safety, which is a far less concerning issue. A lot of people who were rightly alarmed by gaps in schema validation (which is critical at API boundaries) became overly obsessed with type safety (which is a broader concern). I have built some async primitives for Node.js, and nobody cared! NOBODY! Other developers have had the same experience with most other languages. I think only a few niche languages like Elixir actually treated it as important. But nobody even acknowledged that the problem could be remedied in existing languages. It's so bad that it seems as though some people wanted it to be that way.
The term 'concurrency safety' doesn't even exist! Some have a vague idea about thread-safety OK, that's very specific to one particular concurrency primitive... but what about the concurrency of asynchronous logic (much more common nowadays)? I have felt thoroughly suppressed in that regard in my career.
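To make the async-concurrency point concrete, here is a minimal TypeScript sketch (all names like `getBalance` are hypothetical, not from any real codebase): a check-then-act across an `await` lets two withdrawals both pass the balance check, no threads required; one crude remedy is to serialize the critical section behind a promise chain acting as a mutex.

```typescript
// Hypothetical in-memory "store"; stands in for a real database call.
let balance = 100;
const getBalance = async (): Promise<number> => balance;
const setBalance = async (value: number): Promise<void> => { balance = value; };

// Unsafe: both concurrent callers read 100 before either write lands,
// so two withdrawals of 80 both "succeed" (a lost update).
async function withdrawUnsafe(amount: number): Promise<void> {
  const current = await getBalance();
  if (current >= amount) {
    await setBalance(current - amount);
  }
}

// Crude remedy: a promise chain acting as a mutex around the critical section.
let lock: Promise<void> = Promise.resolve();
function withdrawSafe(amount: number): Promise<void> {
  const run = lock.then(async () => {
    const current = await getBalance();
    if (current >= amount) {
      await setBalance(current - amount);
    }
  });
  lock = run.catch(() => {}); // keep the chain alive even if a withdrawal throws
  return run;
}

async function main(): Promise<void> {
  await Promise.all([withdrawUnsafe(80), withdrawUnsafe(80)]);
  console.log(balance); // 20, yet 160 was handed out

  balance = 100;
  await Promise.all([withdrawSafe(80), withdrawSafe(80)]);
  console.log(balance); // 20, and the second withdrawal was correctly refused
}
main();
```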
The only voice on the subject of architecture that got through to the 'mainstream' was Martin Fowler (one of the authors of the Agile Manifesto). After that, there was Dan Abramov of Redux fame. Some notable opinionated architecture books were published, but none really identified the underlying essential philosophy of good architecture.
The best, most succinct quote I ever read on the subject was from Alan Kay (inventor of OOP) who said "I'm sorry that I long ago coined the term 'objects' for this topic because it gets many people to focus on the lesser idea. The big idea is messaging."
I like that quote for many reasons: firstly, because it shows wisdom; secondly, it tells you what the issue is, very simply; and thirdly, it hints at the importance of 'focus' in this discipline, where we are saturated with thousands of complex, overlapping, and partially conflicting ideas.
I think the FP trend was somewhat of a red herring. Same with type safety. Yes, they were useful to some extent, and there are some really good ideas in there, but people got so caught up in them that the most fundamental area of improvement was ignored entirely. To me, the core value proposition of FP can be reduced to "pass by value is safer than pass by reference." Consider that in the context of Alan Kay's "The big idea is messaging." Is an object reference a message? NO! A live instance is not a message! Precisely. His point supports pass-by-value; furthermore, it encourages succinct, minimal parameters.
Good architecture is rooted in two core concepts: 1. loose coupling and 2. high cohesion, and you achieve those by separating logic and structure from messaging. The biggest mistake people make is passing around structure and logic as parameters to other logic. You should avoid moving logic and structure around at runtime; only messages should move between objects, and the simpler the messages, the better. Note that 'avoid' doesn't mean never, but it means you have to be extremely careful when you do violate this principle, and there should be a really good commercial reason to do so. I.e., you should exhaust other reasonable approaches first.
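A rough TypeScript sketch of that principle, using invented names (`Order`, `notifyCoupled`, etc.): the first version hands a live instance across the boundary, coupling the callee to the object's structure and letting it mutate shared state; the second passes only a small, read-only message.

```typescript
// Coupled version: the notifier receives the whole live Order instance,
// so it now depends on Order's structure and can mutate shared state.
class Order {
  constructor(public id: string, public total: number, public customerEmail: string) {}
  applyDiscount(pct: number): void { this.total *= 1 - pct; }
}

function notifyCoupled(order: Order): void {
  // Anything here can call order.applyDiscount(...) or read fields it shouldn't.
  console.log(`emailing ${order.customerEmail}: order ${order.id}, total ${order.total}`);
}

// Message version: only a small, flat, read-only value crosses the boundary.
interface OrderConfirmed {
  readonly orderId: string;
  readonly total: number;
  readonly email: string;
}

function notifyWithMessage(msg: OrderConfirmed): void {
  console.log(`emailing ${msg.email}: order ${msg.orderId}, total ${msg.total}`);
}

const order = new Order("A-17", 120, "kim@example.com");
notifyCoupled(order); // shares the live instance
notifyWithMessage({ orderId: order.id, total: order.total, email: order.customerEmail });
```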
My journey is quite similar. My mental model got a huge boost after I read and understood Leslie Lamport's early work and the work of Edward Lee on getting deterministic results in the presence of concurrency. I even found the earliest paper with a mathematical proof that writes and reads must be separated in time or space (the mathematical basis of the Rust borrow checker), but I can't find it anymore.
- https://lamport.azurewebsites.net/pubs/time-clocks.pdf
- https://en.wikipedia.org/wiki/Chandy%E2%80%93Lamport_algorit...
- https://www2.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-...
Yeah, passing by value or "Value semantics" can prevent many programming errors. Passing references to immutable data can serve a similar purpose. In low-level languages where memory layout and calling convention map to target hardware, there are differences in performance to consider.
Pass by value would indeed make a big difference to how programs are structured and make it easier to reason about programs.
I just want to point out that "concurrency safety" is very much a term, although "thread safety" is more common. These are broadly part of memory safety, which is a topic of interest mainly due to security concerns but also academic study.
The two perspectives are not perfectly congruent. Non-concurrency-safe languages like Go can also be considered broadly memory safe. The pragmatic rationale is that data races in GCed languages are much less exploitable. From an academic, principle-based view this is unsatisfying and unconvincing, as one would prefer safety to be a matter of semantics. See also https://www.ralfj.de/blog/2025/07/24/memory-safety.html
Rust uses "fearless concurrency" as a slogan. Rust offers more options than passing by value (Copy) while still guaranteeing safety through static type checking.
There is also research for GCed languages to establish non-interference, e.g. Scala's capture checking.
Concurrency is recognized as difficult (at least by people who are knowledgeable), and programming language design usually involves pragmatic choices if you need concurrency. If the language does not provide the primitives or spec that enable safety, then you are left with patterns and architecture.
The science is still evolving; it is certainly not the case that nobody cares. Rather, progress is slow, and moving ideas from research to industry is even slower. How much value we ascribe to correctness, safety, and performance in industry depends very much on the context.
> only messages should move between objects
Can you provide an example for this?
The Alan Kay viewpoint (he is NOT the inventor of OOP [1]) is considered the least helpful viewpoint on OO design. The "magical" and unhelpful "it's all about messages" perspective helps you not at all unless one is talking about the internal implementation of a platform like Smalltalk. Consider the views of the real inventors, Nygaard and Dahl.
[1] I don't think I invented "Object-oriented" but more or less "noticed" what was really powerful about just making everything from complete computers communicating with non-command messages. This was all chronicled in the HOPL II chapter I wrote "The Early History of Smalltalk". - Alan Kay
Say you have a Car, Engine and Dashboard object.
Let's not have dashboard access the temperature by doing `GetSurroundingCar().engine.temperature`
If the dashboard needs to get the temperature from a sensor in the engine, it should be able to "talk" to the sensor without going through the car object.
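A minimal TypeScript sketch of that Car/Engine/Dashboard idea (names are illustrative, not from any real codebase): the sensor publishes small temperature messages, the dashboard only consumes them, and the car merely wires the two together rather than sitting on the data path.

```typescript
type TemperatureReading = { celsius: number };
type Listener = (reading: TemperatureReading) => void;

// The engine-side sensor only publishes messages; it never hands out itself.
class TemperatureSensor {
  private listeners: Listener[] = [];
  subscribe(listener: Listener): void { this.listeners.push(listener); }
  publish(celsius: number): void {
    for (const listener of this.listeners) listener({ celsius });
  }
}

// The dashboard only receives messages; it never reaches into the engine.
class Dashboard {
  private lastReading: TemperatureReading | null = null;
  onReading(reading: TemperatureReading): void { this.lastReading = reading; }
  render(): string {
    return this.lastReading ? `Engine temp: ${this.lastReading.celsius} C` : "Engine temp: --";
  }
}

// The "car" only wires things together; afterwards it is not on the data path.
const sensor = new TemperatureSensor();
const dashboard = new Dashboard();
sensor.subscribe(reading => dashboard.onReading(reading));

sensor.publish(92);
console.log(dashboard.render()); // "Engine temp: 92 C"
```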
In ideal OOP, a "method call o.m(...)" is considered a message m being sent to o.
In practice, field access, values, and "data objects" etc. are useful. OOP purism isn't necessarily helpful if taken to the extreme.
The pure OOP idea emphasizes that the structure of a program (how things are composed) should be based on interactions between "units of behavior".
1. Avoid passing live instances (by reference) to other instances as much as possible, because you don't want instance references to be scattered too widely throughout your codebase. This can cause 'spooky action at a distance,' where the instance state is modified by interactions in one part of the code and unexpectedly breaks a different module which holds a reference to that same instance in another part of the codebase. The more broadly scattered the reference, the harder it is to figure out which part of the code is responsible for the unexpected state change. These bugs are often very difficult to track down because stack traces tend to be misleading: they don't point you to the event that led to the unexpected state change which later caused the bug. (A small sketch of this follows after the list.)
2. Avoid overly complex function parameters and return values. Stick to passing simple primitives: strings, numbers, flat objects with as few fields as necessary (by value, if possible). Otherwise, it increases the coupling of your module with dependent logic and is often a sign of low cohesion. The relationship between cohesion and coupling tends to be inversely proportional: if you spend a lot of time thinking about the cohesion of your modules (i.e., give each module a distinct, well-defined, non-overlapping purpose), the loosely coupled function interfaces will tend to come to you naturally.
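Here is the promised sketch of rule 1, in TypeScript with made-up names: two modules are handed the same live config object, a "local" mutation in one breaks the other far away, and copying the value at the boundary removes the coupling.

```typescript
// Hypothetical shared mutable config: the "live instance" both modules hold.
const sharedConfig = { retries: 3 };

function tuneForBatchJob(config: { retries: number }): void {
  config.retries = 0; // looks like a local tweak, actually mutates shared state
}

function callFlakyService(config: { retries: number }): void {
  if (config.retries === 0) {
    throw new Error("retries unexpectedly disabled (who did this, and where?)");
  }
  // ...would retry up to config.retries times...
}

// Spooky action at a distance: the failure shows up far from its cause.
// tuneForBatchJob(sharedConfig);
// callFlakyService(sharedConfig);  // throws, and the stack trace won't say why

// Safer: each module gets its own copy (pass by value at the boundary).
tuneForBatchJob({ ...sharedConfig });
callFlakyService({ ...sharedConfig }); // still sees retries === 3
```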
The metaphor I sometimes use to explain this is:
If you want to catch a taxi to go from point A to point B, do you bring a steering wheel and a jerry can of petrol with you to give to the taxi driver? No, you just give them a message: information about the pick-up location and destination. This is an easy-to-understand example; the original scenario involves improper, overlapping responsibilities between you and the taxi service, which add friction. Usually it's not so simple, the problem is not so familiar, and you really need to think it through.
We understand intuitively why it's a bad idea in this case because we understand very well the goal of the customer, the power dynamics (the convenience of the customer has priority over that of the taxi driver), the time constraints (the customer may be in a hurry), and the compatibility constraints (a steering wheel and fuel will not suit all cars). When we don't understand a problem so well, an optimal solution can be difficult to come up with, and we usually miss it by a long shot.
nice post, lately i've been dealing with concurrency, between threads and processes, trying to keep it cross-platform as well. it's a lot to learn. if you have large buffers and want to keep some semblance of performance, it's VERY interesting understanding all the transfer mechanisms and cache levels involved. i feel these are the sorts of things my education skipped; it was all very focused on the static structure of objects, not the dynamics of data transfer.
> replacing an internal framework that's worked wonders for us we've been using for over a decade
Can you share what this internal framework is?
> The biggest part of the move is "hiring".
By that they mean outsourcing.
> More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing.
That's one thing I never care to do unless I'm the one making the technical decisions. What I do is build the thing, but with defensive programming in place. I take care of making sure that my code is good, then harden every interface so that I can demonstrate that I'm not the cause of new bugs. People will be careless, so make sure you have blast doors between your work and theirs.
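A hedged TypeScript sketch of that "blast doors" idea (the `createInvoice` names are invented for illustration): everything crossing into the module is validated at the boundary, so careless callers fail loudly at the interface instead of corrupting state deep inside your code.

```typescript
interface CreateInvoiceRequest {
  customerId: string;
  amountCents: number;
}

// The blast door: reject anything malformed before it reaches the real logic.
function assertValidRequest(input: unknown): CreateInvoiceRequest {
  if (typeof input !== "object" || input === null) {
    throw new TypeError("createInvoice: request must be an object");
  }
  const { customerId, amountCents } = input as Partial<CreateInvoiceRequest>;
  if (typeof customerId !== "string" || customerId.length === 0) {
    throw new TypeError("createInvoice: customerId must be a non-empty string");
  }
  if (typeof amountCents !== "number" || !Number.isInteger(amountCents) || amountCents <= 0) {
    throw new TypeError("createInvoice: amountCents must be a positive integer");
  }
  return { customerId, amountCents };
}

export function createInvoice(input: unknown): CreateInvoiceRequest {
  const request = assertValidRequest(input); // nothing unchecked gets past here
  // ...actual invoice logic works only with validated data...
  return request;
}
```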
And I do take time to learn about the abstractions of the new shiny tools, even when it's overengineered. Going blind and making mistakes is not my cup of tea.
Premature abstraction is the root of all evil.
"Any problem in computer science can be solved with another layer of indirection, except of course for the problem of too many layers of indirection." Bjarne Stroustrup
That's why you see hundred-level call stacks, polymorphism with a single implementation, and errors that are still swallowed or root causes hidden behind a generic "exception caught".
> In the world of computing, we tend to abstract away complexity. Doing so seems liberating. It enables us to focus on the bigger picture. Unfortunately, in doing so, the fidelity of our understanding often decreases. We sometimes end up blinding ourselves.
Some "Java in the '90s" understanding of abstraction. Proper abstractions break complexity into composable elements; hence, the fidelity of our understanding increases.
> I have spent months adjusting my resume
just share the damn thing, someone may have something for you ;)
...I've rarely seen the people complaining about work actually share their resume, or even a condensed description of their skills, knowledge, and experience
ok, f googled it and found it: ~"entry-level/junior sysadmin and cyber"
so, a path could be picked from what you know:
1. devops/sre - really hard to get above entry-level without real experience, and you _will_ be competing head-on with AI ...ouch
2. cyber - same with whitehat as with devops/sre ...basically go full red-team / blackhat / offensive for a while, then get certs and a portfolio, then a job in "real cyber" ...BUT ppl that do this tend to have a "very specially broken brain", so if you haven't done this already you're probably not one of them [probably for the best]
...but they're probably all bad, so better DO SOMETHING ELSE ENTIRELY:
...gtfo of software, you're likely not gonna become an "agent herder" with your skillset, mentality and experience - in the US, probably go all in on agriculture [recent US protectionism and isolationism will give you decent levels and a shield from globalized markets], learn some minimal hardware tinkering to automate drones and later manage android workers, software for planning farming automation etc... hire hands for physical labour and BUILD AND MANAGE A FARM or something like that (maybe farm + restaurant or smth else from tourism / hospitality)
All three of the sectors you've mentioned are not in a good place right now. It's probably much less stressful to be an unemployed programmer than to try to make a hobby-scale farm profitable with soaring fuel and fertilizer prices, along with a labor force that is fleeing.
E: Farm automation probably has some juice though, regardless of how close the androids I keep seeing in demos actually are.
With some knowledge of devops and cyber, maybe moving to QA or testing could work too. But the idea of moving towards agriculture is a good one as well!
You should link to your resume
Can't offer you any work unfortunately, but have an updoot. Hope this gets to the top and helps you provide for your son.
It's not just tech; other industries are experiencing the same hiring woes. I think the economy is deeply broken. It shouldn't work like this, and it doesn't seem like there is any hope of fixing it - governments just continue to run up debt as if they can keep kicking the can down the road indefinitely. Eventually the can becomes a brick and you break your foot.
There will be a reset at some point, and software developers will be needed, especially when every piece of software stops working. idk if that will happen before or after an economic collapse tho.
I have no idea where things will go in the future, but I doubt it will be much fun.
I'm confident the world will need more software developers than ever before, no matter where "AI" goes from here.
I don't think most of those jobs will be in the West, though.
Oof. There are two pieces to this story. One is great and one is heartbreaking.
The fact that modern tech has disintermediated people with problems to solve from the need for a "priest class" to commune with the machine to solve the problem is a great thing. It's the goal. The more we do it the better we are making the world for humans.
... the fact that people need to work to eat or provide anything above a subsistence quality of life is not only tragic, it's increasingly abhorrent in a world where automation and simplification via machines has freed up this much raw resource and free time.
If we're pitting LLMs against people's ability to provide for their families, we have lost the thread on why we're doing any of this.
> this much raw resource and free time.
Those resources are being redirected to create entertainment areas for the rich like golf courses, 7 star luxury hotels and villas. This is the modern predicament.
It's not the automation but the way it's directed. We've come a long way since the domestication of agriculture and energy, but profit as the main director is worse than suboptimal; it is tragic. Having learned about so many accidents in complex systems, it's maddening to see things at this point in the most complex system of all: society.
Profit is what drives the survival of the firm, to be fair.
However, there are tasteful ways of pursuing it, and Google and Meta in particular are certainly not choosing them.
I may be missing something, but this doesn't read to me like an abstraction or AI-related problem.
It sounds more like a packaging issue. I know he's attempted to edit his resume, but there's missing information here that OP may not even be aware of.
For instance, I was recently one of the final two candidates interviewing for a great opportunity that I sadly lost. When I received feedback, it turned out the hiring committee had a completely different sense of one aspect of my work than I had attempted to convey. I'm glad I got the feedback, but it was frustrating to lose after so many interviews.
Then just recently, I interviewed a candidate at my current company who reminded me of OP. Laid off worker, very nice guy, but he had no idea how to portray himself as a dev at the level he was applying for.
I wanted to call him up and coach him, but it didn't seem appropriate, especially since he didn't ask for feedback.
If you are in this position, find a free coaching program that can help you revamp and resell what you have to offer.
It's not fair to have to do that just to get a chance to be paid a fair wage. But companies get thousands of resumes a month and do dozens of interviews.
We try to give candidates a chance to show us who they are, but if what they are showing us doesn't line up with the role, or their strengths are buried, there's only so much we can infer. It sucks, because the resume and interview are not the job. But they are the gate you have to get through before anyone sees the work.