The tractor analogy keeps coming up in these threads, and I think it's actually more pessimistic than people realize.
Tractors didn't just change farming. They emptied entire regions.
What saved the people (not the communities) was that other industries absorbed them. Factory work, services, construction. The question for software isn't whether AI creates efficiency. It's whether there's somewhere else for displaced engineers to go.
I've been writing code professionally for 16 years. The honest answer is I don't know. The optimistic scenario is that AI makes software so cheap that we build things we never would have attempted. The pessimistic one is that most of what needed building gets built, and the remaining work fits in fewer hands.
Both seem plausible. I'd bet on somewhere in between, but I'm not confident enough to tell anyone starting out that they should ignore the risk entirely.
> The pessimistic one is that most of what needed building gets built, and the remaining work fits in fewer hands.
I don't think that's true, mainly because if it were true it would have happened a long time ago. We will never settle on one version of a thing (be it messaging, recipes, notes, image galleries, etc.). New variants emerge over time; the only thing AI does is accelerate this.
There are countless games and applications in the app stores these days. Almost all of them are money-losing ventures. The vast majority of these variants are going to go extinct and earn negative revenue for their creators. The big problem comes when creators stop running into any variants that can earn them a living at all.
>We will never settle on one version of a thing
This depends on how well a monopoly can fit into the equation.
>We will never settle on one version of a thing (recipes)
Here is an example of missing the whole elephant because you're looking too closely. While the number of recipes is booming, the number of food distribution companies has collapsed into just a few mega-corporations. And those corps are massively controlling the prices all of us must pay.
Let's hope you're right, but you might be underestimating the "$200 per month (robo)engineer can only do it like this, therefore this is the way to do it" factor.
What I don't understand is why software engineers are so afraid. I have heard from so many other white-collar people that AI has already changed their jobs entirely (technical salesmen, translators, designers, government researchers, the list goes on), yet it is software engineers I hear the most noise from.
Software engineering is one of the most intellectually demanding categories of white-collar work. I'm not saying it is invincible, but I do not see why SWEs should worry more.
Sampling bias.
You're on a site dominated by software engineers, in the field of software engineering, and likely have a lot of software engineer friends.
Translators got fucked; there's very little market for them now compared to decades past. Find their forums and I bet you'd see similar worry.
I also hear the most from software engineers, but then again, I don't really follow translation or government research discussion forums.
A lot of pride is wrapped up in the craft of writing software. If that goes away (I don't think it will) it would leave a lot of people wondering how they spent all their time.
(or something like that. Obviously I'm too well adjusted to have these existential worries)
Software engineers see the most dramatic change, and they haven't had to worry about job security for the last 25 years.
Tractors replaced farm hands, not farmers. Software developers are farm hands.
Why not both.
An absolutely massive number of farmers were replaced too. Farm management was rather grueling in the early 1900s. Farmers who embraced mechanization were able to buy up surrounding farms that didn't, and grew in size. As the equipment got better, the amount of work per acre farmed dropped, so farms expanded with more acreage. Farmers and hands both dropped in number.
AI will replace humans in performing every cognitive task, unless you believe that there is something about biology that makes it categorically better for certain kinds of computation. There's no reason to believe that's the case.
LLMs, and specifically auto-regressive chat bots with transformers for prediction, will probably not replace engineers any time soon. They probably won't ever replace humans for the most cognitively demanding engineering tasks like design, planning, or creative problem solving. We will need a different architecture for that; transformers don't look like they get smarter in that way, even with scale.
> AI will replace humans in performing every cognitive task
This is probably true, but on a time horizon that is almost certainly much, much longer than we think. Centuries. Perhaps millennia, even.
It's fun to go back to the newspapers of the 1920s, 30s, and 40s and see how absolutely CERTAIN they were this was going to happen to them. I'm sure there are examples from the 19th and 18th centuries as well.
Advancement happens in fits, and then tends to hibernate until another big breakthrough.
And even when it does happen, humans love to do things, just for the sake of them. So I highly doubt art, music, literature, or any other thing humans love to intrinsically do are going away, even if they can be done by AI. If anything, they'll be done MORE as AI enables wider participation by lowering the cost and skill barriers.
> AI will replace humans in performing every cognitive task
Maybe? I guess the better question is "when?"
> unless you believe that there is something about biology that makes it categorically better for certain kinds of computation. There's no reason to believe that's the case.
How about the fact that we don't actually know enough about the human mind to arrive at this conclusion? (yet)
> Maybe? I guess the better question is "when?"
And also at what cost and at what scale?
Will we be able to construct a supercomputer/datacenter that can match or exceed human intelligence? Possibly, even probably.
But that would only be one instance of such an AGI then and it would be very expensive. IMHO it will take a long time to produce something like that as a commodity.
So far it looks like AI will go the same road as other technological analogues of biological systems: not a self-contained unit (powered by currently technologically unreachable nano-mechanisms), but infrastructure that produces and maintains specialized units.
A tractor can't reproduce or repair itself, but it is better than a horse for farming. A self-driving car can't learn by itself, but a datacenter can use its data to train a new version of the car software. A humanoid robot by itself might not be flexible enough to count as AGI, but it can defer some problems to a datacenter.
IMO: It's going to. But organizations which frame this replacement as "we're going to fire you and replace you with AI" are going to crash and burn. Instead, we're just seeing per-engineer and per-team productivity increase; that productivity begins to outpace other bottlenecks in your company's process, and you hit another wall. When faced with that second wall, some companies will naturally interpret this as "OK, we don't need to hire more engineers." Other companies will try to apply AI (or hire humans) to fix that bottleneck, then go back to hiring engineers.
The dream of a Jira integration wired directly to an autonomous system that quickly closes stories with no human intervention will remain a dream for a long time for anything except the lowest-level 10% of stories. It's not interactive enough; the feedback loop needs to be tighter, the vibes need to be conversational, and businesses will get the most value out of the pilot in the chair being someone who in years past called themselves a software engineer. I think we still will; the tools just change.
I have several thoughts on this.
1. The common (and correct) claim that software engineering is not just about writing code. (Counterargument: with time, AI will be able to take on planning and debugging. Counter-counterargument: if you have ever tried to just do what customers ask, you know you will get conflicting requirements; humans will need to help AI make decisions, not just implement them.)
2. Related to the above: as long as a good software engineer + AI brings more ROI than a mediocre engineer + AI, which brings more ROI than a random person + AI, which brings more ROI than AI alone, it will be economically wise to hire more good engineers to beat your competitors who opted to save money and fired their engineering team. Salaries might go down, but not for top talent; e.g., imagine an "AI whisperer" who is not a 10x engineer but a 1000x one because they know how to get the most out of Claude Code / Cursor. They will be paid accordingly.
3. Jevons paradox: perhaps making software ubiquitous and cheaper to produce will actually put software engineers in greater demand.
"Will AI replace software engineers?" is not the right question and stems from a misunderstanding of how tech affects humans and how they work.
Tech is a tool. It will take away some jobs, and then create new ones. Think of a combine tractor -- it took away crop picking jobs, but created a new job of combine tractor driver. It bumps productivity.
The correct frame is "how can software engineers (or anyone, for that matter) use AI to increase their productivity?" With that frame, AI does not replace engineers; rather, engineers are in the best position to understand how it can deliver products faster and to implement that understanding.
Combine tractors deleted jobs. You can't say there are as many combine tractor drivers as there were crop pickers. Anyway they don't need drivers now as they're fully robotic.
The only reason society didn't collapse: there were enough other jobs to absorb those displaced workers. Will there always be?
Tech was a tool. Historically. This doesn't mean it'll stay that way.
It's easy to cherry-pick examples and counter-examples. See the Luddites for a counter-example: artisans making high-quality textiles are no longer broadly in demand. There are lots of pro examples too; I just don't find analogies helpful. It may be that, like clothes, there's only so much need for software. We don't really need 1000 browsers or operating systems, after all; 3 or 4 good ones are enough (and 90% of people use 1 or 2), despite there being very good free alternatives (unit cost is 0, demand still low).
> It may be that like clothes, there's only so much need for software.
Clothing demand has increased greatly in the past decade due to fast fashion. Much of this clothing is designed to cost a few bucks, last a few wears, then get thrown out. It's an ecological disaster.
Maybe we'll see something similar happen with software — as production costs fall, trends will shift toward few-use throwaway software. I highly suspect this is already happening.
> trends will shift toward few-use throwaway software
Software has worked this way since the rise of the internet and SaaS. Consumers rarely need to install anything locally other than a browser.
AI may replace the need for coding but not the need for understanding how it works. It is as simple as that.
We still need / want to study physics to understand the universe despite the fact that it is a process that we have no control over.
AI is the same. Even when code is written by AGI you still need to be able to understand what it is and how it works.
The alternative is complete abdication, and it looks a lot more like religion, or even a cult.
Trust but verify.
Interesting yc vid on the topic: https://www.youtube.com/watch?v=IqwSb2hO1jE
tl;dr: it argues that when there's a dramatic improvement in the efficiency of production of a good or service, its per-unit cost goes down so much that demand skyrockets, leading to greater demand for employees in that sector. The examples it gives are radiologists (after neural nets were predicted to be able to perform their jobs essentially for free) and dock workers.
If this happens in the case of SWEs, it would mean a 'unit' of software will be able to be produced much more cheaply, but the demand for and price (i.e. salaries) of SWEs might stay the same or increase.
That argument is so broken it's not even funny. A lot of assumptions would have to hold for that particular outcome to be true. There are many other possible outcomes, and most of them simply result in corporations employing fewer workers and making higher profits. I'm sure the next argument you'd get is "trickle-down economics", but that doesn't work either.
The argument here is basically: let's say a particular widget requires 3 human steps to make. If we replace the most expensive step with a cheaper automated alternative, the overall cost of the widget will fall, and so demand for the widget will rise. So even though the human effort to create one widget is reduced, the increase in demand can mean that (reduced work per widget) * (new widget demand) is often >= (old work per widget) * (old widget demand).
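A toy back-of-the-envelope version of that inequality, with entirely made-up numbers (the demand figures are illustrative assumptions, not data):

    # Illustrative sketch only: invented numbers, no real elasticity data.
    old_work_per_widget = 3.0   # human hours per widget, all 3 steps manual
    new_work_per_widget = 2.0   # most expensive step automated away
    old_demand = 1000           # widgets sold at the old, higher price
    new_demand = 1800           # hypothetical demand at the lower price

    old_labor = old_work_per_widget * old_demand   # 3000 hours
    new_labor = new_work_per_widget * new_demand   # 3600 hours
    print(new_labor >= old_labor)  # True: less work per widget, more work overall

    # If demand only rose to, say, 1200 widgets, new labor would be 2400 hours
    # and total employment would fall. And if automation removes all three steps
    # (work per widget -> 0), total human labor is zero no matter how much
    # demand grows -- which is the objection below.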
The problem with this argument is that AI, or at least the vision of AI that companies and governments are spending trillions of dollars on, purports to replace the human itself. Put another way, it intends to automate all 3 steps (as well as any ancillary services: marketing the widget, legal work protecting the company, etc.). So any increase in demand does not lead to any additional labor, since the labor per unit is 0.
This video’s argument simply collapses the debate back to whether AI can largely replace human intelligence or not.
Yes, but then many young people today won't pursue a career in software development because of the doomsday stories... which, if your case turns out to be true, will lead to an even higher shortage of SWEs.
It reduces the number of engineers needed, I'd say by half, and web designers, graphic designers, and the front-end developers who code the designs are really no longer needed.
I was just laid off from my job of 8 years, in which I was the UX researcher, designer, front-end dev, and customer UX support. Within a week I sold my house and am downsizing significantly, and in two years or less I will be working as an RN (nurse). I will try to get back into my field, but the current administration and the many tech layoffs have flooded the market with people like me looking for jobs. All the while, AI is eating my career and field. It just doesn't seem wise to bet that my career of 20 years is going to be around in the next ten years.
Also, will the interfaces we have today still exist in five to ten years? My guess is that AI becomes the interface that does everything for us through voice (OpenAI's upcoming device) or text. We could still have handheld AI phones or devices, but AI would do everything, including presenting the articles we read and the games we play, all from the device's lock screen (with websites rarely visited).
Sorry to hear about your situation. As a full-stack dev, I miss the days of having a UX expert on the team; now we are just expected to create something good enough. Also, your question about whether the interfaces we have today will still exist makes me more nervous than anything else.
Voice is going to become more important, but I doubt it's going to be that much of a threat to websites. I can already ask Gemini on my last-generation smartwatch for the latest news, but I much prefer reading it myself. IMO, AI will simply be integrated into smartwatches and phones. Maybe you'll also have more devices like Amazon Alexa at home, but I see no demand for an entirely new kind of device/user interface.
That said, I don't have much faith in the future of my programming career either. Unless robotics gets exponentially better, registered nurses are going to be way safer from automation (at least the ones doing physical treatment).
Sure, but if Gemini starts just presenting the news and all the other information, either as text or graphically, from Android's lock screen... then going to websites becomes much less common.
At the macro level shifting workers to RN makes complete sense. It is the coup de grâce of the boomer generations: tax the young’s earnings, cancel SS, force them to work as nurses.
Yeah, and I'm not young... I started my first web design job at 33, in 2009. I thought of going for pharmacy, but that's too many years. Accelerated RN programs are 18 to 24 months.
End of the engineer. Rise of the architect.
What’s preventing AI from taking on the role of an architect?
Good taste. I just vibe coded some garbage in an hour. But it is so bad I can't see how to get to good code from here and so I'll spend the rest of the week doing it by hand.
Now, to be fair, this is my first attempt at vibe coding, so I might not know how to prompt the AI.
Depends on the tool, but the first thing to do is to use a plan mode, where the AI will ask you follow-up questions to refine the project. This gives a much, much better result than just having the AI start to work from a few lines of prompt.
That basically turns your bad prompt into a good prompt, then executes on it.
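A rough sketch of that two-step flow, with a hypothetical ask_llm() callback standing in for whatever tool or API you actually use (this is not any specific product's interface):

    # Hypothetical helper: ask_llm(prompt) stands in for your actual tool/API.
    def plan_then_execute(rough_idea, ask_llm):
        # Step 1: have the model interrogate the vague prompt before coding.
        questions = ask_llm(
            "Before writing any code, list the clarifying questions you need "
            "answered to plan this project:\n" + rough_idea
        )
        answers = input(questions)  # you answer the follow-ups
        # Step 2: turn the rough idea plus answers into a concrete plan, then build.
        plan = ask_llm("Write an implementation plan for:\n" + rough_idea +
                       "\nClarifications:\n" + answers)
        return ask_llm("Implement this plan:\n" + plan)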
Nothing. Most software "architecture" is simply repeating the same patterns, just like writing code, just at a slightly higher level of abstraction. There's a slight moat if you are good at the business domain, and can serve as a bridge, but honestly, AI is getting pretty good at that as well.
I wonder. I follow @DamiLeeArch on YouTube. She talks about architecture in the built environment. Ostensibly, built for Humans. At least for now.
Or a CEO?