Almost all of Patrick's points are great if your software development goal is to make a buck. They don't seem to matter if you're writing open source, and I'd argue that desktop apps are still relevant and wonderful in the open source world. I just started a new hobby project, and am doing it as a cross-platform, non-Electron, desktop app because that's what I like to develop.
The onboarding funnel: Only a concern if you're trying to grow your user base and make sales.
Conversion: Only a concern if you're charging money.
Adwords: Only a concern if, in his words, you're trying to "trounce my competitors".
Support: If you're selling your software, you kind of have to support it. Minor concern for free and open source.
Piracy: Commercial software concern only.
Analytics and Per-user behavior: Again, only commercial software seems to feel the need to spy on users and use them as A/B testing guinea pigs.
The only point of his I can agree with that makes web development better is the shorter development cycle. But I would argue that this is only a "developer convenience" and doesn't really matter to users (in fact, shorter development cycles can be worse for users, as their software shifts rapidly like quicksand out from under them). To me, in my open source projects, my "development cycle" ends when I push to git, and that can happen as often as I want.
Going further, if you're a hobbyist, you're probably instinctively prioritizing the aspects of the hobby that you enjoy. My first app was a shareware offering in the 1980s, written in Turbo Pascal, that was easy to package and only had to run on one platform. Because expectations were low, my app looked just as good as commercial apps.
Today, even the minimal steps of creating a desktop app have lost their appeal, but I like showing how I solved a problem, so my "apps" are Jupyter notebooks.
My coworker showed a Jupyter notebook with ipywidgets and it looked just like an app. A good CLI built with Typer (from the author of FastAPI) looks a lot like an app too.
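For anyone curious what that looks like, here's a minimal sketch of the idea using only the standard library's argparse (Typer, a third-party package, gives you the same shape with less boilerplate). The `notes` tool and its subcommands are invented purely for illustration:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    # A hypothetical "notes" tool: subcommands and built-in help text
    # give a plain CLI an app-like feel, much as ipywidgets does for
    # notebooks.
    parser = argparse.ArgumentParser(
        prog="notes", description="Tiny note-taking CLI"
    )
    sub = parser.add_subparsers(dest="command", required=True)

    add = sub.add_parser("add", help="store a note")
    add.add_argument("text", help="the note to store")

    sub.add_parser("list", help="show stored notes")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)
```

Running `notes --help` then prints a usage screen listing both subcommands, which is most of what a casual user needs from an "app".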
I see a lot of this sentiment amongst developer friends, but I never could relate. It's not that I'm against it or anything; it just doesn't move me personally.
Most things I create in my free time are for my family's and my own consumption, and they typically benefit immensely from the write-once, run-everywhere nature of the web.
You can launch a small toy app on your intranet and run it from everywhere instantly. And typically these things are also much easier to interconnect.
To be fair, probably most of us here on HN write software to put food on the table. Don’t pooh-pooh our careers.
> Analytics and Per-user behavior: Again, only commercial software seems to feel the need to spy on users and use them as A/B testing guinea pigs.
KDE has analytics, they're just disabled by default (and I always turn them on in the hopes of convincing KDE to switch the defaults to the ones I like).
It's just waaaaaay easier to distribute a web app.
For some things a desktop app is required (more system access) or offers some competitive UX advantage (although this reason is shrinking all the time). Short of that, users are going to choose web 95% of the time.
This points to our failure as an industry to design a universal app engine that isn't a browser.
Counterpoint: isn't the web browser already fulfilling the "universal app engine" need? It can already run on most end-user devices, where people do most other things. IoT/edge devices don't count here, but these days most of their data is just sent back to a server that's accessible via some web interface.
Ignoring the fragmentation of course; although that seems to be getting less and less each year (so long as you ignore Safari).
No. We did, it is the browser.
I wonder what the numbers say about desktop applications now, and how much the arrival of Electron changed things up here.
Nowadays, it seems to be that mobile apps have the "best metrics" for b2c software. I'd be interested to read a contemporary version of this article.
“Metrics”
This reminds me of a past job working for an e-commerce company. This wasn’t a store like Amazon that “everyone” uses weekly, it was a specific pricey fashion brand. They had put out a shitty iOS app, which was just a very bare-bones wrapper around the website. But they raved about how much better the conversion rates were there. Nobody would listen to me about how the customers who bother downloading a specific app for shopping at one particular retailer are obviously just superfans, so of course that self-selected group converts well.
So many people who should be smart based on their job titles and salaries, got the causation completely backwards!
Some of us are still making a living from desktop apps, 17 years later.
In 2026, the number of mobile applications in the App Store and Google Play increased by 60% year over year, largely because entry into the market has become much easier thanks to AI.
What 'best metrics'?
I think in this case it can be approximated as 'largest market'
I'd wager there are more people paying for software for their smart phone than any other platform they use.
Anecdotally, conversion - from free to trial, trial to paid, one-off purchases, etc.
Electron is the worst of both worlds. I have never paid for an Electron app, and never will. Horrid UX.
> I have never paid for an Electron app
Your employer most likely has.
Sure, and so has my government. But I can only control what I personally pay for.
Grass always looks greener on the other side, mainly because it's been fertilised.
No, "grass always looks greener on the other side" is a perspective thing. If you stand on your own grass, you look down onto it and see the dirt; but if you look over to the other side, you see the grass from the side, which makes it look denser and hides the dirt. But it's the same boring grass everywhere. :)
I preferred the GP's poop-joke version, but to each their own.
This is from 2009, and the title should say so.
Nothing in this article is wrong, but worth noting that pre-AI, the companies that most significantly transformed the way we use our computers (Slack, Spotify, VS Code, etc.) did ship desktop apps.
“Desktop Apps”? I’d say pre-Electron, the ones that existed that far back shipped desktop apps, but for the past 10-15 years it’s all been Electron slop, which hardly qualify as “desktop apps” in my book.
If anything, it’s my very faint hope that AI would give companies - especially non-software companies - the bandwidth to release two real native apps instead of just two builds of a shitty Electron app. Fat chance though, I think, not least because companies love to use their “bRaNdInG” on everything - so the native look and feel a real app gives you “for free” is a downside for the clowns who do the visual design at most companies.
For what it’s worth, I tried making a GTK4 app. I got started, created a window, created a header bar, then went to add a url/path entry widget and everything fell apart.
Entry suggestions/completions are formally deprecated with no replacement since 2022. When I did get them working on the deprecated API there was an empty completion option that would segfault if clicked. The default behaviour didn’t hide completions on window unfocus, so my completions would hover over any other open window. There was seemingly no way to disambiguate tab vs enter events… it just sucked.
So after adding one widget I abandoned the project. It felt like the early releases of SwiftUI where you could add a list view but then would run into weird issues as soon as you tried adding stuff to it.
Similarly, trying to build an app for macOS practically depends on Swift by Sundell, Hacking with Swift, or others to make up for Apple’s lack of documentation in many areas. For years, things like the NSColor vs. Color API boundary added friction, and the native macOS SwiftUI components never felt normal while I was making apps.
As heavy as web libraries and Electron are, at least they mostly work out of the box.
All of those examples are web apps; two of them started on the web itself, and none of them transformed anything about how we use our computers (Slack replaced a number of competitors, Spotify is iTunes for the web, and VS Code is a smaller JetBrains).
Not off to a great start... The "look how many steps it takes to convert shareware users" is insanely overblown.
1-4. Google, find, read... this is the same for web apps.
2. Click download and wait a few seconds. Not enough time to give up because native apps are small. Heavy JS web apps might load for longer than that.
3. Click on the executable that the browser pops up in front of you. No closing the browser or looking for your downloads folder. It's right there!
3.5. You probably don't need an installer and it definitely doesn't need a multi-step wizard. Maybe a big "install" button with a smaller "advanced options".
3.6. Your installer (if you even have it) autostarts the program after finishing.
4. The user uses it and is happy.
5. Some time later, the program prompts the user to pay, potentially taking them directly onto the payment form either in-app or by opening it in a browser.
6. They enter their details and pay.
That's one step more than a web app, but also a much bigger chance the user will come back to pay (you can literally send them a popup, you're a native app!).
If my failing memory serves, those were valid concerns in 2009, when this was written.
> However, the existence of pirates is a stitch in my craw, particularly when any schoolmarm typing the name of my software into Google is prompted to try stealing it instead:
I wonder whether Google, in its Don't Be Evil era, ever considered what they should do about software piracy, and what they decided.
I'd guess they would've decided to either discourage piracy, or at least not encourage it.
In the screenshot, the Google search query doesn't say anything about wanting to pirate, yet Google is suggesting piracy, a la entrapment.
(Though other history about that user may suggest a software piracy tendency, but still, Google knows what piracy seeking looks like, and they special-case all sorts of other topics.)
Is the ethics practice to wait to be sued or told by a regulator to stop doing something?
Or maybe they anticipate costs and competition for how they operate, and lobby for the regulation they want, so all they have to do is be compliant with it, and be let off the hook for lawsuits?
Did Google ever have a real Don't be Evil era?
The original expression came out of an internal company discussion that someone summarized (paraphrased) as "when there's a tough choice to make, one is usually less evil. Make that choice."
In the early days of Google in the public consciousness, this turned into "you can make money without being evil." (From the 2004 S-1.)
Over time, it got shortened to "don't be evil." But this phrase became an obligatory catchphrase for anyone's gripes against Google The Megacorp. Hey, Google, how come there's no dark mode on this page? Whatever happened to "don't be evil"? It didn't serve its purpose anymore, so it was dropped.
Answering your question really depends on your priors. I could see someone honestly believing Google was never in that era, or that it has always been from the start. I strongly believe that the original (and today admittedly stale) sentiment has never changed.
Making a loud affair out of its retirement, rather than quietly letting it collect dust and be forgotten over time, was most definitely not a good idea.
The public already demonstrated that they adopted, misused, and weaponized the maxim. Its retirement just sharpened the edge of that weapon. Now, instead of "What happened to 'don't be evil'?", it's "Of course Google is being evil," and everything is viewed through that lens.
A similar dynamic is playing out with Anthropic, whose founders left OpenAI in part over a philosophical split that could be described, if you'll grant a little literary license appropriate to this thread, as Anthropic choosing the "don't be evil" path. No surprise that we now see HN commentary skewering Anthropic for not living up to it.
If you need to sloganize a reminder to yourself to not be evil, that's not a promising sign
Early in Google's history, I took that sentiment as saying that they were one of us (Internet people), and weren't going to act like Microsoft (at the time, regarded by Internet people as an underhanded and ignorant company). Even though they had a very nice IR function and general cluefulness, and seemed destined to be big and powerful.
And if it were the altruistic Internet people they hired, the slogan/mantra could be seen as a reminder to check your ego/ambition/enthusiasm, as well as a shorthand for communicating when you were doing that, and that would be respected by everyone because it had been blessed from the top as a Prime Directive.
Today, if a tech company says they aspire not to be evil: (1) they almost certainly don't mean it, in the current culture and investment environment, or they wouldn't have gotten money from VCs (who invest in people motivated like themselves); (2) most of their hires won't believe it, except perhaps new grads who probably haven't thought much about it; and (3) nobody will follow through on it (e.g., witness how almost all OpenAI employees literally signed to enable the big-money finance-bro coup of supposedly a public interest non-profit).
In other words, the company made a bet on people's naivety, and it worked.
They had to at least nominally have it, early on, to be able to hire the best Internet-savvy people.
Tech industry culture today is pretty much finance bro culture, plus a couple decades of domain-specific conditioning for abuse.
But at the time Google started, even the newly-arrived gold rush people didn't think like that.
And the more experienced people often had been brought up in altruistic Internet culture: they wanted to bring the goodness to everyone, and were aware of some abuse threats by extrapolating from non-Internet society.
'99 to 2004. You had to have been there, maaaan...
I've been there when Google was altavista.digital.com ;)
[2009]
Over a decade of circular "web apps are better for the subset of problems webapps are good at" tautologies.
I was curious why AI wasn't mentioned. Then I noticed the date: 2009.
And I also think many of the mobile and web apps will end up as prompting in the next few years.
I would like to go back to 2009 =) The world was definitely simpler, and Bitcoin was cheaper =)
Please pick up a few bitcoins for me too when you go there
Realizing I could frickin mine enough bitcoins overnight back then to probably be set for life (maybe for multiple generations) now, is one of my biggest life regrets. I assume it’s shared with all other people who were into tech back then but dismissed bitcoin as stupid, as I did.
You simply can't get hung up on what could have been. Same applies to trying to time the stock market... should have bought, should have sold. Best thing is to know there's nothing that can be done about the past and move along and deal with what you can do now instead.
You're right. What gets me though is that unlike the stock market, bitcoin was an incredibly rare occurrence where anyone could have gotten extraordinarily rich without even incurring any risk! (besides a couple evenings spent learning how to use it.) Whereas to have $10MM today in GOOG stock, I would have had to invest over $300k in 2010.
I put my compute in those days toward some kind of protein-folding simulation; it definitely should have been Bitcoin.
So true: no real SaaS, no heavy cloud infrastructure.
I’m actually hopping on the desktop applications train. Though not for money. I just think the browser is becoming a surveillance plague of computing and we need MORE high quality desktop software not built on the invasive web stack to counter it.
> Web Applications Convert Better
ok, now do this analysis for mobile apps...
which circle are we in?
Condemned to useless labor; I believe that's the 4th circle.
> Why I'm Done Making Desktop Applications
To save you a click: It's harder to monetize desktop apps than webapps.
Lol. LMAO, even.
Didn't HN have a "no clickbait titles" rule?
it's amazing how freeing working an office job is. my personal projects don't have concerns such as monetization.
On the other hand I spent 25 years selling desktop software and never once had an annual review. I never had to submit an application for time off. I never had to ask permission for a dentist appointment. If the weather was good I could take the day off and go for a bike ride. I didn’t attend any scrum meetings nor did I have to argue about what framework to use with a PM who couldn’t code FizzBuzz.
Great, good riddance. Hopefully open source and/or AI push this person out of developing entirely.
People who focus this much on "conversion" et al are dinosaurs who deserve extinction.
First up, this article is 17 years old. There's no reason to assume the author has exactly the same opinions today.
More importantly, the author is talking about the realities of trying to earn a decent living shipping independent software. That requires paying customers.
It's perfectly reasonable to want to be paid for your work, and it certainly doesn't warrant the vitriol in your comment.