The 486 and https://www.delorie.com/djgpp/history.html changed everything.
Suddenly, it was possible to imagine running advanced software on a PC, and not have to spend 25,000 USD on a workstation.
Amazing to see a webpage "Updated Dec 1998" still up, running and displaying correctly.
Hard to imagine now, but this was a huge turning point. A genuinely powerful CPU in a "Pee-Cee" available for less than RISC workstation money. I had to wait a while; mine was an AMD DX2-66, since I didn't have the budget for Intel... add Slackware... and countless hours messing with XF86Config, and I had a poor man's Sun workstation.
The 486 killer app was DOOM. It was butter-smooth at 20 fps if you also had a VLB graphics card.
The 486 DX2 66 MHz was the target platform for gaming for almost two years (1992-1994). That was a huge achievement back in the day, staying at the top that long.
The DX/2 66 is a true legend of a chip. It was so good. The final nail in the coffin for the Amiga and for 68k. I love the Amiga, but it just didn’t Doom.
Before it, you could claim that a 68040 was kinda-sorta keeping up with the 486 and that the nicer design and better operating systems of other computers made up for the delta in raw performance, but the DX/2 66 running Doom was the final piece of proof that the worse-is-better approach of using raw CPU grunt to blast pixels at screen memory instead of relying on clever custom circuitry was winning.
Faced with overwhelming evidence, everyone sold their Amiga 1200s and jumped ship to that hated Wintel platform.
I remember arguments (and benchmarks) around all the variations of the 486, since bus speed and clock speed were decoupled (the /2 is clock doubling). For some applications, a 50 MHz 486 with a 50 MHz bus would beat a DX/2 66 MHz with a 33 MHz bus.
And sometimes the DX/4 100 MHz would be the slowest of them all on a 25 MHz bus.
Nearly correct. The DX/4 100MHz had a 33MHz bus. The DX/4 75MHz had the 25MHz bus. I remember well because I had both.
Now I remember being annoyed that it wasn't the DX/3 as it should have been!
Especially since when actual clock quadrupled chips eventually came out they had to call themselves ridiculous things like ”5x86” instead of DX/4. (The Am5x86 133 runs at 4x33 MHz)
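A toy model can show why the bus-versus-core trade-off above played out the way it did: runtime splits into a core-bound part and a bus-bound part, and a clock multiplier only speeds up the former. The function and the cycle counts below are purely illustrative assumptions, not real 486 timings.

```c
#include <assert.h>

/* Toy model: runtime = core-bound work / core MHz + bus-bound work / bus MHz.
 * "Work" is in arbitrary cycle units; only the ratios matter. */
static double runtime(double cpu_work, double mem_work,
                      double core_mhz, double bus_mhz) {
    return cpu_work / core_mhz + mem_work / bus_mhz;
}
```

With an even 100/100 mix of core and memory work, the model gives roughly 4.0 units for a DX-50 (50/50), 4.5 for a DX/2-66 (66/33), and 4.0 for a DX/4-100 (100/33): the memory-heavy workload mostly sees the bus, which matches the benchmark arguments remembered above.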
As I noted in my other comment (1), in 1985 Amiga OCS bitplane graphics (each bit of a pixel's colour index stored in a separate area of memory) was a huge boon for 2D capability, since it lowered bandwidth needs to 6/8ths, but it made 3D rendering a major pain in the ass.
The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.
Reading in hindsight, there were probably too many structural issues for Commodore to remain competitive anyhow, but an alt-history where they'd seen the need for 3D rendering is tantalizing.
1: https://news.ycombinator.com/item?id=47717334
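The bitplane pain described above is easy to see in code. This is a minimal sketch with assumed parameters (a 320x200 screen, 5 bitplanes, hypothetical helper names): plotting one pixel in a chunky framebuffer is a single byte write, while in a planar layout every bitplane needs a read-modify-write.

```c
#include <stdint.h>

#define WIDTH  320
#define HEIGHT 200
#define PLANES 5                          /* 5 bitplanes = 32 colours */
#define PLANE_BYTES (WIDTH / 8 * HEIGHT)

/* Chunky: one byte per pixel, so one write per plot. */
static void plot_chunky(uint8_t *fb, int x, int y, uint8_t color) {
    fb[y * WIDTH + x] = color;
}

/* Planar: each bit of the colour index lives in a separate plane,
 * so a single plot touches PLANES different memory locations. */
static void plot_planar(uint8_t planes[PLANES][PLANE_BYTES],
                        int x, int y, uint8_t color) {
    int byte = y * (WIDTH / 8) + (x >> 3);
    uint8_t mask = 0x80 >> (x & 7);       /* leftmost pixel is the MSB */
    for (int p = 0; p < PLANES; p++) {
        if (color & (1 << p))
            planes[p][byte] |= mask;
        else
            planes[p][byte] &= ~mask;
    }
}
```

For scrolling whole playfields the hardware could fetch planes in parallel, which is why this layout worked so well for 2D; it's per-pixel work like texture mapping where the loop above hurts.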
> The AGA chipset of the 1200/4000 stupidly only added 2 more bitplanes. The CD32 chip actually had byte-per-pixel (chunky) graphics modes, but the omission from the 1200 was fatal.
The intention was good, but the Akiko chip was functionally almost useless; it was soon surpassed by CPU chunky-to-planar algorithms. I don't think it was ever used in any serious way by any released game (though it might have helped with FMV).
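For context, a chunky-to-planar (c2p) pass is just a bit transpose: bit p of eight consecutive chunky pixels gets packed into one byte of plane p. This is a deliberately naive sketch of the idea; the CPU routines that beat Akiko used much faster merge-and-shift tricks, and the dimensions here are assumptions.

```c
#include <stdint.h>

#define WIDTH  320
#define HEIGHT 200
#define PLANES 8   /* AGA: 8 bitplanes, 256 colours */

/* Naive c2p: for each group of 8 pixels, pack bit p of every
 * pixel's colour index into one byte of bitplane p. */
static void chunky_to_planar(const uint8_t *chunky,
                             uint8_t planes[PLANES][WIDTH / 8 * HEIGHT]) {
    for (int i = 0; i < WIDTH * HEIGHT; i += 8) {
        for (int p = 0; p < PLANES; p++) {
            uint8_t byte = 0;
            for (int b = 0; b < 8; b++)
                byte |= ((chunky[i + b] >> p) & 1) << (7 - b);
            planes[p][i / 8] = byte;
        }
    }
}
```

The appeal was that a game could render into a chunky buffer with cheap per-pixel writes and pay the conversion cost once per frame, which is exactly the workload Akiko was meant to offload.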
> The CD32 chip actually had byte-per-pixel (chunky) graphics modes but the omission from the 1200 was fatal.
I agree. Unfortunately, even with chunky graphics and/or 3D foresight, 68k would still have been a dead end and Commodore would still have been mismanaged into death. It’s fun to dream though…
Was it necessarily a dead end? Consider the ways Intel and later AMD managed to upgrade and re-invent x86, which until x64 still retained so much of the original x86 instruction encoding and heritage (heck, even x64 retains some of its encoding characteristics).
Had the Amiga retained relevance for longer, and without the push for PowerPC, I don't see a reason why 68k wouldn't have been extended. Heck, the FPGA-based Apollo 68080 would've matched late-1990s Pentium IIs, and FPGAs aren't speed monsters to begin with.
There were no tech problems IMHO; it was all management problems. They could have chosen a handful of completely different tech paths and still have won, but instead they chose to do almost nothing except bleed the company dry.
Commodore so slowly and ineffectually improving on the OCS didn't help, but the original sin of the Amiga was committed in the beginning, with planar graphics (i.e., slow and hard to work with, even setting aside HAM) and TV-oriented resolutions/refresh rates (i.e., users needing to buy a "flicker fixer"). It's like they looked at one of the most important reasons for the PC and Mac's success—a gorgeous, rock-solid monochrome display—and said "Let's do exactly the opposite!"
IIRC, interlaced display and 6 bitplanes were a compromise to allow color graphics in 1985 with the memory bandwidth available at the time.
Whether it's a sin or a feature can of course be debated, but I remember playing games on an Amiga in the early 90s, and until Doom the graphics capabilities didn't look outdated.
By 1992 with AGA, however, I agree: flicker and planar graphics (with 8 bitplanes, any total memory-bandwidth gains were gone) were downsides/sins that should've been fixed to stay relevant.
Yep. 486DX/2 was when I started seriously looking at moving on from the Amiga. I wound up with a DX/4 100 sometime in 1994.
My classmate kept his Amiga 1200 a bit longer! ...eventually he got a PC with Pentium 60 MHz.
Yeah, there were holdouts of course but the DX/2 really seems like the breaking point.
(Also, a Pentium 60 is barely faster than a DX/2 66 at many tasks — it is a Bad Processor — but that’s another conversation ;)
Many tasks perhaps, but running Quake was not one of them.
Pentium is a bad processor? It's way faster than 486, especially on FP it's not even close.
The original Pentiums (socket 4, 60 or 66 MHz) had the infamous floating point division bug, had underwhelming perf for anything not FP bound (most things), ran hot, and were too expensive for what you got. A DX/4 100 was nearly always a more rational choice.
Second gen Pentiums, starting with the 75 MHz, were great.
The Pentium was great, but the 60 and 66MHz versions were not liked, they ran way too hot.
At that point in time I would not have called it Wintel yet. That started after Windows 95, IIRC.
Doom was released at the end of '93. In 1992 most of us were in the 286 -> 386 upgrade wave, and a 486-33 was easily $2.5k+ ($5.5k in today's terms). The 486 DX2 66 was a good choice even in 1994-1996.
Yes, the latest chips were very expensive back then, and out of reach for most people who would continue buying new computers with older chips. (As opposed to how most people today buy an iPhone or a Mac or whatever with the latest semiconductor technology.) I got my 25MHz 386 in 1991, over two years after the 486 was announced, and I had one of the fastest computers of anybody in school... for a short time.
They need to bring back the turbo button.
You’re in luck!
https://www.silverstonetek.com/en/product/info/computer-chas...
How could I possibly forget the lock!
https://en.wikipedia.org/wiki/VESA_Local_Bus for the younger crowd.
I didn't have access to a 486 until around 1999. I was making do with a hand-me-down 8088 and then a 386SX.
Back then, 10 years of technological advancement made a huge difference. Today, you can get by just fine with a 2016-era laptop.
It comes across as so incredibly insane to me that people from the late 80s (people working with computers! Reporting on them!) would look at their current technology stack and basically go: "I have no idea whatsoever what else we can do with these things; we've reached the end."
The lack of imagination is just disturbing.
On the other end, you have people who have no idea how insanely fast computers are today, and how little computing power is "really" needed for most things that computer users do, or how much you can do with one average machine ("Oh no, 1000 requests per second! Let's erect another Rube Goldberg machine to handle that!").
The 80s and 90s were filled with new things computers could do: spreadsheets, WYSIWYG word processors, games. Things that simply were impossible before (or not done).
In the 2000s through now we've mostly had improvements: 4K YouTube is much better than RealPlayer, but it's still just "online video". AI is definitely a "new" thing and it's somewhat awoken a similar spirit to the 80s/90s, but not with the same breadth: Dad bringing home a computer because he wants to do spreadsheets, and you finding it can run Doom or even play music.
The first 80286-based system (the IBM PC AT), the first 80386 system (the Compaq Deskpro 386), and the 80486 all had people writing about their suitability as servers, with the implied consensus being that normal people didn't need them.
The Pentium is the first one, I think, where this didn't happen, because by then it had turned out that people need a computer that can do what they are currently doing—but faster—much more often than they need servers.
Great throwback... they were awesome procs. With a few SIMMs (4-16 MB) it could do multimedia madness never seen before (playing MPEG-1 video from a CD-ROM game). The 486 DX4 100 was the last Intel I had before going to Pentium clones (the AMD K series and the shitty Cyrix 6x86).
We ran a 3-line BBS (Renegade and then Wildcat) on OS/2 on a 486-33 with 12 MB RAM. This was in 1994 or so. Great way to multitask several DOS applications!
I've got one sitting on the shelf above my desk, a 33 MHz DX; I don't even remember what machine it came out of.
I remember getting my first 486 33 MHz computer and being able to play Ultima VII: The Black Gate, and later Ultima VII Part Two. This was a turning point for me, as the game was way ahead of anything on the console side of things. DOS 6!
I think the turning point was that flat framebuffers and plenty of CPU power for the first time eclipsed specialized 2D hardware (Amiga, Mega Drive, SNES, etc.).
Flat framebuffers and "powerful" CPUs also enabled easier software rendering of 3D (Doom/Duke), compared to the Amiga, where writing textured rendering is a PITA due to the video memory layout: separate bitplanes spread the bits of each pixel across different memory locations (the total memory-bandwidth reduction gained in 1985 by using 5 or 6 bitplanes became a fatal bottleneck at this point).
It wasn't always full framerate though, and the 2D chipsets did help in "classic" action games, which were still all the rage.
The Pentium further widened the gap, but at the same time consoles gained hardware 3D acceleration (PSX/Saturn/Jaguar); yet the Pentium could still do graphics better in some respects (as shown with Quake).
Once 3D accelerators landed, PCs have more or less constantly been ahead, apart from price (and comfort/ease).
486 SX 33 MHz; could not afford the DX.
My experience too, as I dimly remember it.
The 486 SX was a fine chip, just no math coprocessor.
The 386 SX was crap: 16-bit-wide bus, IIRC.
Ahhh, but it gave me the opportunity to run real programs, coming from an XT! *Edited to add an example: I could use AutoCAD for the first time. The price difference between a 286 and a 386SX was negligible, but the software I could use was in another league.
Yeah, by the time we were getting into it the 486 was already out, but we wanted the real 32-bit bus and had to be a bit careful when looking at used computers (as by that time 386SX and DX machines were about the same price).
How was the person incorrect that speed increases won't continue forever? The Pentium 4 hit 3.8 GHz, and a Ryzen 7 runs at 4.7 GHz some 20-odd years later.
Clock speeds used to go up in a straight line (the popular "interpretation" of Moore's law), but once the P4 hit a (kind of useless) 3.8 GHz, we leveled off for decades.
More specifically, it was the end of Dennard scaling [0] that killed off the growth in clock speeds in the mid-2000s
[0] https://en.wikipedia.org/wiki/Dennard_scaling
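A one-line sketch of the mechanism behind that: dynamic switching power goes roughly as

```latex
P \approx \alpha \, C \, V^2 \, f
```

Under Dennard scaling, each process shrink also lowered the supply voltage V, so power density stayed roughly flat while f climbed. Once V could no longer drop much (leakage currents start to dominate), any further increase in f translated directly into more heat, and clock speeds stalled.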
Uuh! I recall I had this setup, not in '89 but sometime in the early 90s.
Played some awesome games, like Doom and Wolfenstein. Later, Duke3D was the shit. But I can't remember if I ran it on the same setup or something newer.