Hacker News
Let's compile Quake like it's 1997
boznz
I have always over-specified the microcontrollers a little from that point on, and kept a copy of the original dev environment. Luckily all my projects are now EOL, as I am retired.
travoc
I doubt that everything you ever worked on is end-of-life. Some of it is still out there...
direwolf20
Casey Muratori would point out that the debugger ran faster on hardware from the era than modern versions run on today's hardware, though I don't have a link to the side-by-side video comparison.
Edit: Casey Muratori showing off the speed of Visual Studio 6 on a Pentium-something after ranting about it: jump to 36:08 in https://youtu.be/GC-0tCy4P1U. The earlier section of the video shows how it is today (or was when the video was made).
kelnos
In this case there was: the reason you needed to reinstall to go from uniprocessor to SMP was that NT shipped with two HALs (Hardware Abstraction Layers): one supporting just a single processor, and one supporting more than one.
The SMP one had all the code for things like CPU synchronization and interrupt routing, while the UP one did not.
If they'd packed everything into one HAL, single-processor systems would have had to take the performance hit of all the synchronization code even though it wasn't necessary, and memory usage would have been higher too. I expect you probably could run the SMP HAL on a UP system (unless Microsoft added code to prevent it), but you wouldn't really want to, as it would be slower and require more RAM.
So it wasn't that those abstraction layers didn't exist back then. It was that abstraction layers can be expensive. This is still true today, of course, but we have the cycles and memory to spare, more or less, which was very much not the case then.
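To make the tradeoff concrete, here's a rough sketch of my own (CONFIG_SMP and the lock type are made up for illustration; this is not NT's or Linux's actual code) of how a dedicated uniprocessor build can avoid paying for synchronization at all:

    /* Illustrative only: a UP build can compile the locking away entirely,
       which is roughly the saving a separate uniprocessor HAL gives you. */
    #ifdef CONFIG_SMP
    typedef struct { volatile int locked; } spinlock_t;

    static inline void spin_lock(spinlock_t *l)
    {
        /* Real SMP locking needs an atomic read-modify-write (and the bus
           traffic that comes with it) so two CPUs can't both enter. */
        while (__sync_lock_test_and_set(&l->locked, 1))
            ;  /* spin until the other CPU releases it */
    }

    static inline void spin_unlock(spinlock_t *l)
    {
        __sync_lock_release(&l->locked);
    }
    #else
    /* Uniprocessor build: nothing to race with, so the lock is free.
       The calls should inline to nothing at all. */
    typedef struct { char unused; } spinlock_t;
    static inline void spin_lock(spinlock_t *l)   { (void)l; }
    static inline void spin_unlock(spinlock_t *l) { (void)l; }
    #endif

    /* Tiny demo: builds and runs with or without -DCONFIG_SMP. */
    static spinlock_t lk;
    int main(void) { spin_lock(&lk); spin_unlock(&lk); return 0; }

Ship both configurations as prebuilt binaries and you have essentially the two-HAL situation: the UP one is smaller and faster, but switching means swapping the binary (or, with NT, reinstalling).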
Sesse__
Linux also used to be like this, but these days it has unified MP/UP kernels; on single-CPU systems (or if you boot with nosmp), the extra code is patched away at boot time. It wouldn't have been an unheard-of technique at the time.
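For a feel of what "patched away at boot time" means, here's a toy user-space sketch of the idea (my own illustration, assuming x86-64, GCC, Linux, and that nothing forbids a writable+executable code page; as I understand it the kernel's real mechanism records every LOCK prefix in a dedicated section and is considerably more involved): if only one CPU is found at startup, the LOCK prefix is overwritten with a NOP so the hot path never pays for it.

    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    static long counter;
    extern uint8_t lock_prefix[];       /* defined by the asm label below */

    static void __attribute__((noinline)) counter_inc(void)
    {
        /* The label marks the LOCK prefix byte (0xF0) of this instruction. */
        asm volatile(".globl lock_prefix\n"
                     "lock_prefix: lock; incq %0" : "+m"(counter));
    }

    static void patch_out_lock(void)
    {
        long pg = sysconf(_SC_PAGESIZE);
        uint8_t *page = (uint8_t *)((uintptr_t)lock_prefix & ~(pg - 1));
        /* Make the code page writable, turn LOCK into a NOP, then restore. */
        mprotect(page, pg, PROT_READ | PROT_WRITE | PROT_EXEC);
        lock_prefix[0] = 0x90;          /* LOCK (0xF0) -> NOP (0x90) */
        mprotect(page, pg, PROT_READ | PROT_EXEC);
    }

    int main(void)
    {
        if (sysconf(_SC_NPROCESSORS_ONLN) == 1)
            patch_out_lock();           /* "boot-time" patching, UP case */
        counter_inc();
        printf("counter = %ld\n", counter);
        return 0;
    }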
amluto
CDs were around and hard drives weren’t that small at the time. (Or maybe the really early SMP versions predated widespread availability of CD-ROMs, but I remember dealing with this nonsense and reinstalling from an MSDN CD set.)
knorker
Is that accurate? I thought DJGPP only ran on, and targeted, PC-compatible x86. id had Alpha machines for things like running qbsp and light and vis (these took forever to run, so the Alpha SMP was really useful), but for building the actual DOS binaries, surely this was DJGPP on an x86 PC?
Was DJGPP able to run on Alpha for cross compilation? I'm skeptical, but I could be wrong.
Edit: Actually it looks like you could. But did they? https://www.delorie.com/djgpp/v2faq/faq22_9.html
jeffrallen
I used it in the mid-'90s and yes, it was eye-opening. On the other hand, I was an Emacs user at uni, and by studying a bit of the history of Emacs (especially Lucid Emacs) I came to understand that the concepts in Visual Studio were nothing new.
On the third hand, I hated customizing Emacs, which did not have "batteries included" for things like "jump to definition", not to mention a package manager. So the only times in the late '90s I got all the power of a modern IDE were when I was doing something that needed Windows and Visual Studio.
webdevver
There was another article where someone bootstrapped the very first version of GCC that had the i386 backend added to it, and it turned out there was a bug in the codegen. I'll try to find it...
EDIT: Found it; in fact there was an HN discussion about an article referencing the original article: