
I wonder how common were full game VMs in the 90s. For a game older than myself, wouldn't a VM layer incur a great performance penalty on PCs from that time?


It was far more important to have the same software work on Amiga, x86 (DOS), Mac and the whole slew of different machines that came and went.

Today we have fewer machines than the great explosive growth of the 80s.

Consider that most 'software' today is JavaScript interpreted by the Web Browser. It's not like those portability concerns didn't exist in the 80s, if anything, it was harder because you had to make your own interpreter back then.

---------

Many (maybe most?) video games seem to have been written in a VM, at least before Doom / high performance 3d graphics.

I think console games were in C/Assembly for performance.

But 'computer' games at that time predated the standard IBM PC, or at least predated the PC's victory and Microsoft's dominance. When you didn't know whether the Amiga, PC-98, IBM PC, Mac, or something else would win, it only made sense to write a VM.

SCUMM (Monkey Island and many others) comes to mind.


The Infocom text adventures (e.g. Zork) were based on a VM

https://en.wikipedia.org/wiki/Z-machine
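The heart of a VM like this is just a fetch-decode-execute loop over bytecode. A minimal sketch in Python — the opcodes here are invented for illustration, not the real Z-machine instruction set:

```python
# Minimal bytecode interpreter sketch; opcodes are hypothetical,
# not the actual Z-machine ones.
PRINT, PUSH, ADD, HALT = range(4)

def run(code, data):
    stack, pc = [], 0
    while True:
        op = code[pc]; pc += 1
        if op == PUSH:            # push literal operand
            stack.append(code[pc]); pc += 1
        elif op == ADD:           # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == PRINT:         # print a string from the data table
            print(data[code[pc]]); pc += 1
        elif op == HALT:
            return stack

# "program": push 2, push 3, add, print string 0, halt
program = [PUSH, 2, PUSH, 3, ADD, PRINT, 0, HALT]
run(program, ["You are in a maze of twisty little passages."])
```

Porting the game to a new platform then only means rewriting that small loop (plus I/O), while all the game data stays byte-identical.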


Which still works today, with libre implementations (Frotz) and OOP-based compilers targeting the Z-machine (Inform 6 + inform6lib).


> Consider that most 'software' today is JavaScript interpreted by the Web Browser.

I thought most software was MS Excel sheets with interacting formulae :-)


And as it happens, early versions of Excel used a bytecode running on a VM instead of native code. Though the motivation was not portability, but rather memory requirements:

> In most cases, p-code can reduce the size of an executable file by about 40 percent. For example, the Windows Project Manager version 1.0 (resource files not included) shrinks from 932K (C/C++ 7.0 with size optimizations turned on) to 556K when p-code is employed.

> Until now, p-code has been a proprietary technology developed by the applications group at Microsoft and used on a variety of internal projects. The retail releases of Microsoft Excel, Word, PowerPoint®, and other applications employ this technology to provide extensive breadth of functionality without consuming inordinate amounts of memory.

http://sandsprite.com/vb-reversing/files/Microsoft%20P-Code%...
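The ~40 percent figure checks out against their own example:

```python
# Size reduction from the Project Manager example quoted above
native, pcode = 932, 556  # kilobytes
reduction = 1 - pcode / native
print(f"{reduction:.0%}")  # → 40%
```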


> Many (maybe most?) video games seem to have been written in a VM, at least before Doom / high performance 3d graphics.

This was not terribly common, for the obvious performance reasons. Another World ran at around 10-20FPS on most of the systems it was released for, which is fine for a methodical game like that (and for adventure games like Monkey Island, etc.) but doesn't work for fast action games.

And of course VM games were basically impossible for the entire 8-bit era, with the exception of things like Zork (and the rest of Infocom's Z-Machine games) whose performance needs were so small that the gigantic overhead of an 8-bit VM was hardly noticeable.

Even into the 16-bit era, the majority of multi-platform games were fully rewritten ports.


Yes, but you weren't doing things like _Elite_ in these sorts of special-purpose VMs. Aside from the portability issue (extremely important when platforms had the lifespan of mayflies back then as Moore's law blazed along at full speed), VMs also got you compression. A full compiled binary might be exorbitantly expensive in disk/tape space, not to mention RAM. But a very small VM could make a custom-tailored language to interpret on the fly, and save a ton of space where you needed to sweat every kilobyte. (Think about the difference in size between `print "Hello world!"` and the default compiled binary.) It didn't matter how fast your text adventure ran if it couldn't fit in X kb of space.
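The compression point is easy to demonstrate. The Z-machine, for instance, kept an abbreviation table so common words cost one byte instead of several. A toy version of that trick (the table and encoding here are invented, not the real format):

```python
# Toy Z-machine-style abbreviation compression: frequent words are
# stored once in a table and referenced by a one-byte index.
ABBREV = ["the ", "you ", "is ", "of "]  # hypothetical table

def compress(text):
    out, i = [], 0
    while i < len(text):
        for idx, word in enumerate(ABBREV):
            if text.startswith(word, i):
                out.append(bytes([0x80 | idx]))  # high bit marks an abbreviation
                i += len(word)
                break
        else:
            out.append(text[i].encode("ascii"))  # literal character
            i += 1
    return b"".join(out)

msg = "the lamp is the only source of light you have"
packed = compress(msg)
print(len(msg), "->", len(packed))  # prints "45 -> 32"
```

Across kilobytes of adventure-game text, savings like that were the difference between fitting on the media or not.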


During the heyday of assembly language, VMs were common in business software as well. It made porting to different types of systems easier in a time when standards-compliant C-language compilers targeting a variety of systems did not yet exist or were very expensive.


Oh, I know about the emulation layers of early computers! But I'd assume those programs rarely required frame-perfect input unlike video-games. Wouldn't that be too wasteful and needlessly limit the playerbase?

Edit: after reading through wikipedia, I think maybe a VM wouldn't be that wasteful, since the game is very simple mechanically.


Consoles (and arcades) back then had far better graphical performance than computers. So computer games didn't rely on frame-perfect inputs at all... or at least good ones didn't (bad games like TMNT for PC/DOS did exist, but were horribly buggy and broken).

Computer games leaned on more expansive input schemes, like Civilization, or needed the use of a mouse.

Not so much action / frame perfect stuff. Not until a bit later anyway. Eventually computers were fast enough for arcade ports but computer games just didn't really target that action niche.

------

The 'computers' with good graphics were machines like the Amiga, not x86-based DOS boxes with mode 13h graphics. So it was all the fallen/failed computers that had the decent action games, IIRC.


The trick to these earlier VMs, from the Infocom Z-Machine and Wizardry's interpreted Pascal code, through SCUMM, Sierra AGI and SCI, Another World, the Horrorsoft games, etc., is that they recognized that the games they were making were primarily going to be "content-delivery mechanisms": lots of text and graphical assets, driven by relatively simple computations: the authoring constraint is only related to the hardware in terms of I/O and data compression. So the code that was being run by the interpreter was mostly run-once "initialize the scene" and then some animation timers.
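In code terms, a "scene" in one of these engines boils down to a generic interpreter plus a run-once setup script that places assets and registers timers. A self-contained toy illustration — every name here is invented, not any real engine's API:

```python
# Toy illustration of the "content-delivery" split: the engine is
# generic; the scene is data plus a run-once setup script.
class Engine:
    def __init__(self):
        self.actors, self.timers = {}, []

    def place_actor(self, name, x, y):
        self.actors[name] = (x, y)

    def every(self, ticks, action):
        self.timers.append((ticks, action))

    def tick(self, t):
        # service animation timers; this is most of what runs per frame
        for ticks, action in self.timers:
            if t % ticks == 0:
                action()

def scene_docks(engine):
    # run-once setup, then the engine just services timers and input
    engine.place_actor("hero", x=40, y=120)
    engine.place_actor("shopkeeper", x=200, y=110)
    engine.every(30, lambda: print("shimmer the water palette"))

e = Engine()
scene_docks(e)
e.tick(30)  # prints "shimmer the water palette"
```

Since `scene_docks` is just data and trivial calls, it compiles to a handful of bytecode ops; the per-frame cost lives in the native engine, which is why interpretation overhead barely mattered for these games.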

The opposing idea is represented more by arcade gaming, and later, stuff like Doom and Quake: The game is relatively intimate with the hardware in what it simulates, while the kind of definition that makes up a scene is more on the order of "put a monster here and a health pickup there", which aligns it towards being map data, instead of scripted logic.


Depends on what you consider 'full game VM'. Adventure games from Infocom ran all game code on a VM, and so did the graphical adventures from Sierra and LucasArts. The latter two used some native graphics primitives of course.


SCUMM is a good example. There was a port for the original DS that ran well enough.

https://wiki.scummvm.org/index.php/Nintendo_DS


Another World is on a whole other level. SCUMM is from '89 and the NDS came out in 2004. Another World came out in 1991, and because it used a VM it could be back-ported to the Apple IIGS (1986), a computer 5 years older than the game itself!

The graphics exclusively used real-time rendered polygons with support for transparency, which nobody knew was even possible at the time. Along with researching the new rendering tech, the same person created everything else except the music - the memorable & immersive world, an original story, concept and cover art, strong cinematics that were state of the art at the time, graphics and animation, innovative level design, puzzles, the game logic - over just 2 years. It also defined a new 'cinematic platformer' genre, with later titles like Flashback, Blackthorne, Oddworld, and the recent LUNARK. It's simply an incredible feat.


> real-time rendered polygons with support for transparency, which nobody knew was even possible at the time

Aegis Animator was doing pretty much the same sort of rendering on the Amiga, in 1985.

I never did much with it, what with being a kid at the time, but it was fun to play with and looked pretty cool. I don't think its rendering was as tightly optimized as Another World's was, though.


I don't know any more full game examples, but...

Earthbound (SNES, 1994) contains TWO complete scripting systems, one for the dialog system (which is occasionally used for things it shouldn't be; most of the shop logic is in it), and one for scripting sprite movement. The dialog script is actually quite impressive and easy to use; I'd consider implementing a similar system even in a modern RPG. The sprite movement script is trash, significantly harder to work with than games that use raw assembly. Apparently that movement script system was actually a common in-house library at HAL, dating back to the NES era, but I don't know too much about that history.

Also most of the game's assembly was actually compiled from C, which was almost unheard of for console games at the time.
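A dialog bytecode like that typically interleaves literal text with control codes for pauses, name substitution, menus, and so on. A rough sketch of the idea — the codes and layout here are invented, not HAL's actual format:

```python
# Sketch of a dialog-script interpreter in the Earthbound style:
# plain bytes are text, values >= 0xF0 are control codes.
# These encodings are hypothetical, not the game's real ones.
CTRL_PAUSE, CTRL_NAME, CTRL_END = 0xF0, 0xF1, 0xFF

def run_dialog(script, player_name):
    out, i = [], 0
    while script[i] != CTRL_END:
        b = script[i]; i += 1
        if b == CTRL_NAME:
            out.append(player_name)   # substitute the player's name
        elif b == CTRL_PAUSE:
            out.append("[pause]")     # wait for a button press
        else:
            out.append(chr(b))        # literal character
    return "".join(out)

script = bytes("Welcome, ", "ascii") + bytes([CTRL_NAME, CTRL_PAUSE, CTRL_END])
print(run_dialog(script, "Ness"))  # → Welcome, Ness[pause]
```

Writers can then author text without touching assembly, which is presumably why the dialog system ended up absorbing shop logic too.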


Well, the idea of a VM wasn't too foreign, if my reading of computing history is right. Consider that Java launched in 1995 and its big claim to fame was its portability between systems, and that was because of the JVM, or Java Virtual Machine.

I don't think virtual machines and emulation are that new of a thing. Virtualizing x86 at full speed on consumer hardware has been a thing for, what, 15 to 20 years? And sure, that requires special processor features, but remember that the systems that came before, which would need to be emulated, had even lower computational demands. IIRC, a widely used piece of point-of-sale software from the 80s has been running in emulation, on POS hardware that far exceeds its requirements, for at least the last 25 years.

Also, my understanding is that lots of crucial government and business software runs on many layers of virtualization.

And my last recollection from what I've gathered is that, until around the mid-90s, a lot of operating systems were pretty much hypervisors running programs that were virtual machines themselves. Multitasking was simply routing hardware resources to a given program, which was sort of its own environment.


Memory was often the constraint on low-end computers "back in the day", so code density was a reason to have a VM. This is why Wozniak shipped a VM in the Apple II's ROM.

https://archive.org/details/byte-magazine-1977-11/page/n147/...
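That ROM VM was SWEET16, a 16-bit virtual machine whose compact register ops replaced multi-instruction 6502 sequences. The density win is easy to see in a toy register VM of the same flavor — the encodings below are invented, not actual SWEET16:

```python
# Toy 16-bit register VM in the SWEET16 spirit: one byte packs the
# op (high nibble) and register number (low nibble). Hypothetical encoding.
SET, ADD = 0x1, 0x2   # SET Rn, imm16  /  ADD Rn (R0 += Rn)

def run(code):
    regs, pc = [0] * 16, 0
    while pc < len(code):
        op, n = code[pc] >> 4, code[pc] & 0xF
        pc += 1
        if op == SET:    # next two bytes are a little-endian 16-bit immediate
            regs[n] = code[pc] | (code[pc + 1] << 8)
            pc += 2
        elif op == ADD:  # R0 acts as the accumulator
            regs[0] = (regs[0] + regs[n]) & 0xFFFF
    return regs

# R1 = 1000; R0 = 2000; R0 += R1  ->  3000, in just seven bytes of "code"
prog = bytes([0x11, 0xE8, 0x03, 0x10, 0xD0, 0x07, 0x21])
print(run(prog)[0])  # → 3000
```

Doing the same 16-bit arithmetic in raw 6502 takes separate loads, adds, and carry handling for each byte, so the interpreted form is much denser even though it runs slower.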


VMs have already existed since PL/0, the prototype of Pascal. The approach is also known as P-code, and to be honest it is fine, especially when you can leverage a JIT, which trades memory space for gains in speed.


If you mean bytecode as an executable format, that originates in the late 1950s / early 1960s, with microcoded CPUs as the interpreter, of which the Burroughs Large Systems are among the most famous.


Oh, Burroughs Large Systems. The one Edsger Dijkstra worked on. Can't believe they adopted microcode before IBM did.


"older than myself"

#rightinthefeels




