
Thread: A new article about x86 processors

  1. #111


    Quote Originally Posted by Chuck(G) View Post
    Even Microsoft was wary of the .COM file. Take a standard .EXE file and rename it to .COM on, say, MSDOS 5. You'd expect that DOS would just read it into memory and execute it as-is. That doesn't happen--there's a check in MSDOS for the "MZ" signature and the file is loaded and executed as a regular .EXE file.
    I don't know if I'd interpret that as a specific precaution "against" the .COM format; it also works the other way around - rename a .COM to .EXE and run it, DOS will handle it no-questions-asked. .COM has its obvious disadvantages, but in the context of an environment like MS-DOS where any code owns the system no matter how it's linked, I can't see much of a reason to be more wary of one format than another.
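    For reference, the check Chuck describes boils down to something like this -- a sketch in C, not the actual MS-DOS code; only the two-byte signature matters, not the extension:

    Code:
    #include <stdio.h>

    /* Sketch: a DOS-style loader picks the format from the first two bytes
       of the file, regardless of whether it is named .COM or .EXE. */
    int looks_like_exe(const char *path)
    {
        unsigned char sig[2];
        FILE *f = fopen(path, "rb");
        if (!f)
            return 0;
        size_t n = fread(sig, 1, 2, f);
        fclose(f);
        /* "MZ" => load as .EXE (some DOS versions also accept "ZM");
           anything else => treat as a flat .COM image at offset 100h. */
        return n == 2 && ((sig[0] == 'M' && sig[1] == 'Z') ||
                          (sig[0] == 'Z' && sig[1] == 'M'));
    }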
    int10h.org :: blog

  2. #112
    Join Date: Jan 2007 | Location: Pacific Northwest, USA | Posts: 31,462 | Blog Entries: 20

    Yes, they're equally insecure, although .EXE has several advantages that I can think of right off the top of my head:

    (1) "Scatter" loading--unused or BSS areas needn't be carried along in the file.
    (2) Metadata (see the header sketch below)
    (3) The ability to carry additional payload (without loading it), such as overlays and other data.

    .COM just harks back to a more primitive time.
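    For reference, a sketch of the 16-bit .EXE (MZ) header in C -- field names as commonly used, not taken from any particular DOS source. MINALLOC is what lets BSS stay out of the file, and the relocation table and overlay number are exactly the metadata a .COM file has no room for:

    Code:
    #include <stdint.h>

    #pragma pack(push, 1)
    struct mz_header {
        uint16_t e_magic;    /* "MZ" signature                                      */
        uint16_t e_cblp;     /* bytes used in the last 512-byte page                */
        uint16_t e_cp;       /* number of 512-byte pages in the file                */
        uint16_t e_crlc;     /* number of relocation entries                        */
        uint16_t e_cparhdr;  /* header size in 16-byte paragraphs                   */
        uint16_t e_minalloc; /* extra paragraphs needed (BSS stays out of the file) */
        uint16_t e_maxalloc; /* extra paragraphs wanted                             */
        uint16_t e_ss;       /* initial (relative) SS                               */
        uint16_t e_sp;       /* initial SP                                          */
        uint16_t e_csum;     /* checksum                                            */
        uint16_t e_ip;       /* initial IP                                          */
        uint16_t e_cs;       /* initial (relative) CS                               */
        uint16_t e_lfarlc;   /* file offset of the relocation table                 */
        uint16_t e_ovno;     /* overlay number                                      */
    };
    #pragma pack(pop)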
    Last edited by Chuck(G); December 16th, 2018 at 10:56 AM.

  3. #113


    Quote Originally Posted by Chuck(G) View Post
    In normal code, what's the point? Aside from machine state transitions between privileged and user mode (or when servicing interrupts), why is this such a sticking point to have a user-accessible condition code register?
    Because the 68k changes almost all the flags after almost every instruction; it even alters carry and overflow after every MOVE, even one from memory to memory. Sometimes it is useful to save the arithmetic flags and restore them later, and x86 has PUSHF/POPF and LAHF/SAHF for that. I can't find any practical reason for making MOVE from SR privileged.
    I don't understand the point about BSS and the COM file: BSS doesn't need to be kept in a COM file at all. The COM format also provided the easiest way to write assembly code; for example, you can just save bytes from a debugger and run them.
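    As an illustration of how little a .COM file needs, here is a sketch (ordinary C, file name made up) that writes a runnable one from raw bytes -- no header, no relocation, just code loaded at offset 100h:

    Code:
    #include <stdio.h>

    /* These bytes print "Hi!" via DOS int 21h/AH=09h and return;
       RET jumps to PSP:0000, which holds INT 20h (program terminate). */
    static const unsigned char program[] = {
        0xB4, 0x09,        /* mov ah, 09h   ; print '$'-terminated string */
        0xBA, 0x08, 0x01,  /* mov dx, 0108h ; address of the message      */
        0xCD, 0x21,        /* int 21h                                     */
        0xC3,              /* ret           ; back to PSP:0000            */
        'H', 'i', '!', '$' /* the message, located at offset 0108h        */
    };

    int main(void)
    {
        FILE *f = fopen("HI.COM", "wb"); /* any name ending in .COM will do */
        if (!f)
            return 1;
        fwrite(program, 1, sizeof program, f);
        fclose(f);
        return 0;
    }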

  4. #114
    Join Date: Jan 2007 | Location: Pacific Northwest, USA | Posts: 31,462 | Blog Entries: 20

    You can't "scatter-load" COM files. So if your data-to-address space distribution is sparse, you've got to include all of it in a COM file. Really, a .COM file is an outdated notion, best suited to ROMs and boot loaders. There's no reason not to include metadata in an executable--and the COM file prevents that.

    As far as not being able to store condition codes (not the status register, which has considerably more information not useful to a user-mode program) as a word in user mode, I don't see the benefit--and I've used a fair number of ISAs in my time--with and without condition codes. If you absolutely have to save and restore condition codes, there are ways to do it without special instructions.
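    One such way, sketched in C: rather than saving the flags register itself, capture the conditions you care about as ordinary values at the point where they are produced (names here are made up for illustration):

    Code:
    #include <stdint.h>
    #include <stdbool.h>

    /* Keep the "condition codes" of a 16-bit add as plain data --
       no flag-save instruction, privileged or otherwise, required. */
    typedef struct {
        bool carry;
        bool zero;
        bool negative;
    } saved_cc;

    static saved_cc add_and_capture(uint16_t a, uint16_t b, uint16_t *sum)
    {
        uint32_t wide = (uint32_t)a + b;
        *sum = (uint16_t)wide;
        saved_cc cc = {
            .carry    = wide > 0xFFFFu,          /* carry out of bit 15 */
            .zero     = (uint16_t)wide == 0,
            .negative = (wide & 0x8000u) != 0,
        };
        return cc; /* "restore" later by simply reading the struct */
    }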

  5. #115


    Indeed, the COM format lacks a lot of the features of more complex formats, but its existence doesn't rule the others out. If you need a sparse image or some other more advanced feature, you can simply use the EXE format. IMHO the COM format is just a container for a kind of binary scripting: we have textual scripts, so why shouldn't we have binary ones?
    The 68000 allows the arithmetic flags to be read with MOVE from SR, but later 68k parts don't allow it in user mode. IMHO it was never a really big problem, just mildly irritating because the restriction was so poorly justified.

  6. #116
    Join Date: Jan 2007 | Location: Pacific Northwest, USA | Posts: 31,462 | Blog Entries: 20

    It didn't seem to hurt Apple...

    Why not .HEX and S-format files as well? Even plain old 16-bit .EXE went the way of the dodo in today's 32/64-bit world.

  7. #117


    Quote Originally Posted by lowen View Post
    Sorry, should have been comp.os.cpm instead. Google Groups is the best way of reaching and searching old Usenet archives. Here's one example: https://groups.google.com/forum/#!se...w/kZwD7TNxhz4J
    I have checked this archive and found almost nothing about Zilog's history. So I am still curious why F. Faggin stayed away from Z8000 development and why he abandoned the modernization of the Z80.

  8. #118
    Join Date: Jan 2014 | Location: Western North Carolina, USA | Posts: 1,177

    Quote Originally Posted by vol.litwr View Post
    I have checked this archive and found almost nothing about Zilog's history. So I am still curious why F. Faggin stayed away from Z8000 development and why he abandoned the modernization of the Z80.
    The discussion I linked to was more about missteps with Z800/Z280 than about Z8000 versus super-Z80. Faggin did some interviews over the years, but I haven't found one that specifically addresses this. One of his interviews can be found at https://ethw.org/Oral-History:Federico_Faggin, which includes the audio so you can hear his answers as well as read them.

    My gut feel, given all the forward progress that was going on in the late 1970s, is that backwards compatibility simply wasn't that important to companies at the time. Neither Intel nor Motorola considered binary backwards compatibility a key feature when building their 16-bit chips, so why should Zilog?

    Of course, users had different ideas. Scott mentioned above that he didn't really understand nostalgia for the Z280, and I've been meaning to reply to that point, but my interest at least is to finally have what was promised all those years ago. Sure, there are better processors today, but having the best processor has never been the point of nostalgia.

    The type of nostalgia I'm talking about is more like someone who has a '56 Corvette but wants to drop a Mystery Motor in it, giving the real promise of the 'vette several model years earlier than the 427's introduction as the L71 in the '67 'vette (yeah, I know about the Z11, but that beast was only in one model of the '63 Impala....), and besides all that, the first-gen 'vette gave a completely different driving experience than C2. (My dad rebuilt 427s for drag racers, so I know just a wee bit about them.... although my dad's favorite was the 327 small-block...)

    It's also like those who retrobrew second-gen hemis into Dodge cars for which the 426 was never available from the factory... that's the essence of retrobrewing, which is a different type of nostalgia than just simple interest in vintage gear. It's more of a "here's what I would have done then while knowing what I know now" than the "let's do the same thing all over again" of pure vintage nostalgia.
    Last edited by lowen; December 29th, 2018 at 08:12 AM.
    --
    Bughlt: Sckmud
    Shut her down Scotty, she's sucking mud again!

  9. #119


    Quote Originally Posted by lowen View Post
    The discussion I linked to was more about missteps with Z800/Z280 than about Z8000 versus super-Z80. Faggin did some interviews over the years, but I haven't found one that specifically addresses this. One of his interviews can be found at https://ethw.org/Oral-History:Federico_Faggin, which includes the audio so you can hear his answers as well as read them.

    My gut feel, given all the forward progress that was going on in the late 1970s, is that backwards compatibility simply wasn't that important to companies at the time. Neither Intel nor Motorola considered binary backwards compatibility a key feature when building their 16-bit chips, so why should Zilog?
    Thank you, but I have indeed read that oral history; it contains almost nothing about the Z8000 or Z800. And sorry, I know little about cars. I am just curious about computer history. It is not exactly nostalgia, because I was never very fond of the Z80; IMHO the 6502, 8085, 6809 and maybe the T11 were better, though the Z80 proved itself a quite decent chip. So I am simply curious what exactly went wrong with the Z8000 and the Z800/Z280. Why was the Z8000 used so rarely? Was it too expensive, too slow, too buggy? Why did the Z800/Z280 appear so late? Was it too slow, ...? It seems strange that we still don't have enough information on these topics. Of course, modern processors are good, but they could be better.

  10. #120
    Join Date: Jan 2007 | Location: Pacific Northwest, USA | Posts: 31,462 | Blog Entries: 20

    While going through old journals headed for recycling, I stumbled upon an issue of IEEE Micro magazine that benchmarks the then-new 32-bit CPUs (Intel, NSC, Motorola, Zilog...) Anyone interested in the results?
