
Thread: 8259A PICs in PC, XT and AT

  1. #1
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default 8259A PICs in PC, XT and AT

    I have recently been toying around with the auto-EOI mode of the 8259A, to try and save a few bytes/cycles for time-critical timer interrupt stuff.
    I got it working on my IBM 5160.
    But then I got a 286 machine, and well... the plot thickened.

    Namely, I wanted to reset the PIC back to its original state when my program exited. So I took the code from the IBM PC BIOS and put that in my application.
    Since an AT-class machine has two PICs, it would need to be restored to the cascaded setup instead (one of those rare occasions where you have to know exactly what hardware you're running on, because you cannot assume backward compatibility).
    So I added some code to detect the second PIC, and use the code from the AT BIOS in that case.

    So far so good, the detection worked, and I got it working on both my 5160 and my 486 clone.
But when I tried to run it on my 286, it didn't work. I found that if I kept sending manual EOIs, the system worked correctly, so for some reason it simply hadn't switched into auto-EOI mode.

    So instead of just setting the first PIC up in a PC/XT configuration and ignoring the second PIC, I instead took the full AT BIOS setup, and just changed the auto-EOI bit there. Then it magically started working on my 286.
After some troubleshooting I found out that running a single PIC was not the issue. The problem was that the PC/XT sets ICW4 to a value of 9, whereas the AT sets it to a value of 1. The difference here is the 'buffered mode' bit.
    For some reason, auto-EOI doesn't work on my 286 if I set buffered mode.

So, perhaps someone here can shed some light on this, e.g.:
    - Why does the AT initialize the 8259As with a different setting for ICW4? Is this because of changes in the circuit when adding the second PIC?
    - What exactly is the effect of enabling/disabling buffered mode?
    - How can auto-EOI be affected by the buffered mode setting? It doesn't seem to make sense to me. Auto-EOI is something that the 8259A should do internally, right? So how could it be affected by the buffered mode setting, which controls a signal to the outside world?

    My 286 is a late model, which has an integrated chipset, so there are no physical 8259As present (and they probably integrated the whole two-PIC circuit in a single chip). It is possible that the implementation of the 8259A circuit is slightly bugged here, and the behavior is different from a real IBM AT (in which case my 486 appears to have a more accurate integrated chipset design, as far as the 8259A goes).
    Does anyone have a real IBM AT to test this?
I have some simple test-code in a Turbo C++ project here anyway; it contains both source and binary: https://www.dropbox.com/s/4d31o2aeti...I_286.zip?dl=0

  2. #2
    Join Date
    Jun 2012
    Location
    UK - Worcester
    Posts
    2,721

    Default

    Without me checking the schematic diagrams for your particular machine - I can offer the following observation(s):

When performing end-of-interrupt processing, communication takes place between the master and slave PIC(s) (8259 devices). If all of the PICs are interconnected directly - then there tends to be no problem and buffered mode is not required. However, if the PICs are not interconnected directly - then buffers may need to be enabled between the devices to permit that communication.

    We use Intel 286/10A CPU boards in a MULTIBUS 1 environment and this is exactly the issue we have. Some of the slave PICs are on different cards to the CPU and (therefore) the MULTIBUS data/address buffers need to be enabled when performing master/slave PIC interactions. If I remember correctly, this is via the not(SP)/not(EN) pin (16) on the PIC.

    You should find (if you peruse the schematic diagram) that the two PICs may not be directly connected to each other.

    Also note that buffered mode needs to be selected for both the MASTER and SLAVE PICs (ICW4=XXXX10XX for buffered slaves and ICW4=XXXX11XX for a buffered master).

    Dave

  3. #3
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default

    Thanks for that input, Dave. Things are slowly becoming a bit more clear.
    I think what is confusing here is that the PC/XT BIOS sets up the single PIC in buffered slave mode (it sends 09h, so 00001001b). I wonder if this is deliberate or not, and if so, what the reason is for that choice.
    The PC/AT BIOS sets both PICs up in unbuffered mode. From what I understood, in that case, the SP/EN pin is used to indicate whether a PIC should be master or slave. So their roles are hardwired, it seems, not programmed by the ICW values.
    I think that's what we see here at page 1-76: http://www.minuszerodegrees.net/manu...2243_MAR84.pdf
    Master has SP/EN to +5v, and slave has SP/EN to GND.

    I also looked up what the 5150 looks like. See page 1-37: http://www.minuszerodegrees.net/manu...2507_APR84.pdf
    Apparently there is indeed a difference in how the 8259A is wired up in the PC/XT. The SP/EN is not just hardwired to +5v or GND here, but connected to some logic.

    That would explain the slightly different setup code.
    Last edited by Scali; December 8th, 2015 at 03:32 AM.

  4. #4
    Join Date
    Jun 2012
    Location
    UK - Worcester
    Posts
    2,721

    Default

    When ICW4 is set as XXXX0XXX (non buffered mode) then the SP/EN pin is an INPUT with '1' meaning MASTER PIC and '0' meaning SLAVE PIC (i.e. hardwired).

    When ICW4 is set as XXXX1XXX (buffered mode) then the SP/EN pin is an OUTPUT driving (mainly) enable/disable logic for data bus buffers. In this case, determination of the PIC as a MASTER or SLAVE is off-loaded to the software by setting this fact in ICW4 (XXXX10XX or XXXX11XX as a slave or master respectively). In this case, the SP/EN output pin is 'active' whenever the PIC is driving its data bus buffers (e.g. outputting an interrupt vector to the CPU). Depending upon 'where' in the circuit the PIC is located, the SP/EN pin could be used to enable or disable the data bus buffers as appropriate.

    If the SP/EN pin has been wired to some logic - ICW4 MUST specify that the PIC be used in buffered mode - otherwise the PIC will assume a master or slave role depending upon the (random) voltage level on the SP/EN pin!

    If the SP/EN pin has been pulled high or low (i.e. it is an input) then non buffered mode must be set in ICW4.

    It's one of these cases where the hardware design dictates how you program the device... Get it wrong - and it doesn't work as designed...

    Dave

  5. #5
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default

    Quote Originally Posted by daver2 View Post
    If the SP/EN pin has been wired to some logic - ICW4 MUST specify that the PIC be used in buffered mode - otherwise the PIC will assume a master or slave role depending upon the (random) voltage level on the SP/EN pin!
    Ah yes, very good. So indeed, the different circuits mean that you MUST have different settings for ICW4 depending on whether you have a 5150/5160 or a 5170 (and hopefully all clones are faithful to these).

    Quote Originally Posted by daver2 View Post
    It's one of these cases where the hardware design dictates how you program the device... Get it wrong - and it doesn't work as designed...
    Yup, so IBM is doing it right with different initialization settings for their different machine designs.
    The problem is that you can't read back the ICW from a running system, so you won't be able to detect what kind of hardware configuration it has.

    The best I can do is detect whether you have a second PIC, and assume that a single PIC is always set up the way a real PC/XT is, and a dual PIC is always set up the way an AT is.
    We can only hope that the clones are faithful to the IBM design, and use their PICs in the same way.

  6. #6
    Join Date
    Jun 2012
    Location
    UK - Worcester
    Posts
    2,721

    Default

    I think that's it...

    Dave

  7. #7
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default

    Another interesting passage in the 8259A manual is this by the way:
    The AEOI mode can only be used in a master 8259A and not a slave. 8259As with a copyright date of 1985 or later will operate in the AEOI mode as a master or a slave.
    My 5160 is a model from 1987, so probably has post-1985 8259A chips as well. I merely flipped the AEOI-bit in the setup sequence from the BIOS. This means it would run in buffered slave AEOI mode. Which only post-1985 chips would do, apparently.
    I guess I should try to run it in buffered master mode.
Also, I'm not entirely sure why they set it to slave in a single-PIC system, or what the difference would be for a single PIC running in master mode as opposed to slave. As far as I understand it, the master/slave setting decides whether the cascade pins are used as an input or an output. But with a single PIC, nobody is sending on or listening to those pins. The only problem could be in slave mode, when it listens to those pins while they are not connected, so you get random input. But that is not happening, apparently, because it is running in slave mode by default.
    Last edited by Scali; December 8th, 2015 at 07:00 AM.

  8. #8
    Join Date
    Jun 2012
    Location
    UK - Worcester
    Posts
    2,721

    Default

    I can't answer your specific question - but I may be able to shed a bit of additional light...

You may be confusing MASTER/SLAVE with single/cascade. ICW1 D1 (SNGL) specifies whether this is the only PIC in the system (single) or one of multiple (cascade). If single - ICW3 is not accepted (specifying the SLAVE_ID, or which master IR has a slave connected). If using buffered mode in a single-PIC system - then the M/S bit of ICW4 probably is a "don't care". It has to be either a '0' or a '1' because you are writing a complete byte of course.

    Dave

  9. #9
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default

    Quote Originally Posted by daver2 View Post
    I can't answer your specific question - but I may be able to shed a bit of additional light...

You may be confusing MASTER/SLAVE with single/cascade. ICW1 D1 (SNGL) specifies whether this is the only PIC in the system (single) or one of multiple (cascade). If single - ICW3 is not accepted (specifying the SLAVE_ID, or which master IR has a slave connected). If using buffered mode in a single-PIC system - then the M/S bit of ICW4 probably is a "don't care". It has to be either a '0' or a '1' because you are writing a complete byte of course.
    Ah yes, you're probably right, that makes sense.
    Anyway, I'll experiment with setting the master-bit in ICW4 to see if it works like that on my 5160 as well. And I wonder if setting buffered mode with master will work on the 286.

    Update:
    - Setting buffered mode/master with auto-EOI (send 0Fh to ICW4 instead of 0Bh) works fine on my 5160. This may be more compatible with old 8259A chips, so I will stick to this.
    - Setting buffered mode and correctly setting the first PIC to master also works on my 286 (so using the same 0Bh init as for my 5160, ignoring the second PIC, so no ICW3). This seems to support the theory that the 286 may mirror the behaviour of pre-1985 8259A chips. A number of these probably ended up in real IBM ATs as well, so it is not that crazy, although it seems a bit of an anachronism in a 286-20 with a BIOS date of 07/07/91...
    - In order to reset the system to the proper mode on exit, I still need to have separate codepaths for single-PIC and dual-PIC systems. So I might as well use the 'safer' mode of running the PICs in unbuffered mode on machines with two PICs, so I will keep the two separate versions of auto-EOI initialization as well as resetting back to the initial state.

    Thanks for your help and insights. In this case it really helps to know a bit about how the hardware is wired into the system.
    Last edited by Scali; December 8th, 2015 at 09:20 AM.

  10. #10
    Join Date
    Dec 2014
    Location
    The Netherlands
    Posts
    2,024

    Default

I've done some more testing and debugging, and I think I've come up with some good code to set up auto-EOI, and verify that it works, for both 8259As.
I also made the test capable of detecting old or new 8259A behaviour. It detects both integrated 8259As on my 286 clone as 'old', while it tests 'new' on my other machines. I haven't tested on real pre-1985 8259A chips yet, but I hope someone with the right hardware will do that soon.
The behaviour on my 286 is that in buffered slave mode (either standalone or in a cascaded setup), the AEOI flag is ignored. In non-buffered slave mode, or in standalone buffered master mode, AEOI works. This means I can still get both 8259As into auto-EOI mode.

The detection routine is quite simple, once you've figured it out.
Namely, both 8259As have a timer attached to them. The first one has the classic 8253 timer, and on AT machines the second one has the RTC.
    So I set up these timers to generate periodic interrupts, and I install a handler which increments a counter, but does not fire an EOI.
    If the counter increments more than once, you know that the PIC is performing auto-EOI. Otherwise, no EOI is received, so no new interrupts are sent to the CPU.

In a cascaded system, you can test for an old 8259A by setting up the first PIC as standalone, and then setting buffered slave+AEOI, ignoring the second PIC.
    The second PIC can of course be run in buffered slave mode in cascaded mode.

    I'll have a blog up soon, once I have verified that the old/new 8259A detection works on real chips.
You should be able to tell by the (C) '85 copyright marking on the chip. [Attached photos showed an 'old' and a 'new' 8259A.]
    What may complicate matters somewhat is that NEC 8259 chips are also commonly used, and they don't have this copyright afaik. I also don't know if they were ever updated to the 1985-spec (or if they even had the glitch in the first place).
    Looking at the datasheet, it seems it's a copy of the Intel one: http://www.datasheetarchive.com/dlma...DSA-478192.pdf
    But it makes no mention of the 1985 thing. So it probably is a pre-1985 clone, and was never updated.
    According to CPU-World, AMD, Siemens and UMC also made them.
    Last edited by Scali; December 14th, 2015 at 04:02 AM.
