
How to get output from SIO with Intel 8251A USART

IBM_User

Experienced Member
Joined
Jun 11, 2012
Messages
220
Location
Switzerland
I need to know how to get output from my IMSAI SIO-2 board.

First, I initialized the board:

4E 40 4E 37

After that, and a reset, I toggled in:

DB 03 E6 01 CA 08 00 3E 41 D3 02 C3 08 00

This should produce a stream of "A" characters on the terminal. Then a second reset and RUN.

Unfortunately nothing displays on the screen of the terminal.

So I added an LED to the TX line to see if I have output - nothing.

Can someone help me with this?

Here is a link to the IMSAI SIO board and the manual:
http://www.s100computers.com/Hardware Folder/IMSAI/SIO/SIO.htm


Thanks in advance

[Attachment: SIO.jpg]
 
Code notwithstanding - where was the TX line when you applied the LED? At the card, or at the other end? If at the other end, are you sure it's not wired opposite to your expectation (DCE instead of DTE or vice versa), in which case you'd see the data on the "RX" side instead of the TX?
 

1) Not sure of your initialization.
2) Your code: please attach mnemonics next time.
LOOP:
IN 03H
ANI 01H
JZ LOOP
MVI A,41H
OUT 02H
JMP LOOP

It appears that your board is jumpered correctly for 9600 baud RS-232 operation.

If you single-step, does the IN 03 exit from the top loop?

Kipp
 
@TX_dj, at the card - and wired vice versa advisedly.

@kyeakel, sorry, but I'm not so familiar with mnemonics.
 
Ah, the "Neanderthal" Intel 8251--just like the "Neanderthal" 8255--both with their quirks.

Kipp, the reason for the odd initialization is that the 8251 uses the same port for both the mode register and the command register. After a reset (either hardware or software), the 8251 is in a "mode" state. Unfortunately, the mode-initialization sequence for synchronous mode is 3 bytes long, so if some garbage accidentally gets sent as a mode command, everything gets screwed up. So the first pair of bytes serves to flush a possible synchronous-mode initialization, the third byte actually performs the async initialization, and the fourth byte is a command byte.
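
In code, those four bytes would go out something like this (a minimal 8080 sketch, untested, assuming the channel A control port at 03H as elsewhere in this thread):

MVI A,4EH ; dummy byte 1 of the flush
OUT 03H
MVI A,40H ; dummy byte 2 of the flush
OUT 03H
MVI A,4EH ; the real mode byte: async, x16 clock, 8 data bits, 1 stop bit, no parity
OUT 03H
MVI A,37H ; the command byte (or 27H, per the note below)
OUT 03H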

@IBM_User, I don't see anything seriously wrong, although I'd probably change the 37 command to 27; since you're not looking for error flags anyway, why bother to reset them?

At this point, I should probably ask if you've set up the handshaking properly. Since you've set TxEnable in your command, the CTS line must be true (i.e. at the chip, CTS/ on pin 17 must be low); otherwise, the 8251 won't transmit. The naming of the command bit is a little misleading...
 
Whatever I do, nothing works.

@Chuck(G), I changed the 37 to 27, but no success.

I connected only pins 2, 3 and 7 (TX, RX and GND) to the terminal (ADM3A), wired vice versa.

I think something else is wrong with the initialization of the 8251; unfortunately, I don't have much know-how in programming 8080 machines.
 
Okay, let's go back to stage 1.

First of all, I assume that you've read the SIO-2 manual? How are the myriad jumpers set up on your board?

Have you tried the simple echo program in the back of the manual?
 
I read the manual over and over again.

Jumpers:
A1 wired for ASYNC
A3 wired for RS232 (both channels computer end)
A8 wired for RS232
A11 wired for 9600 baud (both channels)
C7 wired for 0-F
D6 wired for I/O mapped

With the echo program from the manual, I don't get any output from the board.
 
Okay, let's see if the 8251 is actually working. Instead of running an infinite loop, modify your code to send, say, 256 characters and then halt. You should be able to easily see a delay between the time that you start the code and when it halts.
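
Something along these lines should do (a rough 8080 sketch, untested; it reuses your ports and assumes the 8251 has already been initialized):

      MVI B,00H ; B counts 256 characters (00 wraps back to 00 after 256 DCRs)
LOOP: IN 03H    ; read 8251 status
      ANI 01H   ; TxRDY?
      JZ LOOP   ; not ready - keep polling
      MVI A,41H ; ASCII 'A'
      OUT 02H   ; send it
      DCR B
      JNZ LOOP  ; loop until 256 characters have gone out
      HLT       ; then halt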

If that works, and the code doesn't simply hang, you need to sniff down the line from the 8251. If you've got a logic probe, go back to running your original program and see if the TxD line on the 8251 is wiggling. It's quite possible that the EIA driver (75180? -- I haven't looked at the schematic yet) is bad--in the old days, they were pretty susceptible to static-discharge damage.
 
Let’s start at the beginning!

It looks as though your ‘hex bashing’ and your understanding of the instruction set at the hex level are pretty good – so I will stick with that for my code in the hope that I won’t confuse you! This takes me back to my youth!

If you can get into assembler programming at the mnemonic level – you will find this much easier and less error prone. You can download pretty good cross-assemblers for Windows or Linux (or whatever your flavour of operating system is) and use your favourite text editor to create the assembler programs, assemble them (and even test them) and then download the resulting binary or hex file to your IMSAI machine for final running. The main advantages are that you can thoroughly comment your program, the assembler handles all the addresses for the jump instructions for you and when your program cr*ps out on your IMSAI you go back to your PC and fix it quite simply. Anyhow, that’s for another day – back to your problem…

I assume you have downloaded, read and understood the technical manual for the Intel 8251 chip? If not, download it from “http://www.datasheetcatalog.org/datasheet/Intel/mXtyswx.pdf”. The name of the PDF file looks a bit funny – so this might be a “once only” download. If so, just do a search for the Intel 8251 datasheet.

The first thing is not to hit RESET at any point! If the RESET button is wired to the reset pin of the 8251 (which it probably is), hitting reset will cause any previous initialisation to be thrown away. If you set up the 8251, then press RESET and expect it to work, you will be very disappointed…

I agree with Chuck that, in rare cases, some cr*p could be sent to the 8251 after reset but before you actually send it anything – although I would say that this is a design flaw that you are working around.

I am also confused by your initialisation sequence.

I am assuming here that you are starting your programs at address 0000.

0000 3E <MODE> ; LD A,<MODE>
0002 D3 03 ; OUT (3),A
0004 3E 37 ; LD A,<COMMAND1>
0006 D3 03 ; OUT (3),A
0008 3E 27 ; LD A,<COMMAND2>
000A D3 03 ; OUT (3),A
000C DB 03 ; WAITLP: IN A,(3)
000E E6 04 ; AND <TXEMPTY>
0010 CA 0C 00 ; JZ WAITLP (WAIT FOR TXEMPTY)
0013 3E 41 ; LD A,<ASCII_A>
0015 D3 02 ; OUT (2),A
0017 C3 0C 00 ; JMP WAITLP

(I am a Z80 nut so my mnemonics are in Z80-speak).

So, what does my program do?

<MODE> should be set up to suit your desired number of stop bits, parity, parity enable/disable and character length. Obviously, this value must be set up correctly to match your terminal. The key thing is that the last TWO BITS of <MODE> should be set to ‘10’ (binary). This sets the UART to asynchronous mode (rather than synchronous mode) and divides the externally-supplied baud-rate clock by 16. This is the usual default. If you change this value, then the indicated baud rates may not be as expected. So, to set up 2 stop bits, EVEN parity, parity enabled and 8 data bits, you would have a <MODE> value of ‘FE’.
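
Working that ‘FE’ out bit by bit (my breakdown of the 8251 datasheet’s asynchronous mode word):

FE hex = 1111 1110 binary
  D7-D6 = 11 : two stop bits
  D5    = 1  : even parity
  D4    = 1  : parity enabled
  D3-D2 = 11 : 8 data bits
  D1-D0 = 10 : asynchronous, divide baud-rate clock by 16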

<COMMAND1> sets Request To Send (RTS), clears the error flags, enables the receiver, sets Data Terminal ready (DTR) and enables the transmitter.

<COMMAND2> does the same but drops the reset of the error flags. Issuing two commands like this is the correct initialisation procedure, as it configures the 8251 and resets any errors. If you are writing a receive routine, then you should be looking for any parity, framing or overrun errors and take action accordingly.
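
For reference, the two command bytes decode as follows (again my breakdown of the datasheet’s command word):

<COMMAND1> = 37 hex = 0011 0111 : RTS=1, ER=1 (reset error flags), RxE=1, DTR=1, TxEN=1
<COMMAND2> = 27 hex = 0010 0111 : the same, but with ER=0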

I look at bit 2 (TxEMPTY, mask 04H) rather than bit 0 (TxRDY) in my test.

So, my hex code for the above code (loaded into location 0000) would be:

3E FE D3 03 3E 37 D3 03 3E 27 D3 03 DB 03 E6 04 CA 0C 00 3E 41 D3 02 C3 0C 00

The next thing is the hardware handshake lines. I generally interconnect the hardware handshake lines if they are not in use. On a 25-way D connector this would mean linking pins 4-5 (RTS [Request To Send] / CTS [Clear To Send]) and pins 6-8-20 (DSR [Data Set Ready] / CD [Carrier Detect] / DTR [Data Terminal Ready]). Do this at the SIO connector end. If the 8251 is expecting the Clear To Send signal before it transmits anything – and it isn’t actually wired up – then the 8251 will wait for a very long while! Linking 4-5 (RTS/CTS) and driving RTS in software (bit 5 of the COMMAND byte) should fool the 8251 into transmitting something.
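
As a sketch, those straps at the SIO’s 25-way connector look like this:

Pin 4  (RTS) ---+
Pin 5  (CTS) ---+     RTS looped back as CTS

Pin 6  (DSR) ---+
Pin 8  (CD)  ---+     DTR looped back as DSR and CD
Pin 20 (DTR) ---+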

Before wiring up the terminal, I would actually check that the terminal is happy with only TX, RX and GND being wired up. Strap just pins 2 and 3 together on the terminal and check that the characters you type on the keyboard actually appear on the display. If so, remove the 2-3 link and make sure the characters stop appearing when you type some more!

The other thing to check is to make sure that the transmit of the terminal is wired to the receive of the SIO and vice versa. Use an LED (with a series resistor) or a multimeter to find the ‘strong’ source of voltage. This should be the transmit line in each case. I use a green LED in inverse parallel with a red LED, and both LEDs in series with a 330 Ohm resistor. Connect one end to pin 7 (ground) and use the other end as a sensor lead. One or the other of the LEDs should light up when it is connected to the transmit line. If my description doesn’t make sense, see the sketch below.
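
Roughly (my sketch of the probe described above):

               330R       red
Pin 7 (GND) --/\/\/\--+--|<|--+-- sensor lead
                      |       |
                      +--|>|--+
                        green

One LED lights for each polarity, so whichever line keeps an LED lit steadily is being driven – that should be the transmit line.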

Don’t forget that I/O ports 2 and 3 are for SIO channel ‘A’ and not ‘B’ (i.e. is the terminal connected to the correct connector?).

After that, we are looking at either a major hardware fault or a major configuration fault. I assume you already know whether the I/O space between 00 and 0F is unused apart from the SIO board?

If you have a debug monitor on your S100 board that can input and output bytes to selected I/O ports then you can always use the debug monitor to output bytes manually rather than under program control.

Hope this is of some help to you.

Dave
 
Dave, the reason for the weird initialization sequence is a "just in case" dodge.

Suppose someone has spit a single byte to the 8251 command port that puts it into sync initialization. Well, to get back to command mode, you have to spit 2 more initialization bytes at it.

Now, suppose that somehow, two initialization bytes in sync mode have been issued. The 8251 is still waiting for the third byte before it enters command mode.

So you send out 2 bytes that can complete the sync initialization sequence without doing anything strange to the 8251. That means that if the 8251 needs only 1 byte, the second byte you send doesn't upset things. At the same time, both bytes, regardless of the 8251's state, are harmless if the 8251 is already in its initialization state.

As I said, a "Neanderthal" chip. The Signetics 2651 was a far superior device.
 
Sorry Chuck - perhaps I wasn't too clear in my post.

It is not the initialisation sequence I am confused about; it is the significance of the four bytes as presented (4E 40 4E 37).

The next byte sequence is a fully-constructed program which can be disassembled. The four bytes presented don't appear to mean anything on their own without any further context. So, if I assume that the four bytes are an initialisation program, when disassembled I get:

LD C,(HL)
LD B,B
LD C,(HL)
SCF

which is clearly garbage...

So, I must assume that the four bytes described are sent, in the order presented, to the 8251's control port. This does make logical sense when you look at what is trying to be achieved - with the issue that you have correctly observed about partial initialisation (with which I fully agree) and the standing "reset error flags" bit after the last byte is sent.

It is (however) an assumption on my part that the four bytes identified were actually sent to the 8251 control port. I would have liked to see the actual initialisation code fragment that was used, just to be sure.

Dave
 
UPDATE

@daver2, thanks for your post #10, very helpful and informative!

First, I rewrote my initialisation code for the 8251A as follows:

3E 4E ; MVI A,4EH "Dummy" mode byte
D3 03 ; OUT 03
3E 40 ; MVI A,40H "Dummy" command byte for reset
D3 03 ; OUT 03
3E 4E ; MVI A,4EH "Real" mode byte for 8N1 @ 9600
D3 03 ; OUT 03
3E 27 ; MVI A,27H "Real" command
D3 03 ; OUT 03

Still nothing!

OK, time for troubleshooting:

[Attachment: SIO1.jpg]

Let's see if it is properly enabled. Is it? Check... no!
The output of the NAND doesn't go low, but why?

[Attachment: SIO2.jpg]


IBM_User
 
UPDATE 2

I toggled in the test program from the manual (starting at 0000), then hit EXAMINE followed by single step. The read-back looks like this:

3E 4E
D3 03
37 3E
27 D3
03 27
DB 03
FF E6
02 CA
08 37
DB 02
FF D3
02 FF
C3 08
37

But that is not exactly what I toggled in.

From the manual:

3E CA ; CA changed to 4E for 8N1
D3 03
3E 27
D3 03
DB 03
E6 02
CA 08 37
DB 02
D3 02
C3 08 37

The memory board works fine; I don't think I have a memory issue.
 
What is wrong?! With single step the output goes low; with EXAMINE followed by RUN, nothing happens!
 
OK, the program, starting at 0000:

3E 4E
D3 03
3E 27
D3 03
DB 03
E6 02
CA 08 00
DB 02
D3 02
C3 08 00
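
For the record, in mnemonics (my annotation, 8080-speak this time, assuming it is loaded at 0000 as stated):

0000 3E 4E    ; MVI A,4EH - mode: async x16, 8 data bits, 1 stop bit, no parity
0002 D3 03    ; OUT 03H
0004 3E 27    ; MVI A,27H - command: RTS, RxE, DTR, TxEN
0006 D3 03    ; OUT 03H
0008 DB 03    ; IN 03H    - read status
000A E6 02    ; ANI 02H   - RxRDY set?
000C CA 08 00 ; JZ 0008H  - no, keep polling
000F DB 02    ; IN 02H    - read the received character
0011 D3 02    ; OUT 02H   - echo it back
0013 C3 08 00 ; JMP 0008H - wait for the next one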

Now the INP indicator comes on, but the 8251 is still not enabled.

The output of C8 does not go fully low: the "high" LED on the probe is bright and the "low" LED glows faintly.
The CS pin on the 8251 shows the same.

Why is that? When I single-step the program, the "high" LED goes off and the "low" LED comes on.
 
Remember that when you're running, the "chip enable" event is very short in comparison to all of the other stuff going on--so you will see a dim glow on the "LOW" LED. That's normal. I assume that your logic probe doesn't have a pulse detector (i.e. a separate LED that blinks in response to a pulse of any length). That makes things a bit harder, if that's the case.
 