PDA

View Full Version : Origin of timing jitter in output port signals 8088 computer.



Hugo Holden
February 10th, 2017, 04:16 PM
When writing programs to output data to a printer port on a vintage 5155 computer, I set up the simplest possible program loop: set the port data bits high, then low, then loop. The output waveform has a low-frequency timing irregularity (phase jitter) of perhaps a few hertz, and this jitter occurs at about the same interval regardless of how fast the program executes. For example, I have attached scope images of it running in the usual GW-Basic, which is very slow being an interpreted language, and running in compiled QuickBasic, which is much faster. But the phase irregularity (disturbance) in the signal timing is the same in both cases. Probably a video of the scope screen would have been better.

It is as though, at a low-frequency interval, the processor is being briefly interrupted from its simple program loop by an I/O device. Or could this be the consequence of electro-mechanical jitter from the hard drive? Does anyone know, or can guess at, the origin of this timing disturbance?

(This phase jitter would be of no consequence for data exchange etc., but it can matter for another application where the signals are put to a different use.)

gslick
February 10th, 2017, 04:52 PM
It is as though, at a low-frequency interval, the processor is being briefly interrupted from its simple program loop by an I/O device. Or could this be the consequence of electro-mechanical jitter from the hard drive? Does anyone know, or can guess at, the origin of this timing disturbance?


At a minimum a standard real mode DOS PC will have the timer interrupt handler executing in the background at an 18.2 Hz rate.
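For anyone wondering where the odd 18.2 figure comes from: the PIT input clock is 1,193,182 Hz, and the BIOS leaves channel 0 at its maximum divisor of 65536. A quick arithmetic sketch (Python, just illustrating those standard figures):

```python
# 8253 PIT channel 0 drives IRQ 0 (the timer interrupt) on the PC.
PIT_CLOCK_HZ = 1193182  # PIT input clock, ~1/4 of the 4.77 MHz CPU clock
DIVISOR = 65536         # a reload value of 0 counts as 65536 (the BIOS default)

tick_rate = PIT_CLOCK_HZ / DIVISOR
print(round(tick_rate, 4))  # ~18.2065 timer interrupts per second
```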

Hugo Holden
February 10th, 2017, 05:33 PM
At a minimum a standard real mode DOS PC will have the timer interrupt handler executing in the background at an 18.2 Hz rate.

That would explain it perfectly, thanks. Is it possible to disable that?

Chuck(G)
February 10th, 2017, 05:47 PM
Of course; several old-school games do just that. You have to keep in mind that the time-of-day clock will be affected, however. But if you have an independent RTC chip in your setup, you can always reset the time-of-day afterwards to correct for the loss.

Hugo Holden
February 10th, 2017, 06:26 PM
Of course; several old-school games do just that. You have to keep in mind that the time-of-day clock will be affected, however. But if you have an independent RTC chip in your setup, you can always reset the time-of-day afterwards to correct for the loss.

Chuck(G),

Thanks, I have an independent RTC.

How would I actually disable the DOS timer interrupt handler?

Chuck(G)
February 10th, 2017, 06:59 PM
The simplest way would be to mask out IRQ 0 in the 8259. You could do it thusly (if memory serves), reading the mask register so the other IRQ bits are preserved:

in al,21h
or al,1
out 21h,al

This is from a decaying memory, so you may want to check my work.

To re-enable, you'd do:

in al,21h
and al,0feh
out 21h,al

Scali
February 11th, 2017, 01:38 AM
Another source of jitter would be the memory refresh. You cannot disable this, except under very specific conditions (reading from memory will at least refresh that row of memory, so as long as you execute code, there will be some memory being read and thus refreshed, but it's very tricky).
On an original 8088 PC, this refresh is done via the PIT (counter channel 1) and DMA controller (channel 0). The PIT is programmed to trigger a byte read every 18 ticks (which would be 18*4 = 72 CPU cycles at 4.77 MHz). The byte will take 4 CPU cycles to read.
This means that your CPU will not be able to read/write memory for those 4 cycles, which may cause a slight delay in the fetching and/or execution of an instruction.
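To put a number on that: taking the figures above (18 PIT ticks per refresh, 4 CPU cycles per tick, 4 cycles per DMA read), the refresh steals only a few percent of bus time. A back-of-envelope sketch in Python:

```python
REFRESH_PERIOD_TICKS = 18  # PIT channel 1 default on the PC/XT
CYCLES_PER_TICK = 4        # CPU clock is 4x the PIT clock (4.77 MHz vs 1.193 MHz)
DMA_READ_CYCLES = 4        # cycles the refresh byte read holds the bus

period_cycles = REFRESH_PERIOD_TICKS * CYCLES_PER_TICK  # 72 CPU cycles
overhead = DMA_READ_CYCLES / period_cycles
print(period_cycles, f"{overhead:.1%}")  # 72 cycles per refresh, ~5.6% of the bus
```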

This is of course a far smaller amount of timing jitter, and for your purposes it may not be relevant.
But one 'workaround' could be to reprogram the memory refresh to a number of ticks that is in phase with the code you're trying to execute, so that the refresh cycles always occur in the same place in your code. E.g., for 8088 MPH we did this in two places. One is the mod player at the end of the demo. The other is the Kefrens bars effect, where the CRTC was being reprogrammed every 2 scanlines. By setting the refresh to 19 ticks instead of 18, you'd effectively get exactly 4 refreshes on every scanline, so the refreshes were now 'in sync' with the routine reprogramming the CRTC.
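The arithmetic behind the 19-tick trick can be sketched as follows, assuming the usual figure of 304 CPU cycles (76 PIT ticks) per CGA scanline at 4.77 MHz (that scanline figure is an assumption from CGA timing, not something from this thread):

```python
CPU_CYCLES_PER_SCANLINE = 304  # one CGA scanline at 4.77 MHz (assumed figure)
CYCLES_PER_TICK = 4            # CPU clock is 4x the PIT clock

ticks_per_scanline = CPU_CYCLES_PER_SCANLINE // CYCLES_PER_TICK  # 76 PIT ticks

# With a 19-tick refresh period, 76 / 19 = 4 refreshes land on every
# scanline with no remainder, so they hit the same spots each line:
print(ticks_per_scanline % 19, ticks_per_scanline // 19)  # 0 drift, 4 per line

# With the default 18-tick period, the refresh slips 4 ticks per scanline:
print(ticks_per_scanline % 18)  # 4
```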

A last source of jitter could be waitstates being inserted on the bus by certain hardware. The most obvious one being the video controller. The output circuit needs to periodically read a byte from the framebuffer to output it to the screen. If you try to access the framebuffer at the same time, the video card will put wait states on the bus to delay the CPU until the output circuit has completed its read.
These waitstates are very hard to predict.

Hugo Holden
February 11th, 2017, 03:08 AM
The simplest way would be to mask out IRQ 0 in the 8259. You could do it thusly (if memory serves):

in al,21h
or al,1
out 21h,al


Chuck, I tried this (in Debug) but it had no effect at all on the issue. I had to add an interrupt at the end (I tried int 3 and int 20) to let the program terminate; whether that sabotaged the masking of bit 0 in the 8259 I'm not sure.

Hugo Holden
February 11th, 2017, 03:13 AM
Scali,

Thank you for that information; it is a good example of the other tasks the processor has to attend to aside from executing the code. So it might not be a fixable issue. Where and how is the memory refresh re-programmed?

Scali
February 11th, 2017, 05:27 AM
Where and how is the memory refresh re-programmed?

The PIT channel 1 is initialized to single-byte rate generator (mode 2, binary).
To change the refresh interval, all you have to do is write a new countdown byte to port 41h (port 40h is for channel 0, 41h for channel 1, 42h for channel 2).
E.g., to set it to the default value of 18 ticks, you can just do this:
mov al, 18
out 41h, al
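As a sanity check on why 18 is the default: 18 ticks of the 1.193 MHz PIT clock is about 15.1 µs between row refreshes, which just fits the roughly 15.6 µs-per-row budget of DRAM of that era. A hedged back-of-envelope (Python; the "128 rows every 2 ms" DRAM spec is an assumption, not from this thread):

```python
PIT_CLOCK_HZ = 1193182
REFRESH_TICKS = 18

interval_us = REFRESH_TICKS / PIT_CLOCK_HZ * 1e6  # time between row refreshes
budget_us = 2000 / 128  # assumed spec: 128 rows every 2 ms (16K/64K-era DRAM)
print(round(interval_us, 2), round(budget_us, 2))  # ~15.09 us vs 15.62 us budget
```

Stretch the interval much beyond that budget and rows start decaying before they are refreshed, which is why too large a value crashes the machine with a parity error.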

Hugo Holden
February 11th, 2017, 04:54 PM
The PIT channel 1 is initialized to single-byte rate generator (mode 2, binary).
To change the refresh interval, all you have to do is write a new countdown byte to port 41h (port 40h is for channel 0, 41h for channel 1, 42h for channel 2).
Eg, to set it to the default value of 18 ticks, you can just do this:
mov al, 18
out 41h, al

Scali,

It does appear that the memory refresh is primarily responsible: when I reduced the value from 18 to 2, it slowed the disturbance right down to a burst of about 8 jitters every 500 ms or so. If I set it lower than 2, to 0 for example, it of course locks up the computer with a parity error.