
Thread: ASCII control code usage & syntax Votrax TNT

  1. #1


    I had a go at replicating the Votrax Type & Talk from the early 1980s. Using the photos of the PCB (posted on Kevin Horton's website) I duplicated the PCB. I am writing a formal article on how this was done using the photos alone. In any case the project has been successful. There is a YouTube link showing this PCB working:

    https://www.youtube.com/watch?v=aZCOq2-20ek

    It is interesting that Votrax wired their serial connector neither as a DTE nor a DCE, but a hybrid. However I have established two way communication with my 5155 computer. The idea being to download the translated phonemes to use with my Hero Jr Robots, which also use the SC-01A speech synthesizer IC.

    I have attached the photo of the replica pcb. (The .bin file for the ROM came from Kevin Horton's website, it was retrieved from the MROM in the original unit).

    The question though, is about software switches:

    I have attached the appropriate page from the Votrax user manual.

    So far I cannot get any of the software switches to operate. I am using the Procomm terminal emulator in chat mode and I get an echo from the Type & Talk unit (its default state). I'm OK on the hardware side of things, and the serial connection/handshaking is fine and connected properly.

    I have yet to find a way to successfully send the escape code and control code to activate the software switches. I have tried using the Xlat table and the keyboard macros in Procomm too. Procomm says the "|" is interpreted as the ESCAPE code, and that a caret "^" placed before a character sends the corresponding control code. But I cannot get this to work, despite trying multiple combinations of these with various delimiters.

    So I am badly in need of some help with the syntax (exactly what to type in Procomm to send the ESC and control codes) to operate the software switches. Just one example, say how to turn off the echo, would do to help me understand the syntax.
    Attached Images

  2. #2


    Since this question got no replies, I thought I would update the progress.

    I had to write a small utility to send the escape and control bytes to the Votrax unit. Basically it has to condition the registers in the 5155's async card (as can be seen from the address, I used COM2; it turned out COM1 in my computer has a receive-data problem), then take the ASCII values of two consecutive key presses, combine them into one hex byte, and send that byte out the serial port. I also sent the byte to the parallel port so I could put a logic probe on there to make sure the bytes were as they should be. (I have attached the .asm source file as a .txt in case anyone else could use it for a byte-sending job; it opens in Notepad.) It just cycles around on itself after two key entries, so it is easy to use; press q to quit. It is primitive but it works.
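
    For anyone following along, the key-to-byte step of that utility looks roughly like this in modern Python (just an illustration of the idea, not the attached assembly):

        def byte_from_keys(high_key, low_key):
            # Combine two ASCII hex-digit key presses, e.g. '1' then 'B',
            # into the single byte value 0x1B.
            return (int(high_key, 16) << 4) | int(low_key, 16)

        assert byte_from_keys("1", "B") == 0x1B   # ESC
        assert byte_from_keys("1", "4") == 0x14   # DC4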

    It does make me wonder though, when the manufacturers say "send the codes" what software they were thinking would do that at the time.

    Finally I was able to set the software switches in the replica Votrax unit by sending 1B 14 1B 11, to disable the typed-text echo and set the PSEND switch, which sends the translated phonemes back to the terminal (that was exciting). Photo attached of the phoneme string returned from the unit.
    (It sounds simple too, but when I first tried quitting Procomm to send the codes from DOS, it malfunctioned when Procomm restarted, so I had to shell out of Procomm to DOS and exit back into Procomm.)
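
    For reference, sending the same sequence from a modern machine would look roughly like this (a sketch only, assuming pyserial; the port name, baud rate and framing are assumptions and must match the unit's DIP-switch settings):

        import serial

        ESC = b"\x1b"

        with serial.Serial("COM2", 9600, timeout=1) as tnt:
            tnt.write(ESC + b"\x14")   # 1B 14: turn the typed-text echo off
            tnt.write(ESC + b"\x11")   # 1B 11: set the PSEND switch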

    The returned strings look like they are ASCII translations of the phoneme strings. Do these look familiar to anyone?

    The next job will be to translate these into a string suitable for sending to a Hero Jr. robot which also uses the SC-01A speech IC.
    Attached Images
    Attached Files

  3. #3


    While I didn't reply (because it's over my head), I am watching with interest.

    I have both a Votrax Type N Talk (TNT) Model 100 and a Hero-1. I've only ever sent straight text to the TNT from a Microsoft Surface using Tera Term through a USB-to-serial cable for playback, and have not needed to go much further. I also ran Scott Adams' Adventureland on an Apple IIe with the TNT and the voice works great. That's as far as I have gone. My next task is to try to run Scott Adams' Adventure on a VIC-20. I have not considered anything related to the Hero-1.

    Please continue to update as you go.

  4. #4


    Hi Snuci,

    I found that when I was programming the Hero Jr. robot to speak, it was very time-consuming to translate text into phoneme strings, and I thought it would be great if the text could just be typed and automatically converted to phoneme strings.

    Then I saw the Type N Talk unit, and on reading the manual found that it had a function where the translated typed text (the phoneme string) could be echoed or downloaded to the computer. But I couldn't find an actual Type N Talk unit, so I decided to make a replica based on the information on Kevin Horton's website, which was an interesting challenge.

    Probably most people using the Type N Talk in the past would have used it primarily in receive mode. That requires a null modem cable. But the odd thing is that this cable would not have worked for two-way communication with the computer or terminal, that is, to also send the phonemes back to the computer. It requires a custom cable, though for my PCB I altered the connections so a simple straight-through cable works.

    I have written a detailed article about the Type N Talk, which is almost finished; when it is, I will attach the link.

    The Type 'N Talk unit is a "baby computer", complete with a 6802 CPU, ROM, RAM and a UART (with an SC-01A speech chip added). So it is quite an impressive device, I think. The programming appears to be very clever.

    Hugo.

  5. #5


    Kind of bizarre they use negative acknowledge -- an error state -- and synchronous idle -- a timeout request -- for caps when there are the perfectly good shift-out and shift-in commands (0x0E and 0x0F respectively). Same for using the device control codes -- which are supposed to be used to select a different device in a serial chain for output -- for whatever PSEND is, and for echo. I would have thought those would be toggles and not direct commands.
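
    For reference, the standard ASCII values of the control codes being talked about here (plain ASCII, nothing TNT-specific):

        # Standard ASCII control code values (reference only).
        CONTROL_CODES = {
            "SO":  0x0E,  # shift out
            "SI":  0x0F,  # shift in
            "DC1": 0x11,  # device control 1
            "DC4": 0x14,  # device control 4
            "NAK": 0x15,  # negative acknowledge
            "SYN": 0x16,  # synchronous idle
            "ESC": 0x1B,  # escape
        }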

    Not too surprising though, data control back then was still the wild west. Fun information all-around!

    But then, I confuse the piss out of modern web developers by often using ASCII control codes in my REST APIs instead of JSON or the bandwidth-wasting XML.

  6. #6


    Originally Posted by deathshadow:
    Kind of bizarre they use negative acknowledge -- an error state -- and synchronous idle -- a timeout request -- for caps when there are the perfectly good shift-out and shift-in commands.
    Hi deathshadow, maybe you could help me interpret this returned data.

    The Type N Talk manual suggests that the returned phoneme strings could be useful for building up a phonetic word base. But looking at this echoed phoneme data, it hardly looks like it could be useful to a Votrax customer. I was expecting that the returned data would be in Votrax's phoneme symbol form, in which, for example, the word OPEN would be O1, P, I2, N.

    In Hero Jr. programming in Hex bytes, the word OPEN would correspond to 35 25 0A 0D.
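
    To show the direction of translation I am after, here is a tiny sketch of the kind of lookup table involved, filled in only with the OPEN values quoted above (the helper name is just for illustration):

        # Partial phoneme-symbol-to-byte table for the Hero Jr.
        HERO_JR_PHONEMES = {"O1": 0x35, "P": 0x25, "I2": 0x0A, "N": 0x0D}

        def encode(symbols):
            return bytes(HERO_JR_PHONEMES[s] for s in symbols)

        print(encode(["O1", "P", "I2", "N"]).hex(" "))   # 35 25 0a 0d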

    It seems like the start of any new string in the returned data is a C.

    Can you make any sense of it ?

  7. #7


    Given the limited memory, the commands are likely bytecoded in its own format or that of the device to which they are sent. They probably at one point had a cross-reference sheet to let you translate what it feeds back, but as you've noticed, copies of anything about this hardware are scarce.

    Much like what I did with the Akai EWI USB that didn't document a damned thing:

    http://www.ewiusb.com/sysex_page1

    I would monitor what's being sent into it and what's coming out, and just start throwing known values at the input to see what comes out, looking for a common pattern. I would be doing so in HEX, since Christmas knows if something like Procomm is doing strange stuff / obeying control codes. You might not be seeing the FULL output.

    It seems highly unlikely to me that they would send "O1" -- two bytes -- instead of one byte if they could get away with it, since I highly doubt they have more than 96 phonemes. That C is probably the 'start data' command. Is that trailing X always there? That could be EoD.

    Send it multiple lines of data, keeping it simple, one word at a time, then try two words, when you know what the corresponding phoneme output should be. It shouldn't take too long to reverse engineer the resultant output... though in that screencap you're having five spaces yet no character repeating itself five times, which makes me wonder if you've even got the protocol right. Almost looks like 8N1 output of 7E1 content.
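
    Something along these lines would do the monitoring on a modern machine (a sketch assuming pyserial; the port name, baud rate and framing are guesses and must match the unit's switches):

        import serial

        # Dump the raw reply in hex so the terminal program can't hide or
        # reinterpret any control bytes.
        with serial.Serial("COM2", 9600, timeout=1) as port:
            port.write(b"ready\r")        # send one known word
            reply = port.read(64)         # grab whatever comes back
            print(reply.hex(" "))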

  8. #8


    It looks like I have found the pattern.

    Every new line of text (that the unit speaks) starts with a C. Spaces come back in some cases as an i or a ~.

    So I used the single word "ready": the data returned was CkB@^i. Dropping the C and converting the remaining characters to hex gives:

    6B 42 40 5E 69

    In Votrax's Speech dictionary there is a phoneme code to phoneme symbol chart.

    The word "ready" has Phonemes that are:

    R, EH1, EH3, D, Y

    From their chart the corresponding hex values are:

    2B 02 00 1E 29. Lining these up beside the echoed ASCII converted to hex:
    6B 42 40 5E 69

    So the ASCII characters, converted to hex bytes, are just the phoneme codes with 40 hex added to them! So clearly I am going to have to write a program to remove that offset and then use a translation table to regenerate the phoneme symbols. Any idea why Votrax might have done this?
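
    A sketch of that correction step (my guess is that the 40 hex offset simply pushes the phoneme codes up into the printable ASCII range so they survive the serial link; the table below only contains the five codes for "ready" quoted above):

        # Strip the leading C, subtract 0x40 from each byte, then look the
        # result up in the phoneme code to phoneme symbol chart.
        PHONEME_SYMBOLS = {0x2B: "R", 0x02: "EH1", 0x00: "EH3", 0x1E: "D", 0x29: "Y"}

        def decode_tnt_echo(echoed):
            symbols = []
            for b in echoed:
                if b == ord("C"):              # start-of-line marker
                    continue
                code = b - 0x40                # undo the 40 hex offset
                symbols.append(PHONEME_SYMBOLS.get(code, "?%02X" % code))
            return symbols

        print(decode_tnt_echo(b"CkB@^i"))      # ['R', 'EH1', 'EH3', 'D', 'Y']
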
    Last edited by Hugo Holden; July 28th, 2018 at 03:11 AM.

  9. #9


    I just posted an article in the hardware section about the replicated Votrax TNT. Those returned codes are in a Diphthong format. It is good news that I can convert them to a Phoneme mnemonic form to help program the Hero Jr. Robot.
