
Thread: What are you guys using for retroprogramming?

  1. #71


    Quote Originally Posted by Eudimorphodon View Post
    Recently, when I was hacking together a couple of assembly-language boot-time fixes to run on my Tandy 1000, my positively dreadful workflow ended up being something like this (I have *no* experience with whatever editor comes with MASM, and it was driving me nuts that I couldn't have both DOS EDIT and a command line onscreen at the same time):

    1: Loaded up my Ethernet packet driver on the Tandy 1000
    2: Made edits to the source file on my MacBook Pro using BBEdit.
    3: Ran "nc -bin listen 1234 > testfile.asm" on the T1000, then "nc {ipaddress} 1234 < working_file.asm" in a terminal on the Mac to push it over.
    4: Assembled it with Microsoft MASM, and tested if it got that far.
    5: Wash, rinse, repeat.

    Not the most efficient thing in the world, but it did mean it was convenient to check the code into GitHub when I was finished. Using DOSBox (at least for the testing part) wouldn't have really worked since these widgets were intended to run from config.sys, and an emulator would have been about as awkward as the netcat thing.
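The netcat push above can be sketched in Python, for anyone curious what's happening under the hood (a minimal sketch with hypothetical function names and localhost addresses; in the real workflow the listener runs on the DOS box and the sender on the modern machine):

```python
import socket

def receive_file(path, port=1234):
    """Listener side -- the role 'nc -bin listen 1234 > testfile.asm' plays."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _addr = srv.accept()
    with open(path, "wb") as f:
        while True:
            chunk = conn.recv(4096)
            if not chunk:          # sender closed the connection: transfer done
                break
            f.write(chunk)
    conn.close()
    srv.close()

def send_file(path, host="127.0.0.1", port=1234):
    """Sender side -- the role 'nc {ipaddress} 1234 < working_file.asm' plays."""
    s = socket.create_connection((host, port))
    with open(path, "rb") as f:
        s.sendall(f.read())
    s.close()
```

Like nc, there's no framing or checksum: closing the connection is the only end-of-file signal, which is fine for a one-shot source push.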

    I've actually gotten a fair amount of use out of having the Tandy 1000 sitting there next to my "real" computer as I've been working on a project to build a video card targeted at 8-bit applications like S-100 machines. I've been generating test bitmaps to burn into Flash ROMs, and several times I've used nc to copy them over to the Tandy and then written a trivial BASIC program there to load the contents of the .bin intended for ROM and poke it into a graphics screen to make sure I get what I expect.
    I've done similar things, but with the Old Box running terminal software and logged into a linux box with getty running on a serial port. Then copy files back and forth with rz and sz.
    -- Lee
    If you get super-bored, try muh crappy YouTube channel: Old Computer Fun!
    Looking to Buy/Trade For (non-working is fine): Tandy 1000 EX/HX power supply, PS/2 Model 25-286 ISA expansion riser, Mac IIci hard drive sled and one bottom rubber foot, Multisync VGA CRTs, Decent NuBus video card, PC-era Tandy stuff, Aesthetic Old Serial Terminals (HP and Data General in particular)

  2. #72


    Quote Originally Posted by Alabamarebel1861 View Post
    Reading some of this thread is hard. The idea of using modern hardware/software at all already gives me chills but to use it for doing something that an older computer is BUILT TO DO?!?!? I couldn't. My Windows 2000 PC is my friend in all computer things I do. I use Windows 7 rarely for some stupid google "apps" but that's it.
    Back with Windows 2000 and Windows 7, machines were crossing the threshold of "fast enough": enough CPU, enough memory, and a fast enough disk that the computer typically was not in the way of your work. Obviously monster CPU tasks like video encoding, graphics rendering, etc. are the exception.

    Quote Originally Posted by Trixter View Post
    One of my projects that is geared towards older systems is 50,000+ lines of code. It takes my older system 3+ minutes to compile it, and that's with smart compiling/linking. My modern Windows 10 system running an emulator takes 2 seconds to make the entire project. So yeah, I'm going to develop the majority of it on the modern system, and only develop on the older system when it's absolutely necessary (like a speed-sensitive or hardware-unique section).
    Folks don't appreciate how slow the older systems were, and we are absolutely jaded by speed. We're also jaded by modern tooling and "infinite" RAM. In hindsight, it's remarkable that software got written at all.

    As I strive to get my SB180 up and running, all of my work has been done on a Z80 simulator running CP/M. I have the simulator cranked down to a simulated 4 MHz, but that doesn't really help with the I/O.

    I'm using Turbo Pascal, and when working on large programs, with multiple includes, and writing to disk, it takes a minute or two to do a turnaround. And we're talking < 2000 lines of code. I wish I could get Turbo Pascal 4 for CP/M; I'd really like something like UNITs. I may eventually switch to something like C, I dunno. And as an OS for development, CP/M is pretty stark. Makes things a little bit interesting. Certainly brings to light the things we take for granted today.

    My SB180 just has floppies, so things will inevitably be even slower once I get it running and transition over to it, but it also has 256K of RAM, a chunk of which can be used as a RAM disk. That will help with intermediate files.

  3. #73


    Quote Originally Posted by Eudimorphodon View Post
    I couldn't have both DOS EDIT and a command line onscreen at the same time:

    1: Loaded up my Ethernet packet driver on the Tandy 1000
    ...
    Your idea of using another machine is good, but I wonder whether it would be easier if the other machine were also a DOS machine, assuming you have a network. Maybe set up a DOS-native shared drive and run your editor and compiler on one machine, and the command prompt on the machine you're testing on. That would have been nice, but of course I never looked into that route when I was learning programming on DOS back in the day, and I've moved on since then.

    As a bonus, it would be nice if you could also use the second machine, perhaps through some kind of console, to single-step a program on the executing machine, in case you can't really use a second monitor.

  4. #74


    Quote Originally Posted by PgrAm View Post

    What hardware are you targeting?
    What kind of compiler/interpreter software do you use?
    Do you program on old-school systems themselves or do you use a modern editor?
    And (hopefully without starting a flame war) what language(s)?

    Right now, I'm writing for an old AST 286 PC machine.
    I use the OpenWatcom Compiler.
    I use a modern PC with Visual Studio as my editor.
    And I use C++ and some 8086 assembly, with a dash of a custom scripting language.

    How about you guys?
    • Target - 16bit PC
    • Open Watcom and wmake
    • 486SX (Toshiba fanless laptop - it is essential) - Watcom VI editor
    • C++ and some assembly

  5. #75

    Quote Originally Posted by the3dfxdude View Post
    Your idea of using another machine is good, but I wonder maybe it would be easier if the other machine was also a DOS machine, if you have a network.
    Yeah. One of these days I'd like to set it up so I can NFS- or ETHERDFS-mount a working directory shared on the modern machine, so I can skip the "virtual sneakernet" part with nc. It's just sheer inertia standing in the way right there. (I'd definitely try to get over that hump if I were doing this "a lot".)

    (Having the other machine be a DOS machine isn't really a requirement, as long as the editor there can be polite about the differences in text end-of-line markers between DOS/Unix/whatever.)
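On the end-of-line point, the normalization a polite editor or transfer script has to do is tiny; a minimal sketch (hypothetical helper names):

```python
def to_unix(text: str) -> str:
    """Normalize any mix of line endings (CRLF, lone CR, LF) to Unix LF."""
    return text.replace("\r\n", "\n").replace("\r", "\n")

def to_dos(text: str) -> str:
    """Normalize any mix of line endings to DOS CRLF."""
    # Collapse everything to LF first so existing CRLFs aren't doubled.
    return to_unix(text).replace("\n", "\r\n")
```

The only real trap is converting an already-CRLF file a second time, which is why the DOS direction collapses to LF first.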

    Edit: I guess another example of "retro-development" that I did years ago is when I wrote the first creaky version of "PETTESTER" to help resurrect a badly brain-damaged Commodore 2001. That involved developing an alternate ROM image for the machine, so the tools I used were:

    * xa65, a portable 6502 cross assembler.
    * VICE, the Versatile Commodore Emulator.
    * Various random online 6502 assembly language educational simulators, which I used to learn just enough 6502 assembly to take a crack at the problem.

    VICE was *pretty easy* to convince to use alternate ROM images, so all I needed to do was compile the code with xa65 with the correct ORG, pad the generated code with enough zeros to fill out the 4K ROM image, and then hack the last few bytes of the image with the correct jump address to redirect execution per the 6502's startup routine.
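The pad-and-patch step can be sketched like this (a hypothetical helper, assuming the 4K image is mapped at $F000 so the 6502 reset vector at $FFFC/$FFFD lands in the image's last bytes; adjust the base for your machine):

```python
def make_4k_rom(code: bytes, entry: int, base: int = 0xF000) -> bytes:
    """Pad assembled code to a 4K image and patch the 6502 reset vector.

    `code`  - raw assembler output, already ORG'd at `base`
    `entry` - address execution should start at after reset
    `base`  - address the ROM is mapped at (assumed $F000 here)
    """
    ROM_SIZE = 0x1000  # 4K, e.g. a 2532 EPROM
    if len(code) > ROM_SIZE:
        raise ValueError("code does not fit in a 4K ROM")
    # Pad the generated code with zeros to fill out the image.
    image = bytearray(code) + bytearray(ROM_SIZE - len(code))
    # The 6502 fetches its reset vector from $FFFC (low byte) / $FFFD (high).
    lo = 0xFFFC - base
    image[lo] = entry & 0xFF
    image[lo + 1] = (entry >> 8) & 0xFF
    return bytes(image)
```

With a base of $F000 the vector ends up at offsets $FFC/$FFD of the image, which is exactly the "hack the last few bytes" step described above.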

    It was still pretty surprising when the resulting image worked when burned into an actual 2532 EPROM, of course...
    Last edited by Eudimorphodon; October 25th, 2020 at 11:51 AM.
    My Retro-computing YouTube Channel (updates... eventually?): Paleozoic PCs

  6. #76


    Quote Originally Posted by Trixter View Post
    I use Turbo Pascal 7 as my Pascal and Assembler IDE. You can assemble from the IDE, and if there are any errors returned, the IDE will let you jump to the line of each error. The only drawback is that TP7's IDE won't syntax-highlight assembler files, so if I want that and I know I'll be writing code for a while, I use the Aurora editor. (Aurora can also be configured to shell+assemble+jump to errors, but I found it convoluted, so I haven't set that up yet.)
    This is good news! I used Turbo Pascal a lot back then, and I'm still using it now for Pascal coding. I've tried Borland's Brief for assembler and other languages, with mixed feelings: on one hand it's faster than Turbo Pascal; on the other, configuring compile-and-jump-to-error is clumsy and inflexible. You have to set an environment variable for the compile command, which you cannot change from within the editor. I will look at Turbo Pascal's support for TASM assembly, and give Aurora a try (I vaguely recall that I may have used it at some point).

    Editing and cross-compiling on a modern machine is definitely faster and more convenient, but I prefer to do everything on the real machine. I rarely work on big programs, though.

  7. #77


    Quote Originally Posted by Alabamarebel1861 View Post
    Reading some of this thread is hard. The idea of using modern hardware/software at all already gives me chills but to use it for doing something that an older computer is BUILT TO DO?!?!? I couldn't. My Windows 2000 PC is my friend in all computer things I do. I use Windows 7 rarely for some stupid google "apps" but that's it.
    I have no problem at all with that. Back in the old days, it wasn't uncommon to develop games on a higher-end system than the target platform. For example, the game studios that could afford it used the then ultra-expensive 386 and 486 computers, just to make life easier, as they could compile the code several times faster than the target machines could. They could also use local area networks to connect different computers, so one computer could compile the code while another tested it. If something went wrong, only the target machine would crash, which could save a lot of time.

    They could also take advantage of more user-friendly (and time-saving...) graphical environments such as Windows/386 or 3.0, or, later, even NeXT machines, as id Software did while making Doom. id Software then moved from their NeXT computers to Windows NT for subsequent projects, even though those still targeted DOS. Just as we do now with DOSBox and PCem, they could have several independent DOS instances in windows. In my opinion, Windows 10 is not all that different from Windows NT 4.0. In fact, I think the program I use for graphics, Macromedia Fireworks 8.0, still works on NT 4.0.

    For example, my project is meant to be playable on the average home PCs that were already in homes, or could be bought, in the fourth quarter of 1990. They weren't 386s or 486s, simply because those were very, very expensive, so the average PC owner (as my father and I were at the time) could afford an 8088, an 8086 or, in a few cases, a 286.

    To develop my project on the real hardware of the time, I would need two 386 machines for developing and compiling, and an 8088 or 8086 machine for testing, equipped with a few graphics cards or, even better, with a card able to emulate several graphics cards, such as the Oak VGA OTI-037 that I had at the time. That makes three big computers with their heavy and bulky CRT monitors, plus a composite monitor or an NTSC television. I would also need to set up a LAN, with a hub, network cards and those RJ cables running around my house (my wife would throw me out of the house, LOL). Yes, I could do all the development on one machine, but it would be desperately slow: slow compiling, and slow testing, as I would have to exit the development environment (Turbo C++ for my project) because there wouldn't be enough memory left, run the result, load TC again, and so on...

    Now I can have all the advantages of having all those development computers, plus some extras, in a small and convenient space.

  8. #78

    Real-world example: Sierra's port of Silpheed (1989) was programmed on an 8MHz 286 with EGA, even though more than half of Sierra's target audience were still using 8086/8088 systems at the time of development. It was common, when possible, to use the fastest/biggest machine possible just to speed development.
    Offering a bounty for:
    - A working Sanyo MBC-775 or Logabax 1600
    - Music Construction Set, IBM Music Feature edition (has red sticker on front stating IBM Music Feature)

  9. #79

    Quote Originally Posted by Trixter View Post
    Real-world example: Sierra's port of Silpheed (1989) was programmed on an 8MHz 286 with EGA, even though more than half of Sierra's target audience were still using 8086/8088 systems at the time of development. It was common, when possible, to use the fastest/biggest machine possible just to speed development.
    Compiling code took a while back then, so you needed the fastest thing you could afford while programming, and probably more RAM than most desktop machines had as well.
    What I collect: 68K/Early PPC Mac, DOS/Win 3.1 era machines, Amiga/ST, C64/128
    Nubus/ISA/VLB/MCA/EISA cards of all types
    Boxed apps and games for the above systems
    Analog video capture cards/software and complete systems

  10. #80


    Just for anyone who may be interested in VSCode: I stopped using it about a week ago. The main reason was the more and more frequent blue screens of death. That's reason enough to stop using an application; it's so frustrating, so damaging, that I don't think it requires deeper analysis. The other reason is that, even when the application didn't crash my entire system (and believe me, working with fingers crossed all the time isn't pleasant...), it was still very heavy and slow to load.

    The good news is that while looking for an alternative, I found a VSCode fork named VSCodium. I learnt that VSCode's source code is open source, but the binaries distributed by Microsoft aren't. Microsoft includes several default extensions, not all of them open source and not all of them necessary, and it activates telemetry by default. VSCodium is a fork built from the official VSCode repository, but with telemetry disabled by default and free of the Microsoft extras.

    Bottom line: now I'm very happy with VSCodium. It's still slower to load than Notepad++, but it's way faster than the official VSCode, loading in a reasonable time now. And, most important of all for me, it has never crashed in all the time I've been using it. My guess is that the culprit behind the blue screens and extremely slow loading was the numerous unnecessary preloaded default extensions of VSCode, or maybe the telemetry, or both, I don't know.
