Testimonies of using BASIC back in the day



dmemphis
January 29th, 2017, 02:15 PM
http://softwareengineering.stackexchange.com/questions/149457/was-classical-basic-ever-used-for-commercial-software-development-and-if-so-ho

tezza
January 29th, 2017, 02:47 PM
Interesting reading the comments under that link. Thanks for posting.

I was a dab hand at BASIC in the day (1982-1988). At that time it was generally sneered at by those in the computer science community for being unstructured, inelegant and allowing spaghetti code...a "toy" language. There is some truth in this, but I can't help thinking there was snobbery also. There were many variants of BASIC, but even with 8k BASIC you could write understandable code. It was a matter of being disciplined and commenting extensively (where RAM allowed it). Once we moved past the basic (8-12k) BASICs into GW-BASIC and QuickBASIC, structural elements were there (e.g. WHILE...WEND etc.). You also had enough RAM to comment well.
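To show the kind of structure those later dialects gave you, here is a trivial GW-BASIC-style loop (just a throwaway sketch, not anything from my old code):

10 N = 5
20 WHILE N > 0
30 PRINT "T MINUS"; N
40 N = N - 1
50 WEND
60 PRINT "LIFTOFF"

In the 8k BASICs the same thing needed an IF...GOTO at the top or bottom of the loop, which is exactly where the spaghetti crept in.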

I wrote some highly useful in-house programs for my work that did the job exactly as they should, and weren't hard to maintain. It was an Everyman's language.

BASIC I salute you!

Tez

Chuck(G)
January 29th, 2017, 03:59 PM
Remember CBASIC? Lots of commercial software written in that.

(Modesty forbids) designed and wrote a multiuser BASIC for serious business applications that was still in use up until a few years ago. Initially, it started out by implementing the MCBA small business application suite, but was eventually turned toward tasks such as word processing. It was blazing fast; faster than even Microsoft compiled BASIC, even though it was interpreted P-Code.

Forbidden64
January 29th, 2017, 06:27 PM
I started when I was a wee kid typing in BASIC on the 64 with my brothers. I was only 3 when we got it...sadly I was only around 12 when it died.
Later though! I found QBasic 4.5! I had a ton of fun with that, and still have a lot of my old software hanging around somewhere. The language was still in use by some pretty serious people making some amazing software. It was mostly hobby stuff until DirectQB came out. Then the language had a bright flash, and then died around 2005, 2006...mainly because, I think, DirectQB took away some of its hobby charm and perceived approachability to beginners, and vastly increased the disparity between new hobbyists and seasoned programmers. The QB RPGs are still a great play, and there is one site that is still up with all the tutorials, releases, software reviews, and great links. I have very fond memories of Commodore BASIC (which I still use), and QBasic, which I actually made an inventory system for my garage with a few years ago! The entire thing fit onto one 3.5" disk, on a stripped-down computer with no hard drive, booting MS-DOS 5. It even has a screen saver!

dmemphis
January 29th, 2017, 06:36 PM
I agree that BASIC was very useful despite its limitations.
The small interpreted BASICs, the Microsoft variants I'm most familiar with, were
hampered by variable names with only two significant characters and unparameterized
GOSUBs. But I still learned a lot with them, especially on the C64! :)
I didn't get to use the better BASICs due to having to move on to C in those days.
Still fond memories.

AlexC
January 29th, 2017, 06:59 PM
Snobbery certainly played a part. I started on the BBC Micro in the early 1980s so I'm obviously biased, but the direct, simple access to the hardware was great (as, if I'd ever mastered it, was the embedded assembler). I've gone on to write lots of code in other languages and I still come across sneering about BASIC from people who think programming started with JavaScript or Python.

BASIC was simple, logical and fast (enough). I wrote a QuickBASIC program to control the analytical stage of a dark matter detector in the early 1990s. It was more than fast enough to handle the incoming data stream. I even included an 'Easter egg' in the form of a space invaders clone. Don't think my supervisor ever noticed.

A couple of years ago I wrote a simple GW-BASIC program to help my kids learn their times tables. The structure was logical enough that with a bit of help they could work out what each line did. I really don't think that's the case with most other languages.

KC9UDX
January 29th, 2017, 07:09 PM
You want snobbery? Try being a Pascal programmer. All kinds of C snobs always preaching the supposed superiority of C.

I never found the limitations of even the interpreted BASICs terribly limiting. In fact, I found the more advanced BASICs, with their lack of direct and indirect memory addressing, more limiting.

Sure, BASICs, especially the interpreted ones, were terribly slow and memory hungry, but they still get the job done!


https://youtu.be/iPB0EdPrYAQ

Mind you, for all the things I've accomplished with it over the decades, I still don't like BASIC.

dmemphis
January 29th, 2017, 09:18 PM
Here is one more piece I came across recently that was a good read too.
http://www.nicolasbize.com/blog/30-years-later-qbasic-is-still-the-best/

Plasma
January 29th, 2017, 11:21 PM
It was mostly hobby stuff until DirectQB came out. Then the language had a bright flash, and then died around 2005, 2006...mainly because, I think, DirectQB took away some of its hobby charm and perceived approachability to beginners, and vastly increased the disparity between new hobbyists and seasoned programmers.

I really don't think DirectQB had anything to do with the "demise" of QuickBASIC. It was just another assembly lib. If anything DQB helped level the field between the veterans and beginners due to its ease of use.

I think the main cause was declining DOS compatibility in Windows. XP had poor sound card support and timing/CPU usage issues, Vista had no support for DOS graphics, and 64-bit versions of Windows couldn't run DOS programs at all. There were hacks and workarounds but it was too big of a hassle for most people. As FreeBASIC started maturing more people moved to that; then QB64 came out and the rest moved to that.

Scali
January 30th, 2017, 01:48 AM
I don't recall a lot of commercial/professional software being written in BASIC back in my early C64/MS-DOS days.
However, in Windows, Visual Basic was a very popular option, and quite a few well known programs were developed in VB.
A lot of companies also used Visual Basic for Applications as an advanced scripting language for adding functionality to their Office documents and such. You'd be surprised how advanced and mission-critical some of that stuff was/is.

But yes, for me too, BASIC was the first contact I had with a computer. My first computer was a ZX81, which started in BASIC, like so many home computers of the day (call me crazy, but what I love about the original IBM PCs is that they can boot directly into BASIC, unlike any clone).
My second computer was a Commodore 64, and it worked exactly the same way. My neighbour had an Atari 8-bit machine, and again, same thing. Friend with an MSX? Same, goes directly to BASIC.
It wasn't until later when I started using MS-DOS and Amiga systems that I even realized a computer doesn't necessarily have to boot into BASIC.
To me it felt somewhat limiting, because I had been doing some simple programming on the ZX81 and C64 from time to time, and now I had computers where I had to load a program first, before I could begin programming.

KC9UDX
January 30th, 2017, 02:18 AM
I don't recall a lot of commercial/professional software being written in BASIC back in my early C64/MS-DOS days.

I sure do. At least, on the Apple ][ and C64, much commercial software that I had was either published in BASIC, or a combination of BASIC and machine language. Most was cleverly obfuscated so that it wasn't obvious.

I did that with a lot of my software, too. The first line (10) of a BASIC program would call a machine language location, where there was a $60 (RTS) waiting. The second line (20) was a REM with a copyright notice and a bad character that would halt the listing. The next line would have an unusual line number, like 1053, where another REM with a bad character was waiting. The program content would start in BASIC at 1054, and the first order of business was to display a splash screen whilst loading the necessary ML code which would be called occasionally.
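From memory, the skeleton looked something like this (not an actual listing; 679 is just a spare page-two location I'm using for the example, I POKE the RTS there myself, and the 'bad' PETSCII byte in the REMs can't be shown in plain text, so it's only described):

10 POKE 679,96: SYS 679
20 REM (C) 1986 -- A BAD PETSCII BYTE EMBEDDED HERE ABORTS THE LISTING
1053 REM ANOTHER DECOY REM WITH A BAD BYTE
1054 PRINT "LOADING...": REM SPLASH SCREEN, THEN SYS TO THE ML ROUTINES LOADED ALONGSIDE

Anyone casually LISTing it saw line 10, part of the copyright notice, and nothing useful after that.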

Avia
January 30th, 2017, 02:38 AM
"You want snobbery? Try being a Pascal programmer. All kinds of C snobs always preaching the supposed superiority of C."

I'll second this one. Like BASIC, I think Pascal became stereotyped as a "beginner's language" as it was pervasive in the educational system for teaching structured programming. I wrote several applications using Borland's Turbo Pascal which at the time had a very well supported library of tool kits. I really enjoyed Pascal and never struggled with any of its (very few) limitations.

Scali
January 30th, 2017, 03:13 AM
"You want snobbery? Try being a Pascal programmer. All kinds of C snobs always preaching the supposed superiority of C."

I'll second this one. Like BASIC, I think Pascal became stereotyped as a "beginner's language" as it was pervasive in the educational system for teaching structured programming. I wrote several applications using Borland's Turbo Pascal which at the time had a very well supported library of tool kits. I really enjoyed Pascal and never struggled with any of its (very few) limitations.

In the demoscene, Turbo Pascal was actually quite a popular language in the DOS days. Probably because it had very fast compile times even on modest machines, and it had good integration with the inline assembler and TASM.
Even the legendary Second Reality contains some Turbo Pascal code: https://github.com/mtuomi/SecondReality

I wouldn't be surprised if it was also popular in game development for the same reasons.

krebizfan
January 30th, 2017, 04:53 AM
Business Basic had a surprising hold on the accounting market. The company I worked for in the early 90s was still upgrading some very old Business Basic applications which had been ported to MS-DOS. Fortunately for me, no one there recognized that the Wang Basic I used a decade earlier was related to the Business Basic they were using.

essay
March 11th, 2017, 05:38 AM
Remember CBASIC? Lots of commercial software written in that.

Rick Ethridge
March 11th, 2017, 12:59 PM
I have QB 4.5 on my Tandy 1000 SL. It runs pretty fast on a V30 and an 8087!

Chuck(G)
March 11th, 2017, 01:12 PM
I don't recall a lot of commercial/professional software being written in BASIC back in my early C64/MS-DOS days.

The commercial use of BASIC almost pre-dates microcomputers. Consider, for example, MCBA (http://www.mcba.com/), founded in 1974 running business applications on DG, DEC, HP and TI minis (MCBA = "minicomputer business applications"). All in BASIC. I have in my library many thousands of lines of their source code. Probably beats those written in DIBOL in terms of quantity.

There may be earlier examples of major commercial use of BASIC.

krebizfan
March 11th, 2017, 02:01 PM
Chart Master for the Apple II and then the IBM PC was probably the major piece of micro software publicly acknowledged to have been written in BASIC. I think there was other software written in BASIC for other micros, but all I can recall right now is Scriptor (a type-in word processor) and some games renowned for their mediocrity.

KC9UDX
March 11th, 2017, 03:47 PM
Many serious applications for the C64 (I know many of you reading are snickering, but it's true despite the silly "toy" status assigned to these machines by those who didn't have them) were coded in BASIC with parts in assembly where speed was critical.

Chuck(G)
March 11th, 2017, 04:10 PM
BASIC on the Apple II was used for crypto work--I still have a copy of Mike Lauder's "Prime Factor Basic" for the Apple. It includes such things as modular math on very large strings.

At Durango, almost every application was written in BASIC, aside from the OS and the BASIC compiler itself. Even multi-user word processing.

geneb
March 14th, 2017, 06:52 AM
In Pick, "Data BASIC" was/is the core programming language. Very popular for ERP and accounting systems back in the day and still hanging on today - my work in Pick pays my mortgage. :)

g.

Unknown_K
March 14th, 2017, 07:25 AM
However, in Windows, Visual Basic was a very popular option, and quite a few well known programs were developed in VB.
A lot of companies also used Visual Basic for Applications as an advanced scripting language for adding functionality to their Office documents and such. You'd be surprised how advanced and mission-critical some of that stuff was/is.

It is kind of shocking how much you can do with a spreadsheet and Visual Basic for Applications. I think quite a few companies made use of that software to get around hiring external programmers.

Anybody use Visual Basic for DOS?

krebizfan
March 14th, 2017, 10:16 AM
I don't remember anyone using Visual Basic for DOS. Most of the VB code I was familiar with was business front ends to databases which were very difficult to build with VB-Dos because of DOS's memory limits.

Chuck(G)
March 14th, 2017, 11:01 AM
In Pick, "Data BASIC" was/is the core programming language. Very popular for ERP and accounting systems back in the day and still hanging on today - my work in Pick pays my mortgage. :)g.

The Microdata Reality-based Pick system was very popular in the insurance industry back in the day. I tend to think of the PICK system more as a file system which used a dialect of BASIC to access it--ISTR that very early versions used something that resembled PL/I more than anything.

geneb
March 14th, 2017, 11:30 AM
I use D3 Pick, which is a direct descendant of Pick AP/R84, etc. Rocket Software now owns the product and they produce a number of MV db products - UniVerse, mvBase, mvEnterprise. Northgate Information Systems still sells Reality - it runs under Linux (and likely Windows) now.

Pick was NoSQL before it was cool. (hell, Pick was NoSQL *before* SQL! :D )

g.

GeoffB17
May 7th, 2017, 01:46 PM
My first computer, about 1982, was an Epson HX-20, which had a fair BASIC built in, and a year later I got the add-on TF-20 disk drive, which was well supported by a BASIC extension. Did quite a bit with that. Still have everything, although just now the HX and the TF are not communicating (but hopefully that's fixable).

On the back of that, I became my work's computer expert. Local Council Planning Dept. When the whole council got an IBM System/36 the system did NOT have anything for us, so I used the /36 BASIC interpreter to create some software, which (I think) works pretty well. That system did have some extras for structure; I was especially impressed with an 'Input Fields' construct which allowed screens to be processed as a lump, both for input and display.

As others have said, you can write bad code, and good code, with just about any language. Any BASIC is really what you make of it.

I worked on some CBASIC code that was handling all the Accounts functions for the State of Maryland Courts system. Was that 'mission critical'?

Since my BASIC days I went into various combinations of dBASE/Clipper/C, but I still use BASIC from time to time. If I need a quick and easy prog to manipulate a file, then QB is as good as anything. And it's pretty fast on modern machines.

I've played with FreeBasic, and it's pretty impressive, and it's still BASIC as I've always known it. Handy for doing maintenance/conversion things with MySQL databases.

Remember the old saying - 'A Bad Workman Blames His Tools'!

Geoff

Chuck(G)
May 7th, 2017, 03:01 PM
Well, if you paid attention to John Kemeny (deceased) and Thomas Kurtz (still kicking, the last I checked), the only "real" BASIC is TrueBASIC (http://truebasic.com/).

The others are illegitimate pretenders. In some respects, I have to agree that Visual BASIC is so far distanced from Dartmouth BASIC, that it isn't BASIC at all...

Rick Ethridge
May 7th, 2017, 07:49 PM
How 'bout ZBasic?

KC9UDX
May 7th, 2017, 09:13 PM
Well, if you paid attention to John Kemeny(deceased) and Thomas Kurtz (still kicking, the last I checked), the only "real" BASIC is TrueBASIC (http://truebasic.com/).

The others are illegitimate pretenders. In some respects, I have to agree that Visual BASIC is so far distanced from Dartmouth BASIC, that it isn't BASIC at all...

But wasn't Kemeny the one who said that BASIC should never be limited to one operating system, and isn't True BASIC now limited to running only on Windows? And is the rumour true that True BASIC disables multitasking in Windows, when the very language promoted multitasking at Dartmouth?

Trixter
May 7th, 2017, 09:30 PM
http://softwareengineering.stackexchange.com/questions/149457/was-classical-basic-ever-used-for-commercial-software-development-and-if-so-ho

I'm mildly surprised nobody mentioned this:


https://youtu.be/seM9SqTsRG4

Disclaimers: I'm in it, not at my best. Also, it's watered down for a more general audience. Also, it's very C64-centric. But it's "testimonies of using BASIC back in the day" incarnate, so it's on-topic.

KC9UDX
May 7th, 2017, 09:32 PM
Here's another way BASIC is useful:

http://www.youtube.com/playlist?list=PLGLyuIfnYGgLshBnflXhYqHuAm2QYcDrp
I used to make a living designing and building industrial controllers based on these things.

bear
May 7th, 2017, 09:42 PM
Tangentially related, some months ago I started a benchmarking project for 8-bit home micro BASIC systems, calculating prime numbers by trial division (the goal was not to have an efficient way of finding prime numbers, but to have some consistent unit of work that could be expressed simply and therefore be implementable regardless of limitations in any particular system's BASIC).
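For anyone curious, the unit of work is nothing more exotic than this (a minimal sketch in generic line-numbered BASIC, not my actual benchmark code; the D <= L test is there because some dialects run a FOR body at least once even when the start value is already past the limit):

100 REM COUNT PRIMES BELOW 1000 BY TRIAL DIVISION
110 C = 1 : REM 2 IS PRIME, SO TEST ODD NUMBERS ONLY
120 FOR N = 3 TO 999 STEP 2
130 P = 1 : L = SQR(N)
140 FOR D = 3 TO L STEP 2
150 IF D <= L AND N / D = INT(N / D) THEN P = 0 : D = L
160 NEXT D
170 IF P = 1 THEN C = C + 1
180 NEXT N
190 PRINT C ; "PRIMES BELOW 1000"

It should print 168; the interesting number is how long each machine takes to get there.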

At some point I thought I'd try my hand at an APL version to compare the speeds of BASIC and APL in my IBM 5100, and the project started spiraling out of control from there.

ZBasic, Dartmouth True BASIC (on the Macintosh), various Microsoft BASICs, even Atari 2600 BASIC are all represented (that last one was a challenging port).

It's pretty clear to see that Microsoft BASICs were all pretty slow, that Woz's Integer BASIC is pretty tidy, and Acorn's BBC BASIC is a masterwork.

I still have a few more language ports up my sleeve in various unfinished stages; a structural mistake in the DRI Personal BASIC version (which has also snuck into a couple of other ports) that causes it to do more work after finding the last prime, and which I need to correct both in the source and in the published results; and a long list of results to gather using software and hardware I've already collected, with ports I've already written.

Mostly the micro results are in the bottom half.

http://www.typewritten.org/Articles/Benchmarks/primes.html

Chuck(G)
May 7th, 2017, 09:58 PM
When I get a chance, I'll have to run your benchmark on the BASIC that I wrote for the Durango systems. I've still got a couple 850s around. CPU is approx. 3 MHz 8085. The company is long gone, but a 16-bit version was ported to Xenix and was still in use in a couple of places about 5 years ago.

The weakness in a lot of BASICs isn't the basic math, particularly not integer math, but in character manipulation. Some were hideously inefficient.

Also, you have to consider the numeric representation. One big difference in some finance-oriented BASICs is that the numeric representation is decimal floating point, not binary. When working with decimal currencies, the occasion for truncation error is much lower with decimal (that's one of the reasons that COBOL implements it as the default to this day).

It seems to me that there was a benchmark list from the 1970s that was essentially open-ended. Machines like a CDC 6600 got to compete--even though it used only a 10MHz clock, it was stunning to see how much faster it was than just about anything.

Trixter
May 7th, 2017, 10:17 PM
Acorn's BBC BASIC is a masterwork.

If any BASIC I had access to as a child/teen allowed inline assembler, it would have likely changed my life.

KC9UDX
May 7th, 2017, 10:34 PM
Mine too, but not for the better, I don't think. I wouldn't have had the desire to write my own assembler, which really affected everything I did after that point.

Scali
May 8th, 2017, 01:09 AM
Also, it's very C64-centric.

Well, given that the C64 is the best-selling computer of all time, I suppose it would follow that the C64 is also the best-selling BASIC implementation of all time?
Or at least, in the 80s, when BASIC was at its peak in terms of relevance. So I would suppose that the majority of people who grew up with BASIC, grew up with C64 BASIC?

In fact, by the time I finally got my own Amiga, it was an Amiga 600 with Workbench 2.0. And unlike the 1.x incarnations, AmigaBASIC was no longer included as part of the OS.
I suppose AmigaBASIC always took more of a backseat on the Amiga compared to the C64, since it was a separate application that you had to boot up first, after first booting the OS. Neither of which was required for most software, since they would be on self-booting floppies, and especially games generally didn't use the OS at all.

Scali
May 8th, 2017, 01:13 AM
It's pretty clear to see that Microsoft BASICs were all pretty slow, that Woz's Integer BASIC is pretty tidy, and Acorn's BBC BASIC is a masterwork.

Wouldn't that at least partly also depend on the design choices with regard to data types?
Because as I recall, the C64 BASIC didn't have separate integer and float datatypes internally, so everything would be processed through floating point routines, which would make math operations relatively slow.

So I suppose what I'm saying is that it would be interesting to have both integer and floating point workloads, and see how the two relate in performance. Perhaps on some BASICs, the floating point versions are much slower, where on others, there's no difference.

KC9UDX
May 8th, 2017, 02:13 AM
Wouldn't that at least partly also depend on the design choices with regard to data types?
Because as I recall, the C64 BASIC didn't have separate integer and float datatypes internally, so everything would be processed through floating point routines, which would make math operations relatively slow.

So I suppose what I'm saying is that it would be interesting to have both integer and floating point workloads, and see how the two relate in performance. Perhaps on some BASICs, the floating point versions are much slower, where on others, there's no difference.

Applesoft programs run faster when using integer types (%). How much faster I don't recall, but it would be easy to benchmark.

Commodore BASIC programs run significantly faster if you avoid explicitly using numbers. It takes a long time to parse numbers. So if you assign frequently used numbers to variables and only use the variables, execution speeds up. If I remember right, there isn't a significant improvement doing this in Applesoft.
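A quick way to see it on a C64 (just a sketch using the TI jiffy clock, 60 ticks per second; the loop count is arbitrary):

10 T = TI
20 FOR I = 1 TO 5000 : A = A + 1 : NEXT
30 PRINT "LITERAL 1:" ; (TI - T) / 60 ; "SECONDS"
40 O = 1 : T = TI
50 FOR I = 1 TO 5000 : B = B + O : NEXT
60 PRINT "VARIABLE O:" ; (TI - T) / 60 ; "SECONDS"

The second loop is noticeably quicker, because the interpreter converts the ASCII "1" to floating point every single time through the first loop, while the variable is simply looked up.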

The Commodore BASIC maths routines are much slower than those of the other Microsoft dialects, if I remember right. That's due to the much higher precision used in Commodore BASIC. I remember running into that whilst porting a program from Commodore BASIC to Applesoft. It also seems to me that there was something odd about it: Commodore BASIC has a higher amount of error in some routines due to rounding, which shouldn't happen with the higher precision.

krebizfan
May 8th, 2017, 05:07 AM
A lengthy set of benchmarks published in 1982 can be found at https://works.bepress.com/mwigan/14/ which includes Apple II+, Commodore PET, and many Z-80 variants.

The comparison ranges from the CDC Cyber 171 (fastest) to TRS Pocket Computer (slowest) with some amusing notes like the Seattle System 2 (8086) was faster than PDP-10 and IBM System 34.

vwestlife
May 8th, 2017, 06:41 AM
All software released for the TRS-80 Pocket Computer was written in BASIC.

http://www.trs-80.org/images/pocketcomputer.jpg

Chuck(G)
May 8th, 2017, 08:09 AM
A lengthy set of benchmarks published in 1982 can be found at https://works.bepress.com/mwigan/14/ which includes Apple II+, Commodore PET, and many Z-80 variants.

The comparison ranges from the CDC Cyber 171 (fastest) to TRS Pocket Computer (slowest) with some amusing notes like the Seattle System 2 (8086) was faster than PDP-10 and IBM System 34.

There are older ones than that. The Cyber 171 isn't even the fastest of the 170 series. It might have been interesting to include, say, a 176.

deathshadow
May 11th, 2017, 04:06 PM
At that time it was generally sneered upon by those in the computer science community for being unstructured, inelegant and allowing spaghetti code...a "toy" language.
Most of those claims I felt came later -- the spaghetti code one I heard a lot and my response was always "Oh, nothing like Assembler..."

JMP = GOTO, Jx = IF/GOTO, CALL = GOSUB, RETURN = RET...

But that's the same as the bullshit you got from C asshats who kept saying C was "closer to how assembly worked" which of course is 100% BULLSHIT. Never understood where they got that claim from but Christmas on a cracker it's parroted a LOT, even today.


You want snobbery? Try being a Pascal programmer. All kinds of C snobs always preaching the supposed superiority of C.
Which was aggravating when their "not my favorite language" bullshit was filled with misinformation, what little truth there was to it was a decade or more out of date, and their favorite pet language was needlessly and pointlessly cryptic, aggravatingly vague, and pretty much DESIGNED to make developers make mistakes.

Again, there's a reason I'm not entirely convinced this is a joke:
https://www.gnu.org/fun/jokes/unix-hoax.html

But then, in the circles I was in during the late 70's through to the early '90's, NOBODY gave a flying **** about C on anything less than a mainframe platform. It just wasn't used and I really wonder what planet other people who say it was used were on... From what I saw it sure as shine-ola wasn't used on any MICROCOMPUTER platforms until the 386 came along for any serious project since C compilers were fat bloated overpriced toys -- relegating them to being about as useful as the toy that was interpreted BASIC.


I never found the limitations of even the interpreted BASICs terribly limiting.
That's really where BASIC got the "toy" label IMHO, it was FAR too limited in speed to do anything I wanted to do on any platform I ever had in the 8 bit and even early 16 bit era.

For me, the late '70s and early '80s were spent looking at every language higher than assembly and realizing "This is for lazy ****s who don't want to write real software". Yes, even Pascal got that label from me. Interpreters were too slow on the hardware I had access to for anything more complex than DONKEY.BAS; compilers cost thousands, came on more floppies than I had drives, and took an hour of swapping disks just to compile a "hello world" that wouldn't even fit into a COM file. (Since unlike the effete elitists with deep pockets, we didn't have hard drives and were writing software to run OFF floppy drives or even cassette, on systems that only HAD floppies.)

... and I still remember the compiler that flipped that attitude around 180 degrees, and anyone who knows anything about '80's compilers can guess EXACTLY what compiler for which language I'm referring to.

Of course, my recent adventure into trying to use C for a PCJr target only further drove me away from C... to the point I was walking around downtown muttering "****ing gonna shove C up K&R's arse" under my breath. I still say C exists for the sole purpose of perpetuating the myth that programming is hard; a kind of elitist circle-jerk to exclude certain types of thinkers from even having a chance in the field. How the hell it caught on as the norm or even desirable still escapes my understanding -- though admittedly I say the same thing about *nix and posixisms so... YMMV.

In any case, by the time BASIC matured away from line numbers and had compilers, there was NOTHING it offered that I couldn't get from better, faster compilers that made smaller, faster executables with cleaner syntax. I wasn't likely to migrate away from Turbo Pascal to QuickBASIC; that would have been thirty steps backwards. It would be like migrating back to Clipper after having moved from dBase III to Paradox or Access.

BASIC was a cute toy, but pathetically crippled to the point of being near useless for writing any real software in the timeframe it was "standard in ROM". EVERYTHING I ever encountered for "commercial" software written with it reeked of ineptly coded slow as molasses in February junk.

Though there are some ... I don't know how to word it. Something about certain software from the 'look and feel' perspective makes my brain scream "cheap junk". I knee-jerk into "what is this crap" mode and most always it seems related to the language being used to build the program. BASIC has always had that for me, where you can usually TELL it's BASIC, and the only way around that seems to be to lace it so heavily with machine language, you might as well have just written it in assembly only. Clipper was another one that just gave me this "rinky cheap half-assed" feeling... which for all my hatred of C, I never got that feeling from stuff written in C.

That 'feeling' lives on today quite well the moment I see anything web related written using ASP. You can just tell: the way the UI feels half-assed, flips the bird at the WCAG, and the agonizingly slow page loads from too many DOM elements, pointless code-bloat crap Visual Studio just slops in there any-old-way, etc, etc... Whatever it was that made BASIC feel like a rinky cheap toy seems to live on in the majority of what people create using the WYSIWYG aspect of Microsoft's Visual Studio.

You'd almost think one of VS's build options is to use VB.NET

deathshadow
May 11th, 2017, 04:11 PM
notes like the Seattle System 2 (8086) was faster than PDP-10 and IBM System 34.
I still remember a magazine article from the '80s that was pointing out that TP3's 48-bit "real" was faster and more reliable on an 8088 than the 32-bit "single" on an 8087... it was just more RAM hungry. Wish I could find a copy of that today.

krebizfan
May 11th, 2017, 06:41 PM
I still remember a magazine article from the '80s that was pointing out that TP3's 48-bit "real" was faster and more reliable on an 8088 than the 32-bit "single" on an 8087... it was just more RAM hungry. Wish I could find a copy of that today.

There were a lot of bugs in the TP 8087 support which caused inaccurate results. The code also ran slower than it should have on 8087. Conversely, code written using 48-bit reals moved to Delphi will slow down by a factor of about 5 since Delphi converts 48-bit reals into standard floating point and then converts back into 48-bit.

KC9UDX
May 11th, 2017, 08:09 PM
How often does anyone actually use FP? Even when I was rendering 3D wireframes I always used scaled integers.

But then one of my biggest gripes about most Pascal implementations is the standard datatypes. I tend to go out of my way to not use them. That was one of my main reasons for starting to write my own compiler, now many years ago (that project has been shelved due to priorities). I wanted to write a Pascal that natively used null-terminated strings (yes, I know there are some Pascals that already do this), didn't have any legacy MS-DOS junk in it, and didn't have any inbuilt procedures that violated the rules of the language. (I had great intentions of not having a WriteLn, for one.)

For all my complaints about BASIC, I do not complain about GOTO. I never understood the hatred of GOTO, especially from people who call themselves Computer Scientists. GOTO (JMP, whatever you care to call it) is a very useful tool. For every 10,000 lines of structured code that I write, I tend to use a GOTO at least once, somewhere. There are certainly times and places where it makes sense. And I tend to find the people who eschew GOTO are the same ones who have no problem overusing or misusing a Break/Exit/whatever you care to call it.

I still stand by what I said; all the limitations of BASIC don't prohibit a programmer from writing usable software.

My favourite thing about BASIC (perhaps the only thing I like about it) is simply that it is included in ROM in so many machines. The fact that I can write an assembler (and bootstrap an assembly flat file editor) with nothing other than a C64 the way it came from Commodore, a disk drive (only to avoid the obvious tedium of not having one), and a black and white TV, is what makes hobbyist computing worthwhile to me. Mind you I write of my assembler, but there were countless other large scale projects that I did, and countless more I could have done. Sure I could have done it in machine language, with toggle switches, or in crude assembler in a monitor, but BASIC is sure more friendly than either of those for a large project.

I'm going to have to take down one of my interior doors so I can stage a picture of what my computer desk looked like in 1986. A desktop made from a door with a C64C, a 1541, and a Penncrest TV (you can see that on my YouTube channel, doing nothing).

krebizfan
May 11th, 2017, 10:04 PM
Half the code I dealt with in the late 70s involved floating point so I guess I have a slightly different perspective. I wanted accurate results with speed.

GOTO/GOSUB with line numbers has a major problem. A well-structured BASIC program would have each subroutine contained in its own section, with gaps of unused line numbers between sections. If the programmer incorrectly estimated how many lines were needed for a subroutine, that could lead to hours of renumbering, since early BASICs lacked automatic renumbering utilities. Or the programmer created ultimate spaghetti, jumping to an unused group of line numbers and back, so a given routine might be broken into 4 or 5 randomly located pieces.
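The 'well structured' layout I mean is just a skeleton like this; guess the gaps wrong and you got to renumber the whole thing by hand:

100 REM MAIN PROGRAM
110 GOSUB 1000 : REM READ INPUT
120 GOSUB 2000 : REM PRINT REPORT
130 END
1000 REM --- INPUT SECTION, ROOM TO GROW UP TO 1999 ---
1090 RETURN
2000 REM --- REPORT SECTION ---
2090 RETURN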

Standard Pascal has no MS-DOS junk. I don't remember how the extended standard handled the end of a variable length string. It was a great improvement on the original Wirth design but not as good as the UCSD implementation which influenced Blue Dolphin (Turbo) Pascal. Nothing like needing to write a text editor to get a decent string implementation added.

cthulhu
May 12th, 2017, 01:14 AM
How much of BASIC's bad reputation is due to Microsoft's numerous low-quality implementations of it?

KC9UDX
May 12th, 2017, 02:21 AM
Half the code I dealt with in the late 70s involved floating point so I guess I have a slightly different perspective. I wanted accurate results with speed.

But what were you doing that you needed that? I've just never run into a situation where I really needed FP.


GOTO/GOSUB with line numbers has a major problem. A well-structured BASIC program would have each subroutine contained in its own section, with gaps of unused line numbers between sections. If the programmer incorrectly estimated how many lines were needed for a subroutine, that could lead to hours of renumbering, since early BASICs lacked automatic renumbering utilities. Or the programmer created ultimate spaghetti, jumping to an unused group of line numbers and back, so a given routine might be broken into 4 or 5 randomly located pieces.

Right, but this is a problem with numbered lines, not a problem with GOTO in general.


Standard Pascal has no MS-DOS junk. I don't remember how the extended standard handled the end of a variable length string. It was a great improvement on the original Wirth design but not as good as the UCSD implementation which influenced Blue Dolphin (Turbo) Pascal. Nothing like needing to write a text editor to get a decent string implementation added.

At the time, I was dealing with FPC, for AmigaOS 4. It was the only Pascal at the time that would produce native OS4 executables (probably still is). It generates just horrible code, which is very PC-centric, and a good deal of a programmer's time is spent adapting to the OS.

Turbo Pascal and any Turbo Pascal derivative that I've used is rife with MS-DOS artifacts (I can't speak for Delphi). I may have used Standard Pascal at some point, I don't recall. I've used several Pascals that had no, or almost no, string capability at all.

Agent Orange
May 12th, 2017, 06:12 AM
For whatever reason, I like BASIC. I'm not a programmer, but I can make things happen once in a while with BASIC, just to suit me. My memory is a little fuzzy, but a long while back, didn't Bill Gates have a standing $25,000 offer/bet (charity of course) that you could choose your language and he would write BASIC and then proceed to whip you like the proverbial red-headed stepchild?

Jimmy
May 12th, 2017, 07:50 AM
I owned a copy of Visual Basic for OS/2 from Microsoft. I purchased it retail from Babbage's in the mall at Ft. Walton Beach, Fl.

Chuck(G)
May 12th, 2017, 08:32 AM
I'll check in here with some tales from the past--and how I see things.

In my mainframe days, the bulk of my programming was done in assembly. Many thousands of lines, all keypunched. The experience taught me two lessons--the value of coding standards and the value of a really good macro assembler. Coding standards, obviously for maintenance--realize that this was before on-line editors--you sat with a listing with statement sequence numbers/identifiers and worked out directives for the source library program to make your changes--as in "delete these statements, insert these statements, etc." At the same time, you knew that there were other people writing directives, perhaps on the same section of code you were working on. Good documentation and coordination/communication were essential and paid off handsomely.

A good macro assembler would allow you to do just about anything that you could imagine. Remote/deferred assembly, character manipulation, syntax extensions, macros that define macros all could simplify something that would be a nightmare in straight assembly to something that a human could understand. It's a shame that not very many assemblers exist today that can do the same.

About the only other language that I used back then was FORTRAN--you could find it on just about any platform--at least, I know of no mainframe where it wasn't offered. You used that to write utility programs where possible, unless peculiar machine features or speed of execution demanded assembly. Some FORTRANs were very good indeed, being able to allocate registers and schedule instructions as well as the better assembly programmers.

BASIC wasn't an option back then--the language was too limited and usually was interpreted, not compiled.

I moved to very large vector systems with the emphasis on number-crunching. Huge instruction set with instructions like "SEARCH MASKED KEY BYTE" with up to 6 operands. For that, I used a derivative of FORTRAN called IMPL--and also made changes to the compiler to improve code generation. If you had something specific in mind, there were ways to express assembly instructions inline.

At about that time, I built my first personal microcomputer from a kit (Altair 8800). I'd been following the action at Intel and still have the notes from the 8008 announcement, faded though they may be. A disk was out of the question, so I used an audio tape recorder and the guts from a Novation modem for offline storage. It worked, so I didn't have to toggle things in, or type them from the console. BASIC was one of those programs that I typed in the hex code for, byte by byte. It worked, but not quickly--interpreter, again. So I used a memory-resident assembler which worked for a time. Eventually, I put together a system with Don Tarbell's disk controller and a couple of 8" floppy drives that I scrounged. It wasn't too long before I got CP/M 1.4 (or thereabouts) going, which gave me more possibilities for software development. But still assembly.

Professionally, at about the same time, I took a job with a startup and used an Intel MDS-800 running ISIS-II. Intel had a language that was vaguely reminiscent of PL/I called PL/M-80. It wasn't bad--you could actually make good use of its capabilities, although it was not an optimizing compiler in any sense, so the size of the executable code and its speed wasn't up to assembly. For "quick and dirty", however, it was great.

Eventually, as disk systems got affordable, other languages made their appearance. Various flavors of BASIC (few were true compilers--and there's a reason for that), FORTRAN, COBOL, SNOBOL4, FORTH...you name it. Anyone remember DRI's ISV program that promoted their PL/I? Yes--a remarkably feature-rich PL/I for the 8080. There were Cs--but they weren't all that good, for a very good reason:

The 8 bit Intel platform lacks certain features that makes C practical. C uses a stack architecture, derived from the PDP-11 architecture. The PDP-11 is a 16-bit machine, the 8080 is not. Addressing of local stack-resident variables on a PDP-11 is quite straightforward; on the 8080, it's a nightmare. Among other things, the 8080 doesn't have stack-relative addressing, nor does it have indexed addressing. 16-bit addresses have to be calculated the hard way--move the stack pointer to HL, load another register pair with the index, add it to HL, then access the variable byte-by-byte. Really ugly. While the Z80 does have indexed IY and IX addressing, it's also quite limited and handling simple 16-bit integer stack-resident variables, particularly if the local area is more than 256 bytes long, is again, very complicated. You simply can't generate good C code on an 8080. FORTRAN--sure. No stack-resident variables--in fact, no stack required at all. That's why FORTRAN could be run on an 8KW PDP-8, but no such luck for C--C imposes certain demands on the architecture.

Comes the 8086 in 1979 and, later, the IBM PC. All of a sudden, things get less complicated, although handling large (more than 64KB) data structures is quite awkward. But for the first time, you had a microprocessor with an ISA that could do justice to C. Disks were relatively inexpensive, so you had a full-scale development system. Assembly could be used to write fast and/or small programs, but for the tedious stuff, C was great. Microsoft even endorsed it--and they didn't have a C compiler at the time. They recommended the use of the Lattice C compiler--a basic K&R thing that did the job.

BASIC made sense for business applications--I wrote a BASIC incremental compiler (to P-code) for a company to port the large suite of MCBA applications to an 8085. There was a good reason for the P-code thing: if you were to write a compile-to-native-code BASIC, you'd wind up with a program full of code that did little more than set up arguments to subroutines to do the basic operations. At best, the 8080 could do inline 16-bit arithmetic as long as you didn't need to multiply or divide, but BASIC originally had no explicit type declaration statements. You had numbers and you had strings. The other problem was that 8080 code is not self-relocating. P-code results in smaller programs, location-independent code and even multitasking. The result can be quite small and fast.

As far as languages go, from a compiler-writer's viewpoint, they're all the same at the back end. You take a tree or other abstract representation of the compiled and optimized source code and you translate it into native instructions, perhaps doing some small optimizations. What the front-end eats isn't important. I've been on projects where the same back-end was used for C, FORTRAN and Pascal.

My perpetual gripe with C is that it lacks a decent preprocessor. For some odd reason, preprocessor directives are considered to be evil by the C community. Yet, look at PL/I's preprocessor, complete with compile-time variables, conditionals and other statements. Incredibly useful, if you know how to use it. Yes, C++ has features that make a preprocessor less important, but there you get the whole complex world of what amounts to a different language, when all you wanted was a way to write a general macro to initialize an I/O port. There were times when I've found C++ quite handy for abstracting things, but I like the simplicity of C.

So, for the last 20-odd years, I've written a lot of C, with a smattering of assembly support. But much more C than assembly. And almost no BASIC, FORTRAN, COBOL or Ada at all--but I'd use any of the above if there were an advantage to using it in any particular application.

deathshadow
May 12th, 2017, 01:11 PM
How often does anyone actually use FP? Even when I was rendering 3D wireframes I always used scaled integers.
You'd be shocked by 3d programming from pretty much 3rd generation Pentium onwards. The age of the MMX and the 3DNow, the time of the sword and axe is nigh, the time of the wolf's blizzard. Ess'tuath esse!

Somehow some math nerds who knew jack shit about programming got together and convinced EVERYONE in the 386 era that matrix multiplies were somehow more efficient and effective than the direct math for translations, rotations, and so forth. HOW they managed to convince people that 64 multiplies of 32 memory addresses into 16 more addresses was faster than four multiplies, three additions and one subtraction of 4 addresses into two I'll never understand... That matrix math started being used for TRANSLATIONS (what should be three simple additions) was pure derp... but it got worse...

As 1) everything moved to floating point, and 2) rather than argue it, processor makers created hardware instructions to do it. You know MMX? 3dNow? That's about ALL those do! Hardware matrix multiplies shoving massive amounts of memory around just to do a rotation or translation.
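For the record, the 'direct math' I mean for a 2D rotate-plus-translate is nothing more than this (QBasic-style sketch; C and S are the precomputed cosine and sine of the angle, TX and TY the translation):

XR = X * C - Y * S + TX
YR = X * S + Y * C + TY

Four multiplies, three additions, one subtraction, all on a handful of variables -- versus pushing a full matrix-times-vector through memory for every point.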

Pretty much by the time Glide was fading, all 3D math on PC's is floating point, typically double precision. OpenGL? DirectX? Vulkan? Double precision floats. Even WebGL in the browser does it now, and they had to change JavaScript to add strictly typecast arrays (to a loosely cast language) to do it! Though that change has opened the doors to doing a lot of things JavaScript couldn't before, making it even more viable as a full stack development option.

You can't even argue it now with 'professional' game programmers even when the situation calls for something matrixes and normal projections can't handle as they are so used to "the API does that for me". Implementing things like arctangent polar projections (which with a lookup table at screen resolution depth can be many, MANY times faster even CPU bound over a gpu projection) are agonizing to implement because the rendering hardware just won't take the numbers unless you translate it all from polar to Cartesian, a process that eliminates the advantages.

Laughably I wrote a game engine about twenty years ago that used GLIDE (3dFX's proprietary API which really didn't do much 3d, it was just a fast textured triangle drawing engine) built ENTIRELY in polar coordinates -- until the view rotation that was in fact handled as a translation -- using 32 and 64 bit integer math that in a standup fight could give the equivalent rendering in OpenGL on a 'similar performing' card that had hardware 3d math a right round rogering.

If you're working on the CPU and dealing with off the shelf 3d model formats now? Double precision floats. If you're working on the GPU through a major API? Double precision floats.

Which for a LONG time left ARM crippled or at the whims of the GPU (which laughably STILL isn't even up to snuff with Intel HD on processing power) until they added the option of a "VFP" extension -- vector floating point, which is a big fancy way of saying MMX on ARM. It's bad enough that an ARM Cortex A8 at 1 GHz delivers integer and memory performance about equal to a 450 MHz PII (since they are more obsessed with processing per watt than processing per clock); when you realize that things like WebGL or OpenGL ES want to work in double precision floats, and that there is no floating point in hardware on a stock ARM prior to the Cortex A8 (and it's optional on A9s), you're looking at 487-scale performance in that regard. (Thankfully VFP and SIMD extensions are now commonplace, but a LOT of cheaper devices still omit them.)

Even more of a laugh when you realize most low end ARM video hardware is just overglorified 20 year old Permedia designs with faster clocks shoved at it.

Part of why without a major overhaul, now that Intel is gunning for that space ARM could be in for a very rough ride in the coming years. VFP is a stopgap at best, even the best offerings in Mali OpenGL ES video for ARM gets pimp slapped by even piddly little Intel HD on some of the new low wattage Celerons. The only real hope ARM has moving forward is existing momentum and if nVidia's new low power strategy for desktop/notebook trickles its way down into the Tegra line.

... and honestly I wouldn't hold my breath on that, I get the feeling nVidia is starting to consider walking away from the mobile space even if their "shield" technology relies on it. It hasn't been the success they hoped for.

krebizfan
May 12th, 2017, 03:55 PM
Floating point might not have made much sense in games since displaying partial pixels is not beneficial. In scientific software, it was common to go with floating point with as many bits of accuracy as possible. Sometimes a good idea, sometimes it just meant the PDP-11 ran all weekend.

deathshadow
May 14th, 2017, 08:20 AM
Floating point might not have made much sense in games since displaying partial pixels is not beneficial.
Until you get into anti-aliasing, sub-pixel hinting, etc, etc...

Scali
May 14th, 2017, 08:51 AM
Games just used whatever method was fastest at the time.
In the early 2d era, it often made most sense to just use integer coordinates, and work with a coordinate system that maps 1:1 to the pixel grid on screen.
With more advanced stuff (scaling/rotating 2D and such, 2.5D or real 3D), you would need additional precision over the screen resolution. So you want some kind of solution that can handle fractional coordinates as well.
Obviously, before FPUs were commonplace, full floating point wasn't very efficient. So games used fixedpoint notation (basically just integers scaled up by a certain power-of-2 value, to get fractional precision).
Another advantage of using integers is that they are very predictable and numerically stable. There's no fancy scaling or rounding that can affect precision in unwanted ways. So if you are writing some kind of rasterizing routine (doing eg a linedrawing or polygon routine), an integer-based solution will be guaranteed to render consistently, and touch all intended pixels.
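A typical 8.8 fixedpoint setup, as a QuickBASIC-flavoured sketch (256 represents 1.0, so the low 8 bits are the fraction):

DIM x AS LONG, dx AS LONG
x = 10 * 256                  ' 10.0 in 8.8 fixedpoint
dx = 384                      ' 1.5  (1.5 * 256)
x = x + dx                    ' a plain integer add advances the position by 1.5
PRINT x \ 256; "+"; (x AND 255); "/256"

All the arithmetic stays in the integer unit, and truncating back to a pixel coordinate is just a shift (the \ 256).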

But as FPUs became commonplace, it became more efficient (and flexible) to perform certain calculations with floating point.
In games there's a pretty obvious transition-point: In the era of DOOM, Descent and such, everything was still done with fixedpoint integers (the 486 made the FPU commonplace, but it wasn't a very efficient FPU, so you'd avoid it like the plague for high-performance calculations). Then Quake came around, and a lot of calculations were done with floating point (Pentium happened, and its FPU could do single-precision operations like fmul and fdiv much faster than integer mul and div.. And perhaps more importantly: the FPU instructions could run in parallel with integer ones. For perspective divide it would fire off one fdiv for every 16 horizontal pixels. The fdiv would effectively be 'free' because it ran in the background while the innerloop was outputting textured pixels. By the time it had rendered 16 pixels, the fdiv was completed and the result available on the FPU stack).

Chuck(G)
May 14th, 2017, 09:20 AM
...and let's not forget the fixed-point DSPs. Despite lack of floating point, they are/were quite useful.

cthulhu
May 14th, 2017, 01:26 PM
Some corrections for deathshadow: MMX is for integer/fixed-point operations, not floating-point. Single-precision maths is typical for most non-scientific GPU-based work, not double. Double-precision maths is supported by GPUs but it's avoided since it performs at far less than half the rate of single-precision operations in all but the highest-end GPUs, i.e. ones not intended for gaming. In ARM processors VFP has been supplemented by NEON which performs much better with vector operations than VFP does. The Cortex-A8 implements NEON well but has a crippled VFP unit compared to the one in the A9. Even pre-ARMv7 processors, such as the ARMv6 ARM11 used in the original Raspberry Pi greatly outperform it.

Chuck(G)
May 14th, 2017, 03:37 PM
Let's hope that RISC-V makes some headway. I'd hate to think that we'll turn into an ARM world.

MikeS
May 15th, 2017, 08:21 AM
Half the code I dealt with in the late 70s involved floating point so I guess I have a slightly different perspective. I wanted accurate results with speed.
The exact opposite of almost all the code I dealt with; in the business world floating point was generally slower and less accurate because it tended to introduce rounding errors.

Chuck(G)
May 15th, 2017, 09:39 AM
...which is one of the reasons why spreadsheets initially implemented decimal floating-point math--and why CBASIC did, for example.

krebizfan
May 15th, 2017, 10:58 AM
The exact opposite of almost all the code I dealt with; in the business world floating point was generally slower and less accurate because it tended to introduce rounding errors.

Alas, electron orbits do not lend themselves to easy decimal notations.

MikeS
May 15th, 2017, 11:58 AM
Alas, electron orbits do not lend themselves to easy decimal notations.

Indeed; just pointing out that what's fast and accurate in your realm (science, games etc.) is slow and not necessarily accurate either in the (often ignored here) business realm.


...which is one of the reasons why spreadsheets initially implement decimal floating-point math--and why CBASIC did, for example.

'Precision as Displayed' is set by default in all my Excel sheets; it's disconcerting when 'IF A1=2' fails, or when FoxPro says that 1 + 1 = 1 for that matter...

GeoffB17
May 20th, 2017, 01:14 PM
Surely that can be a problem with ANY computer language.

Over the years, I've had problems with various, including the C7 I've done most of my 'serious' work with. Where such things mattered, I always took the precaution of using a special rounding function on both sides of the comparison to make sure that if the two numbers were the same, then the computer recognised them as such. If I didn't, there was always a possibility that one number or the other might still have a stray (VERY 'stray') 0.000000000001 or such-like hanging in there!
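In QB terms, the sort of thing I mean (just a sketch):

a# = 0#
FOR i% = 1 TO 10: a# = a# + .1#: NEXT     ' ten lots of 0.1 in binary floating point
PRINT a# = 1#                             ' prints  0 (false): a stray error is hanging in there
DEF FNr2#(x#) = INT(x# * 100# + .5#) / 100#
PRINT FNr2#(a#) = FNr2#(1#)               ' prints -1 (true) once both sides are rounded to 2 places

The rounding function gets applied to both sides, so any stray 0.000000000001 disappears before the comparison happens.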

Geoff

Chuck(G)
May 20th, 2017, 01:52 PM
Note by "decimal", I don't necessarily mean that a number is expressed in, say, BCD, although that would make for convenience.

Simply maintaining the exponent as a power of 10 rather than 2 (or 16, as in S/360 floating point) is sufficient. While most have heard of IEEE 754 floating point, few are familiar with IEEE 854. At any rate, interest in decimal radix is very much alive (http://speleotrove.com/decimal/). Somewhere, I even recall reading about a decimal coprocessor done in an FPGA.

Students studying numerical methods are often given a problem that involves trig functions near their limits usually expressed as a quotient. The naive learner simply codes the expression as stated and discovers that the result is pure garbage. It's an object lesson in not blindly trusting the computer to come up with the "right" answer.
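A small example of the sort of thing I mean, QB-flavoured, in double precision (the true value of the expression tends to 0.5 as x goes to 0):

x# = .00000001#                       ' 1E-08
PRINT (1# - COS(x#)) / (x# * x#)      ' naive form prints 0 -- the subtraction cancelled everything
PRINT 2# * (SIN(x# / 2#) / x#) ^ 2    ' same expression via 1-COS(x) = 2*SIN(x/2)^2 prints .5

Algebraically the two lines are identical; numerically, one of them is garbage.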

MikeS
May 20th, 2017, 02:01 PM
What Chuck said.

Quoting from his link:

"Most computers today support binary floating-point in hardware. While suitable for many purposes, binary floating-point arithmetic should not be used for financial, commercial, and user-centric applications or web services because the decimal data used in these applications cannot be represented exactly using binary floating-point".

As someone who spent 40 years or so with systems and software for the accounting and financial services markets, that's a truth I learned very early on... ;-)

Chuck(G)
May 20th, 2017, 02:31 PM
I'm trying to remember an article from the HP Journal from years back. As best as I can recall, it was a way of doing decimal floating point by expressing the mantissa in groups of 20(?) bits, each group having the range from 0-999999 in binary, or something to that effect. The benefit was that you could get 6 digits of significance, where doing the same as BCD would only get you 5. Does anyone remember the article?

Scali
May 20th, 2017, 03:10 PM
Students studying numerical methods are often given a problem that involves trig functions near their limits usually expressed as a quotient. The naive learner simply codes the expression as stated and discovers that the result is pure garbage. It's an object lesson in not blindly trusting the computer to come up with the "right" answer.

That reminds me of an issue that many raytracers have:
If you calculate the intersection of a ray and some surface (sphere, plane, cylinder, etc), then the intersection point is never exactly *on* the surface of course, due to the limited precision.
So your intersection point is either in front or behind the surface, depending on which way the rounding turned out.

Now, if you don't take this into account, and then proceed to reflect the ray at the intersection point, you will often find that your rendered surface will have 'holes': random black pixels.
Why? Simple: when you reflect your ray, you implicitly assume that it is bounced against the surface, so it should be on the 'outside'. But if due to rounding your intersection point was actually on the 'inside' of the surface, then the reflected ray will bounce back to the surface again, and the light may get 'trapped' inside the object for the remaining 'bounces' to be calculated. That's why you'll get those random black pixels.

Some implementations just leave it at that... Others try to use bruteforce to 'fix' it: Either they increase the supersampling to a level where the black pixels will 'blend in' with the correct neighbours, so the issue is not apparent. Or, they just use double precision floating point everywhere (or worse, if the FPU supports it). Neither gives correct results.

Smart coders can get it working fine without any supersampling, and just single precision. You can use one of these elegant solutions:
1) When you calculate the intersection point, step back along the ray by a certain epsilon value. With a well-chosen epsilon, the intersection point is now always on the correct side of the surface, you never 'overshoot' inside the object.
2) Keep track of what object your ray last bounced from. If it is the same object as the nearest intersection at the current bounce, then it is suspect. In that case, if the length of the ray (distance between the previous and current intersection point) is shorter than a given epsilon, discard this nearest intersection and take the next-nearest one instead, because you have likely bounced against the same surface again.
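To make option 1 concrete, in (pseudo-)BASIC terms (ox/oy/oz is the ray origin, dx/dy/dz the normalised direction, t the intersection distance; the epsilon is something you tune for your scene scale):

CONST EPS = .0005
hx = ox + (t - EPS) * dx
hy = oy + (t - EPS) * dy
hz = oz + (t - EPS) * dz

Secondary (reflection/shadow) rays then start from (hx, hy, hz), which, with a well-chosen epsilon, sits just on the outside of the surface you hit.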

RobertLM78
August 6th, 2017, 07:40 PM
It's been mentioned a few times, but I discovered FreeBASIC a couple years ago. Sweet language there - and compiled too. It is far from a toy and probably the most capable modern dialect of BASIC I've come across.

GeoffB17
August 21st, 2017, 05:01 AM
Yes, FreeBasic.

A massive project. Very interesting.

The base of the system is the old MS QuickBasic, and at one level, the language follows the syntax of QB. You can use the system as supplied as a console (command prompt) system just like QB.

But there are a PILE of libraries included, all based around gcc (i.e. 'C'), which underlies the compilation process, and they can be easily invoked. Plus, you can target Windows, generating full Windows applications.

I've used the system to create a couple of small Windows utilities, using my own simple code plus some Windows features (like a proper Windows-looking file-list browse/open dialog); I've also done a number of progs to provide access to MySQL databases (can do the same for SQL if required) - these latter staying in console mode.

Fair number of demo progs to follow/copy/tweak. Compile/link options are set to run from a batch file; great list of options, many of which I do NOT understand, so I've used just a few, but it worked fine, pretty much as installed 'out-of-the-box' (which nowadays should be 'as-downloaded').

Geoff

RobertLM78
April 9th, 2018, 06:31 PM
Yes, FreeBasic.

A massive project. Very interesting.

The base of the system is the old MS QuickBasic, and at one level, the language follows the syntax of QB. You can use the system as supplied as a console (command prompt) system just like QB.

Geoff

I would add that coding FreeBASIC in a QB fashion (like using the -lang qb option) doesn't take advantage of the language's full capability. You're better off just adapting your existing QB code to a more FreeBASIC style.