
Decided to start learning C

ibmapc

I've played around with BASIC since about 1979 but never really got super deep into it. A few years ago, I bought a book about assembly, but couldn't stay awake reading it. So, late in September, I started looking for online courses for programming. I stumbled across "C Programming with Linux" from Dartmouth College. I'm now three courses into a seven-course program and learning a LOT! Has anyone here checked this out or completed all seven courses? I'd love to hear feedback and suggestions as to where to go after completion. C++? Python? Java? Other C courses?

If you click the attached picture (Capture.jpg), you'll see my progress so far.
 
Good luck and have fun with learning C! I have started several times but I am so used to Pascal that every time there is a need for programming something, I have to fall back to Pascal again because I'm not familiar enough with C.
But there is also a bit of annoyance: Borland C is not GCC C is not Microsoft C, etc. etc. Borland Pascal is Free Pascal, and that is good enough for me.
 
Hi, ibmapc. I'm glad you are (and I hope enjoying) learning C. As Ruud said, GCC differs in many ways from Borland or Microsoft C for DOS, although the basics are the same because Borland C is ANSI compliant (albeit to an older standard); it just has many incompatible extensions aimed at DOS programming.

So where to go next depends on your programming target: if you wish to write DOS applications, the basics will be the same ones you learnt for Linux (file handling, text printing, logical operations, types, structures and such have been part of the ANSI standard for decades), but you would need to learn some DOS-specific functions, some of them slightly different depending on the compiler you choose, such as Microsoft, Turbo/Borland or Open Watcom.
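
For a feel of that split, here's a minimal sketch, assuming Turbo/Borland C (the conio.h routines below are Borland extensions; Microsoft and Watcom ship similar but differently named calls):

[code]
/* ANSI C vs. DOS-specific calls: the printf line is portable
   everywhere; the conio.h routines are Turbo/Borland extensions. */
#include <stdio.h>    /* ANSI standard, portable to any compiler */
#include <conio.h>    /* DOS-specific console routines (Borland) */

int main(void)
{
    clrscr();                      /* Borland: clear the screen */
    gotoxy(10, 5);                 /* Borland: cursor to column 10, row 5 */
    cputs("Hello from DOS");       /* Borland: direct console output */
    printf("\nThis line is plain ANSI C and works anywhere.\n");
    getch();                       /* wait for a keypress */
    return 0;
}
[/code]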

So if you want to expand your C knowledge by applying it to DOS programming (if that appeals to you), I would read something such as Herbert Schildt's books on Turbo C, for example. There must also be some tutorials on the Internet, but I can't vouch for any as I don't know them.

If, on the other hand, you want to learn to program modern 32/64-bit console or Windows applications, or even web applications, with a language very close to C/C++ but easier to code for on this platform than Visual C++, I would suggest learning C# with the help of the .NET Framework.

Of course Python, Java and Pascal are excellent too, but I'm not as familiar with them as I am with C, which is why I didn't talk about them before.
 
I learned C, and am still learning C, using texts like C How to Program. I want to say there are better ones, but I've read two different editions over the years. It's just the one I stuck with.

There don't seem to be many, if any, good texts on advanced C topics. One I also like is an O'Reilly book, with a sea horse on the cover IIRC, 'Algorithms in C' or something like that. It actually deals with issues like encrypting data and such.

Online courses can be good I suppose. But you need to work that knowledge. Otherwise you'll just forget a lot of it. Trust me.

Another book I used to have was Crafting C Tools for the IBM PC. Not sure if that was ANSI based, M$ based, or wot. That's the world we live in. You have to be able to use more than one variant.

A knowledge of C is not optional, and the knowing will serve you well. It's the basis for so many other languages besides: Java, JavaScript, PHP, etc., and C++ of course. You don't have to limit yourself to C. But as I've already said, it's really a foundational skill in computer science.
 
K&R C is the original document from which most of the old guys became acquainted with the language. I've linked to a somewhat later online edition that might serve better to get your feet wet; for differences from the first edition, see Appendix C at the end.

In many respects, it's quite different from C99 and light years away from C++. Like all computer languages, it seems, it started out as a very simple language, but it's grown like Topsy in the intervening years. Consider original 1954 FORTRAN and compare it with, say, F90. C is much like that, particularly when considered in the light of C++.
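
To make the distance concrete, here is the same trivial function written both ways; a minimal sketch, with different names only so the comparison compiles as one file:

[code]
#include <stdio.h>

/* K&R (1978) style: parameter types are declared between the
   closing parenthesis and the opening brace, and there are no
   prototypes to check the call sites. */
int add_kr(a, b)
int a, b;
{
    return a + b;
}

/* ANSI C (1989) and later: types live in the parameter list,
   so the compiler can verify every call. */
int add_ansi(int a, int b)
{
    return a + b;
}

int main(void)
{
    printf("%d %d\n", add_kr(2, 3), add_ansi(2, 3));
    return 0;
}
[/code]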

I don't know if any of the old Usenet C forum content presided over by Ritchie is still around, but I found it to be interesting reading back in the day.
 

Last year I bought a C++ book from Amazon and it came from Walmart. It was supposed to come with a CD/DVD, but I never received it. Still going round and round with Amazon, and I don't think I'm going to win.
 
What machine are you targeting your programs for?
If it's an early vintage computer, then stay with C. However, early K&R compilers didn't have very good debuggers. You really need a good debugger to find and correct errors in your source code. Otherwise your level of frustration will be high.
I would recommend C++. K&R C had problems with pointers and other areas that were addressed in C++.

There are several free C++ compilers that run on modern machines. Couple that with the enormous number of tutorials on YouTube, and your experience will be more rewarding and less frustrating.

For older PC compatibles, Turbo C++ works fine. Not as many YouTube tutorials, though.

As far as assembly language goes, Heathkit made the ET3400 and ET1000 computers that had self-teaching courses. Assembly will give you knowledge of what happens at the "primitive" level of programming. Great for interfacing with circuitry.
It is very tedious to learn. Just keep in mind that if you want to learn interfacing with hardware, assembly language is the way to go.
 
Learn some data structures, for sure. Linked lists, trees, maps. Those will take you far, in any language. With C you'll know them at a primitive level.
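
For instance, a bare-bones singly linked list in C looks something like this; a minimal sketch, with everything done by hand:

[code]
#include <stdio.h>
#include <stdlib.h>

struct node {
    int value;
    struct node *next;   /* NULL marks the end of the list */
};

/* Push a new value onto the front; returns the new head. */
struct node *push(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;     /* out of memory: leave the list unchanged */
    n->value = value;
    n->next = head;
    return n;
}

int main(void)
{
    struct node *head = NULL, *p;

    for (int i = 1; i <= 5; i++)
        head = push(head, i * 10);

    for (p = head; p != NULL; p = p->next)   /* walk and print */
        printf("%d\n", p->value);

    while (head != NULL) {                   /* free every node */
        p = head->next;
        free(head);
        head = p;
    }
    return 0;
}
[/code]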

You should go write a bunch of C programs next, that's what you should do.

I can't speak to the classes, but you can't learn C, or any language really, in 7 classes.

You need to apply the language to things that interest you and get things done, and it's best done without a safety net.

You need to go and write a bunch of stuff that maybe in the end you're not happy with, so that you can improve later. It's software; you can change it.

Simply, it takes practice, a lot of practice, to learn these environments and be productive with them.

I also suggest to anyone learning a language, that while learning, you type in all of your code. Don't cut and paste it from the web, type it in. If practical, don't download it from the web, but type it in. Your brain processes it much differently character by character than just drag and dropping large blocks of code that you know nothing about.

Typing it in lets you assess what you're typing, and determine "Oh, I see what they're doing here" vs. "WTH is going on here?" You also have a better chance, when you're in some "I don't know" section, of having it make sense later when the rest of the code is in place. When the "Aha" hits.

There is a LOT of code in the world and on the internet, but there's still a lot of room to find your own voice, write your own code, make your own mistakes, and fix them. There's a bunch of alternatives out there, don't chase them. Best to work with C and find its warts and problems and such as they apply to you before you go run out trying to find solutions because of what someone else says.

I'm of the opinion that an expert in anything is not someone who knows something, it's someone who knows how to fix something. There's a zillion books on how to tile your bathroom, and almost none of them really help you when your floor isn't level, the wall is out of plumb, or something doesn't set right. To fix things, you have to break things, and to break things, you need to make things. So, go make some stuff, break it, fix it, and make it again.

It's always good to learn other languages, but it's better to be fluent in one before venturing out. Only then will you be able to better appreciate the new language, and, perhaps, the old one as well.
 
I use C for any project where I really need performance (except on 8-bit machines where I just write it in assembly), or on systems where I can't count on any runtime being there other than the standard C library, which is everywhere.

But its string handling sucks, so I tend to use Perl for everything else (except on 8-bit machines where BASIC suffices).
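
To show what I mean, here's a small sketch (the values are made up for the example): joining a few fields into a string takes an explicit buffer and a length check in C, where Perl does it in one interpolated line:

[code]
#include <stdio.h>

int main(void)
{
    const char *first = "John", *last = "Doe";
    int year = 1979;
    char line[64];

    /* snprintf formats and guards against overflow; the older
       strcpy/strcat route needed the size checks done by hand. */
    int n = snprintf(line, sizeof line, "%s %s, since %d",
                     first, last, year);
    if (n < 0 || (size_t)n >= sizeof line)
        fprintf(stderr, "buffer too small\n");
    else
        printf("%s\n", line);

    return 0;
}
[/code]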
 
What machine are you targeting your programs for?

For me, probably something that runs at least XP? Never gave it much thought. I'm fair with BASIC/Visual BASIC but haven't used those in a while. The last time I did anything productive was before I retired back in '07. I wrote and compiled a program that would figure dB losses and gains with various tower-based, fixed-position RF transceivers/duplexers with respect to antenna type, radiation pattern, elevation, and local area topography.
 
You should go write a bunch of C programs next, that's what you should do.
This, exactly. A good quick-overview tutorial is great for getting you started, but the best thing you can do once you have the basics down is start applying them to a practical problem. Nothing builds experience like actually doing something for real.

Specifically, I'd say look for something you can accomplish fairly simply (i.e. some basic command-line utility you can do with just stdio functionality) that aligns with interests you already have and understand well. In my case, I once wrote a de-duplicator for DX7 patch libraries, because that was something I needed anyway, and all the documentation on the formats was easily obtained. Taught me a bunch about designing and coding a C program.

(Really need to give it a rewrite, actually...)
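
(Not the DX7 tool itself, but to give the flavor of that kind of starter project: a stdio-only filter that drops exact duplicate lines from its input. The line length and table size here are arbitrary choices for the sketch.)

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_LINES 10000
#define MAX_LEN   512

int main(void)
{
    static char *seen[MAX_LINES];   /* lines we've already printed */
    char buf[MAX_LEN];
    int count = 0;

    while (fgets(buf, sizeof buf, stdin) != NULL) {
        int dup = 0;
        for (int i = 0; i < count; i++) {   /* linear scan: fine for a sketch */
            if (strcmp(seen[i], buf) == 0) {
                dup = 1;
                break;
            }
        }
        if (!dup) {
            fputs(buf, stdout);             /* first occurrence: keep it */
            if (count < MAX_LINES) {        /* and remember it */
                char *copy = malloc(strlen(buf) + 1);
                if (copy != NULL) {
                    strcpy(copy, buf);
                    seen[count++] = copy;
                }
            }
        }
    }
    return 0;
}
[/code]

Pipe a file through it (dedupe < patches.txt > unique.txt) and only the first copy of each line survives.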
 
If you're programming medium-to-high end microcontrollers, C is pretty much a given, as development suites and sample code are universally written in it. Low-end MCUs are generally best done in assembly, as speed and memory are minimal and you may be counting cycles for timing. I'm aware of Python being used with some MCUs, but that seems to be a fringe movement. Truth be told, I'd almost prefer Ada--but that hasn't caught on in MCU work and probably never will.
 
Back in the day, Ada was the thing at the Pentagon. Might still be around in some areas.
 

Finding something interesting to write is the key to it all. What you want is to build kitchen cabinets, not find a use for the table saw. The languages are just tools, the craft is in applying them, knowing that most any tool will work.

Back in the day, it was games. I was always writing games. Really bad, half finished games. As with most things of this nature, I wrote the game until I solved the most interesting problems, and then moved on to something else.

I'll never forget trying to write a Space Invaders game on the TRS-80. A problem I had was how to determine which alien got hit. The aliens were made up of the coarse pixel graphics of the TRS-80. But the solution I came up with was to put a unique character in the middle of each alien. When the missile hit, I "looked around" for that character, and used that as an index to tell me which alien I hit. As you can imagine, each alien looked odd and bad on the screen. There are all sorts of better ways to do this, but none had occurred to me at the time. Much of this stuff just happens "in the moment", and you work it, and move on. It's a BAD solution, but away I went with it. And it's instructive.
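
For the curious, the trick translates to C roughly like this (screen layout and marker values invented for the illustration):

[code]
#include <stdio.h>

#define COLS 64
#define ROWS 16
#define FIRST_MARKER 0x80   /* alien 0 = 0x80, alien 1 = 0x81, ... */
#define NUM_ALIENS   8

static unsigned char screen[ROWS][COLS];   /* stand-in for video memory */

/* Scan a small box around the impact point (x, y) for a marker
   byte; return the alien's index, or -1 if nothing was hit. */
int alien_at(int x, int y)
{
    for (int dy = -1; dy <= 1; dy++) {
        for (int dx = -2; dx <= 2; dx++) {
            int sx = x + dx, sy = y + dy;
            if (sx < 0 || sx >= COLS || sy < 0 || sy >= ROWS)
                continue;
            unsigned char c = screen[sy][sx];
            if (c >= FIRST_MARKER && c < FIRST_MARKER + NUM_ALIENS)
                return c - FIRST_MARKER;   /* marker byte -> alien index */
        }
    }
    return -1;
}

int main(void)
{
    screen[5][20] = FIRST_MARKER + 3;          /* plant alien #3's marker */
    printf("hit alien %d\n", alien_at(21, 5)); /* prints: hit alien 3 */
    return 0;
}
[/code]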

So, the takeaway? Write bad software. Write lots of bad software, so you can become like Edison: "Results! Why, man, I have gotten a lot of results! I know several thousand things that won't work."

Write to the problem, solve the problem, move on to the next problem. If you have to return to code that no longer solves the problem, change it. It's software, it's not cast in stone. Outside of deleting production data from a database with no backup, or having a robot arm spin around and hit you in the head, 99.99999% of the time, you can't hurt anything. Try to stay away from code that drives radioactive sources in your first few programs.

What makes things so much more difficult today, however, is that the internet is gorged with distractions. The internet is the Home Depot Tool Wall of options for getting things done, where the focus and discussion is on the tools, not the problem at hand.

You will be able to do whatever you want in C. Treat it as your hammer for the next 6-12 months of doing real work, then come back with eyes open as to what you might want in another language. All of that hard-fought experience will carry forward.

And, especially today, write everything yourself and avoid libraries if possible. Write your own data structures. Sort your own data. Read and write your own files. Copy and adapt algorithms, for sure, don't reinvent Quicksort, but write the code for your data, for your application.
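
To be concrete about the Quicksort point: lean on the library's sort, but write the comparator for your own records. A minimal sketch (the struct is invented for the example):

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct patch {
    char name[16];
    int  bank;
};

/* Order by bank, then by name within a bank. */
static int cmp_patch(const void *a, const void *b)
{
    const struct patch *pa = a, *pb = b;
    if (pa->bank != pb->bank)
        return pa->bank - pb->bank;
    return strcmp(pa->name, pb->name);
}

int main(void)
{
    struct patch lib[] = {
        { "Tubular", 2 }, { "Brass 1", 1 }, { "E.Piano", 1 },
    };
    size_t n = sizeof lib / sizeof lib[0];

    qsort(lib, n, sizeof lib[0], cmp_patch);   /* library sort, your data */

    for (size_t i = 0; i < n; i++)
        printf("%d  %s\n", lib[i].bank, lib[i].name);
    return 0;
}
[/code]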
 
I just looked it up and yes..........as I thought....
even the Python interpreter is written in "C"....

"C" is "C" and there is noooooo substitute.....(or......... what do you think?)

ziloo :mrgreen:
 

Python is just a collection of features from other popular languages, put together to create Python.

I would have said there was no substitute for Forth! :D
 
Back in the day, Ada was the thing at the Pentagon. Might still be around in some areas.

Still is, if you're into mission-critical applications. DoD dropped the Ada mandate in 1997, but there's still a lot of stuff out there written in it; for example, the ISS flight software.

The problem with Ada was one of availability to the general programming community. I don't think there was a publicly available (free) compiler until the late 90s. There are a number of Ada-for-microprocessor initiatives, including at least one RISC-V flavor, so it's doable. But then you have the vendor's development suite libraries written in C, so it's not really a great option today.

It's somewhat the same situation as back in the 1960s with Algol and FORTRAN. Algol was by far the superior language in terms of features and readability, but FORTRAN was "cheap and dirty" and you could get it for just about any platform, including dinosaurs like the IBM 650 (FORTRANSIT). ACM allowed both FORTRAN and Algol in its "Collected Algorithms" periodical to bridge the gap (Europe took more to Algol, while the US was FORTRAN-centric).
 