A programmer's view
After over 30 years in this field, you have to love some of the oddities that go on in it. Back in the 80's it was the technology bigots, who could find dozens of reasons to explain why all the computers they didn't own and had never used were worse than the one they only played games on anyway. The old war between the Commodore VIC-20 and the Radio Shack "Trash-80" (TRS-80 was its real name). This is what got me into Atari machines: a need to actually understand the other guy. In the end I replicated all of the Atari and Commodore functionality on my "Trash-80", and I did it in a modem-based BBS for all the world to see. I can't tell you how many Commodore users lost their minds when they discovered the BBS they were falling in love with had actually tricked them into believing it was the same machine as theirs! At the end of the day, that debate was more about those who knew the inner workings versus the tech wanna-bes who just wanted to sound like they knew.
Through the late 80's it became language bigotry! The BASIC programming language was for Beginners. Only 4th-generation languages like dBase would stand the test of time! Or so they thought, so again I learned what the tech wanna-bes were making all the fuss about. dBase and Clipper were really great tools for building database applications because they were simpler to work with, especially for large systems, and if you had enough personal drive to want more or to do it better, you could push the language into some pretty cool areas. At the end of the day, though, it was just a database language and of little use for anything else. C++ was also gaining momentum at the time, but I took issue with it. I already knew Assembler, and the performance gains of Assembler versus C++ were substantial; contrary to popular belief, object-oriented coding doesn't make more things possible, it just requires less thought to produce the same results. Remember what most of the world forgets: EVERYTHING a computer does happens in machine code, regardless of what you wrote it in. The other issue I had with C++ was that, to me, C++ wasn't a complete language. Technically speaking it was complete, but to work with anything you always needed someone else's SDK, and in the Windows environment you needed an SDK for just about everything. This meant that even if you knew the core C++ language inside and out, you were functionally useless until you learned all of the SDKs too. I did break down eventually and learned C++ as well as MFC, and did some STL, but the pressure was on to learn Java, so I got to work on that.
By 2001 the world was crazy over Java, so I learned that. The core language was little more than a C++ variant with custom classes. Any desktop work you did with it didn't look or work the same as any other Windows application, developer GUIs for the language were massively expensive, and some of the GUI class implementations were complicated and convoluted beyond reason. You ended up doing more work to put a text box on the screen in Java than in any other language. Add to all of this that it was an interpreted language, which produced highly questionable results if you didn't control the runtime installed on everyone's machines. Companies were nowhere near the level of installation management they needed to be, so it became well known that anything with Java in it would behave unpredictably. Finally, there was nothing that could be done in Java that I couldn't already do in LotusScript or C++, so my Java knowledge died of a complete lack of need, and it wasn't until late 2009 that I got back into it.
Learning to generate TEXT
After 3 years of courses at Ryerson studying the Java language and J2EE, I had finally learned to do in Java exactly what I had already done in all of my previous languages over the years: use code in one language to generate code for something else. Today we call it code injection, and it's one of those really cool terms we toss around, but simply translated it means using some data input source to compute a decision about how to generate output data using program code. ANY language that can generate a line of text can do it.
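The idea can be sketched in a few lines of any language; here is a minimal illustration in Python (the record shapes and the table name are made up for the example): the program reads a data source, makes decisions from it, and emits generated code as plain text.

```python
# Hypothetical data source: descriptions of fields we want in a table.
records = [
    {"field": "name", "type": "TEXT"},
    {"field": "age", "type": "INTEGER"},
]

# Decide from the input data how to format each line of output,
# then emit the generated code -- it is just text in the end.
columns = ",\n    ".join(f"{r['field']} {r['type']}" for r in records)
sql = f"CREATE TABLE people (\n    {columns}\n);"
print(sql)
```

The output is a valid CREATE TABLE statement, but the same pattern generates Clipper routines, Java classes, or report layouts equally well, because all of them are lines of text.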
In the 80's I examined database data to decide how to format text into a printed report. Back then there were no printer drivers, so we also had to generate the commands to turn printer features like bold, italic, and underline on and off. I even had to reformat the output for different sizes of paper.
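Those printer commands were themselves just more text in the output stream. A small sketch of the technique, using the Epson ESC/P control codes common in that era (ESC E for bold on, ESC F for bold off); the heading text is invented for the example:

```python
# Epson ESC/P era control codes: ESC E = bold on, ESC F = bold off.
ESC = "\x1b"
BOLD_ON, BOLD_OFF = ESC + "E", ESC + "F"

def heading(text):
    # Wrap a report heading in the printer's bold-on/bold-off commands;
    # the "driver" is nothing more than string concatenation.
    return f"{BOLD_ON}{text}{BOLD_OFF}"

line = heading("SALES REPORT - 1986")
```

Sending `line` straight to the printer port would print the heading in bold; a different printer simply meant swapping in a different set of code strings.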
As I type this I'm reminded of the numerous times through the late 80's that I wrote QBASIC code to read dBase databases and automatically generate the common Clipper code I used in my search-by-example routines. It's all the same: using code to generate text. ANY language can do it, but we have long since lost sight of the very basic fact that it is just TEXT.
Now I do hear the folks out there who are shocked to be told their tools have been so horribly simplified, and yes, they're right: there are many other things going on, like multi-threading, security, N-Tier architectures and SOAP.
N-Tier architecture is an important part of things, but I've been doing that since the 80's in various languages too. It's simply breaking programs down into smaller, logical pieces and coding each piece to perform the same task for any number of other programs or modules, rather than hard-coding the same module into numerous programs.
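The tiering idea fits in a few lines; this is a toy sketch (the order data and thresholds are invented), where each layer knows only about the layer below it and can be reused by anything above:

```python
# Data tier: the only code that knows where records live.
def fetch_orders():
    return [{"id": 1, "total": 40.0}, {"id": 2, "total": 60.0}]

# Business tier: rules only, with no knowledge of storage or display.
def orders_over(threshold):
    return [o for o in fetch_orders() if o["total"] > threshold]

# Presentation tier: formatting only -- generating text, again.
def render(orders):
    return "\n".join(f"Order {o['id']}: ${o['total']:.2f}" for o in orders)

print(render(orders_over(50.0)))
```

Swap the data tier for a real database, or the presentation tier for a web page, and the business tier never changes; that reuse is the whole point of splitting into tiers instead of hard-coding one monolith.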
Multi-threading has been around since the very early 80's in lower-level languages like Assembler and C/C++. It was coveted by VB programmers, so it became part of all the languages in the .Net group. Not to be outdone by .Net, Java also got multi-threading features, but I've yet to see anyone code for it effectively, even though all we're doing these days is generating text. You need to be very strict about your architecture and task compartmentalization before multi-threading can be effective, and most are focused on just making it work.
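What compartmentalization means in practice can be shown in a short sketch (the section names are illustrative): each worker owns its own input and output and shares no mutable state, so the threads cannot interfere with one another, and the results are only merged after every worker has finished.

```python
from concurrent.futures import ThreadPoolExecutor

def render_section(title):
    # Pure function: reads only its argument, writes only its return
    # value -- nothing shared, so no locks and no race conditions.
    return f"== {title} ==\n(generated text for {title})"

titles = ["Header", "Body", "Footer"]
with ThreadPoolExecutor(max_workers=3) as pool:
    sections = list(pool.map(render_section, titles))  # order preserved

document = "\n".join(sections)
```

When a task decomposes this cleanly, threading is trivially safe; when it doesn't, no amount of "just making it work" will keep the threads from stepping on each other.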
Security, of course, has been around forever, and numerous movies like "WarGames" have come out over the years to teach us the dangers of ignoring it. I can distinctly recall having to code my BBSs to deal with Octal 9 attacks that would blow an integer input, or "string too large" attacks, both of which would crash my BBS to code level. That was back in the 80's. Surprisingly, these days most leave security to built-in language processes with no checks. No backend code is used to verify whether the referring URI is legitimate or whether the user is properly logged in as they move from module to module or page to page. The assumption is that if someone got far enough to execute the code, they must be legitimate. Sadly, we already know from experience that's not always true, so as much as we like to cry "Security" in discussions and debates, it's often quite a different story in practice.
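The checks being skipped are not elaborate. Here is a hedged sketch of the idea (the request and session shapes, and the trusted host, are hypothetical, not any particular framework's API): every request is re-verified for a live login and a legitimate referring URI, rather than assuming that reaching the code proves legitimacy.

```python
from urllib.parse import urlparse

TRUSTED_HOSTS = {"example.com"}  # illustrative value

def authorize(request):
    # Re-check the login on every request, not just at the front door.
    session = request.get("session", {})
    if not session.get("user_id"):
        return False
    # Verify the referring URI actually came from one of our pages.
    referrer = request.get("referrer", "")
    if urlparse(referrer).hostname not in TRUSTED_HOSTS:
        return False
    return True

ok = authorize({"session": {"user_id": 7},
                "referrer": "https://example.com/page1"})
bad = authorize({"session": {},
                 "referrer": "https://example.com/page1"})
```

A dozen lines per module would close the "they got this far, so they must be legitimate" hole, yet in practice they're rarely written.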
Quality vs ???
There was a time when IT systems were valued by the amount of money they would save an organization. Even today the savings from this kind of valuation can be staggering. However, today IT resources are seen as a liability, a cost of doing business. The focus is to squeeze that liability as much as possible to reduce costs. The expectation is that quality will remain the same, but that simply isn't realistic in most cases. The work you cut corners on now, you'll only have to redo later. The end result can be systems that cause more work than the processes they're meant to improve, or that are so cumbersome they drive users back to manual processes. Today we see the results: manual processes growing, manual and mundane activities increasing, while the cost benefit of automation is in decline. Business sees the savings in IT expenditures but has completely lost sight of the costs associated with those savings. I can actually name a large, publicly traded GLOBAL company, headquartered in Canada, that has removed some of its software systems, leaving significant business areas to fend for themselves with hard-copy paperwork stored in binders as part of their day-to-day operations!
What is a programmer
To me, a programmer is an automation engineer. We look for ways to free humans from unworthy, mundane tasks so their intellect can be better applied to more important ones. We look for ways to reengineer existing processes to release human "work effort," as I call it, and we try to balance the work effort required to improve a process against the work effort saved BY the improvement. We're also charged with keeping an eye out for intrinsic values: those data elements that can be captured as part of a normal process, whose meanings far exceed those of the individual data nodes themselves and that speak to important business factors collectively rather than individually. We're experts in context and metaphysics, problem solving and analysis.