As a kid, I failed miserably at learning either C or C++. I did learn a bit of Basic, but my skills were relegated to whatever I could pull out of old manuals or glean from random developers in AOL chat rooms.
My issue was that all of the above languages were compiled and I didn’t have a compiler. Actually, I didn’t do any hard-core compiled programming until college when I built an astrophysics simulation in Fortran. Even then, I struggled to understand what was going on.
Still, hard-core engineers look with disdain on interpreted languages and the developers who use them.
Whenever I can get an engineer to stop mocking me long enough, I push for an explanation of why exactly they think my languages of choice are inferior. More often than not, it comes down to performance.
At the beginning of a Coursera course on cryptography, the instructor told us we'd need to learn C to keep up. I disagreed, but kept my feelings to myself and just carried on with the course.
For the first programming assignment, our instructor gave us a C program that encrypts a string of text using a modified Vigenère cipher, where each byte of the plaintext message is XORed with a byte from a randomly-selected encryption key. Again, he recommended we use C to build a program to crack the cipher, and most of the class got right to doing that.
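The scheme he described, XORing each plaintext byte against a byte of a repeating key, takes only a few lines to sketch. I'm using Python here for illustration (the course handout was in C), and the sample key is made up:

```python
def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each byte of the message with a byte of the repeating key."""
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

ciphertext = xor_encrypt(b"attack at dawn", b"\x13\x37\x42")
# XOR is its own inverse, so applying the same key again decrypts.
assert xor_encrypt(ciphertext, b"\x13\x37\x42") == b"attack at dawn"
```

The same function encrypts and decrypts, which is exactly what makes the cipher easy to attack: every byte of the key acts independently.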
At one point, our frequency analysis of potential decryption keys gave us about 70 different key combinations. The rest of the students moaned and complained and started testing different combinations by hand. One student tried to automate things and, even with only 70 combinations, complained about how long the tests took to run. Many students began picking keys at random, hoping for a lucky guess (some, miraculously, got just that).
My own script returned immediately, having processed all 256 possible values for each character in the key. No waiting, and I had the solution sitting right in front of me.
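The idea behind that brute force is simple: for each position in the key, try all 256 byte values and keep whichever one makes the decrypted bytes look most like English. The scoring heuristic and names below are my own sketch in Python, not the course's reference solution:

```python
import string

# Alphabet used to score candidate decryptions: ASCII letters plus space.
ENGLISH = set((string.ascii_letters + " ").encode("ascii"))

def best_key_byte(column: bytes) -> int:
    """Try all 256 values for one key byte; return the one whose
    decryption of this column contains the most letters and spaces."""
    return max(range(256),
               key=lambda k: sum((c ^ k) in ENGLISH for c in column))

def crack(ciphertext: bytes, key_len: int) -> bytes:
    # Bytes encrypted with the same key byte land in the same column.
    columns = [ciphertext[i::key_len] for i in range(key_len)]
    return bytes(best_key_byte(col) for col in columns)

key = b"\x13\x37\x42"
plaintext = b"the quick brown fox jumps over the lazy dog " * 5
ciphertext = bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))
assert crack(ciphertext, len(key)) == key
```

Even though it tests 256 candidates per key byte, the whole search is a few thousand XORs, which is why it finishes before you can lift your finger off the Enter key, interpreted language or not.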