Here are some questions I've been asked in "serious" interviews. Some of them actually had me laughing, some of them saddened me. I can't recall a single interview where I've been challenged, really, or been excited to be there after sitting down.
1. At one interview, I was asked to map out a classical object-oriented design problem in UML. This is the infamous "employee type" problem that a lot of first-year C++ students are walked through. The problem, though, is that it doesn't work in the real world. You can see a fairly technical explanation of why it doesn't work [here]. Being presented with this problem - which I was all too familiar with - scared me. It was my first hint that the project I was working on was doomed, and that the people running it were no different from CS students fresh out of class. The real world, and real-world programming problems, can't be reduced to the kinds of UML logic they were trying to apply.
2. "How would you get the distance of an object in OpenGL?". Man, that opened a can of worms. OpenGL itself doesn't provide any kind of object management, so "an object in OpenGL" means you're already using something on top of OpenGL, like Inventor. But for argument's sake, let's say by "object" we mean "point". That much OpenGL understands. A point in 3D space is defined by its X, Y, and Z coordinates, with Z being the depth coordinate. Now, by asking for the distance, what they really meant was a vector from the "object" to something else, and the length of that vector. Imagine drawing a line between the object and another object; that's your vector. Since they said "the distance of an object", it's probably reasonable to assume they meant the distance from the observer (the eye point, which defines the view frustum) to the object. Take the point that's the object, take the point that's the observer, and find the length of the vector between them. Easy, once you understand the problem. It turns out that I lost the two interviewers right about at "OpenGL doesn't really have 'objects'...". They had no idea what vector math was, and by the time I finished up with the whiteboard they were staring off into space or looking over the only reading material they had available: my resume. When I said "view frustum" I swear one of them started scratching himself like I'd given him a rash. It was pretty clear that I was being interviewed for a 3D position by people with no 3D experience.
3. "Can we see it?" On at least two occasions I've been asked in interviews to provide code that was proprietary or that I'd written under a non-disclosure agreement (NDA). Before I open my mouth about something like that, I usually have a very good idea of what I can talk about and what I can't. There is one contract I worked on where I wasn't supposed to even mention I was on it at all, the NDA was that restrictive. In other cases, I'm free to talk about any number of aspects of a project, but I certainly could never show someone the code I wrote under contract! How would you feel if I showed the source code for your product to another company? Geez. In my book, "can we see proprietary code you've written" is one of the most inappropriate questions you can be asked, if they know it's proprietary. [ 2/28/2004 06:10:29 PM ]
Monday, February 23, 2004
Recently I've become more aware of just how "old school" my programming experience and techniques are. I have a heavy background in C (and even in assembly), which formed the foundation of most of my programming experience - even though long before I knew any C, I was up to my eyeballs in LISP, Logo, BASIC, and other ancient but high level languages.
Almost all of the programmers I know professionally now have no C experience. To me, Perl, PHP, and HTML are not "real" programming languages. They're scripting languages. Put the average Perl programmer in front of an actual C (or Java) program and they will be hopelessly lost. I have nothing against Perl programmers; I'm just saying that working with a scripting language is very different from working with a compiled language.
Reading [this] brought this back to the forefront of my mind. Some of the author's comments are thoughtful, but some of them stood out to me as being very "new school". Yes, it kinda sucks that C doesn't have its own string type, but C wasn't really made for working with strings. LISP was, Java was, Perl was. C was not. C++ tried to be everything to everyone, and has string functions up the wazoo, but that doesn't make up for how bad C++ is (after all, C++ is just an extension of C).
If anything, learning C makes you a better programmer. Yes, you can blow up the whole system with a typo. Sure, you can do very, very bad things. Call it negative reinforcement. A lot of my time working in C was spent on the classic MacOS. Under MacOS 7.x, you certainly didn't have protected memory and the other niceties of a "modern" operating system. When you screwed up, even in fairly innocent ways, you'd lock up the machine. Macsbug was your friend, even if the only shell commands you knew were "rs" and "es". Believe me, I screwed up all over the place. I often said my job was "finding new and interesting ways to hose the machine", because it really was.
Knowing that if you screw up you're going to spend the next half hour or so rebooting and recovering can make you pay more attention. It really makes you see the wisdom of "measure twice, cut once". I got into the habit of doing things like making sure there was enough free memory before I allocated some, and checking to see if a file was writable before I opened it. I stopped making blind casts and stopped assuming a pointer contained what I thought it did. I looked for functions that return status or error codes.
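To make the habit concrete, here's a small sketch of what "check everything" looks like in C. The helper `copy_string_checked` is hypothetical, my own illustration rather than anything from a real codebase; the point is that every call that can fail gets its return value checked instead of assumed:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper illustrating the habit: never assume an
 * allocation succeeded -- check every return value. The same
 * discipline applies to fopen(), fwrite(), and friends. */
char *copy_string_checked(const char *src)
{
    if (src == NULL)              /* don't dereference a pointer blindly */
        return NULL;

    size_t len = strlen(src) + 1;
    char *dst = malloc(len);      /* malloc can fail: check it */
    if (dst == NULL) {
        fprintf(stderr, "out of memory allocating %zu bytes\n", len);
        return NULL;
    }
    memcpy(dst, src, len);
    return dst;
}
```

On a machine without protected memory, skipping either of those checks was a reboot waiting to happen; on a modern OS it's "merely" a crash or a subtle bug, but the discipline is the same.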
Now, a few years later, I still program that way, though not in C (I do very little C programming these days). In the past few months, on MacOS X and on BeOS, I've had to port some C and C++ code over from POSIX, BeOS, etc. to the target platform. Some of the things I see in other people's code really get me scratching my head (particularly in C++). There are some people out there who probably have a nice little certificate hanging on their wall that says they got some degree or took a course or whatever, but it's plain from their code that they didn't learn much of anything. Just because the compiler lets you build it doesn't mean it was a good idea.
Anyway, I'm rambling. The point here is that I'm of the very old school - I prefer strongly typed languages where it's very much my responsibility to be aware of what's going on under the hood. In contrast, a lot of the code I see being produced today, and the people writing it, treat the machine as a black box that does magic for them. Or maybe they just don't know, or care, what's going on under the hood as long as they get their paycheck. [ 2/23/2004 06:03:04 PM ]