Microsoft has many flaws. So does Windows. I'm happy to admit these. But don't fucking make up shit that isn't true.
I guess what pisses me off about the whole thing is that it's a deliberate misinterpretation. The guy deals with Java, so he should know better. For the non-technical in the audience, let me quickly explain the issue:
In the days of yore, everything was written in assembly code, which is about half a baby step away from writing programs in ones and zeros. Nothing was done for you, and you wrote code that ran directly on the hardware (i.e., you wrote exactly the instructions that the processor executed). The problem with this is that a) it's hard, and b) correspondingly difficult to get right.
Then god created C, a compiled language. C allowed you to write programs using more easily understood abstractions like variables, function calls, etc., instead of directly telling the hardware what to do. You wrote a program in C, then you ran it through a compiler, and the compiler generated the actual machine instructions for you. These days, unless you're programming a very small, very weak computer (or writing a very particular part of an operating system), you'd be an idiot to write assembly code.
But the problem with C is that you can still do just about anything. With great power comes great responsibility. In C, you can directly access any piece of memory you want to. It's all just bytes. C does give you some abstractions if you want them, but you're generally free to ignore them and flip whatever bits you want to. This fact is, indirectly, the cause of the vast majority of security vulnerabilities (buffer overflows being the classic example).
So then, the language gods gave us what are called type-safe languages like Java (from Sun) and C# (from Microsoft, and what I most often program in). In type-safe languages, you can no longer directly access memory. To use a bizarre metaphor, if you are playing around with a walrus, you can't suddenly decide you'd rather interact with it as if it were a BMW M3. It's a walrus, and it will always be a walrus, and if you want an M3, you have to go get it from somewhere else. This turns out to be a really great thing, because no matter how stupid you are, you can't accidentally treat the walrus like an M3 because you got confused as to what you had in front of you. You're inescapably forced to treat a walrus like a walrus, and, for instance, no virus writer can trick you into thinking your walrus is an M3.
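To make the walrus business concrete, here's a minimal C# sketch (Walrus and BmwM3 are made-up types, purely for illustration). The compiler won't let you cast one to the other, and if you try to launder the cast through object, the runtime catches you at the moment you try:

    // Two made-up types for the sake of the metaphor.
    class Walrus { public void Swim() { } }
    class BmwM3  { public void Drive() { } }

    class TypeSafetyDemo
    {
        static void Main()
        {
            object thing = new Walrus();

            // This doesn't even compile: a Walrus is not a BmwM3.
            // BmwM3 car = (BmwM3)new Walrus();

            // And if you try to sneak it past the compiler by going
            // through 'object', the runtime refuses at the cast:
            try
            {
                BmwM3 car = (BmwM3)thing;   // throws InvalidCastException
                car.Drive();
            }
            catch (System.InvalidCastException)
            {
                System.Console.WriteLine("Nope. Still a walrus.");
            }
        }
    }

That's all type safety really is: the runtime never loses track of what a thing actually is, and it refuses to pretend otherwise.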
So what Gosling is complaining about is the fact that there's a way, in .NET, to flip a switch that turns type safety off. It turns out that sometimes, mostly for backwards compatibility, you need to be able to ignore type safety, so Microsoft put a way into .NET to let you do that. But by default you can't, and you have to have some pretty strong security rights to be able to flip that switch. An applet downloaded off the web from a mischievous web site creator can't flip it, for instance.
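For the developers in the audience, here's a rough sketch of what flipping that switch looks like from the code side. You have to mark the code unsafe, and the compiler flat-out refuses to build it unless you also hand it the /unsafe switch. Inside that block you get C-style pointers back, which is exactly the walrus-as-M3 business that type safety normally forbids:

    // Will not compile without the /unsafe compiler switch, e.g.:
    //   csc /unsafe UnsafeDemo.cs
    class UnsafeDemo
    {
        unsafe static void Main()
        {
            int number = 42;

            int* p = &number;      // a raw pointer to the int's memory
            *p = 1337;             // poke the bytes directly

            // Reinterpret those same four bytes as a float -- treating
            // the walrus as an M3, which safe code can never do.
            float asFloat = *(float*)p;

            System.Console.WriteLine(number + " reinterpreted as float: " + asFloat);
        }
    }

And even then, under the .NET security model, unverifiable code like this needs full trust to run at all, which is why the mischievous applet never gets the chance.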
So, if you're an idiot, you can, as a developer, flip that switch in something you write. And if you're dumb enough to do that without considering the consequences, yes, it can be a security hole. But the decision to flip that switch has to be very deliberate, and as I said, it's off by default. So, saying that this ability, called the "unsafe flag", is a giant security hole is like saying that locks having keys is a security hole, because a key can be used to unlock the lock and then, well, anyone can march right into your building!
To which we reply: yeah, well, but there are some times when you do, actually, want to let people into your building in a carefully controlled way, and if you're dumb enough to leave it unlocked the rest of the time, well, that's your own damn fault and not the lock's, now isn't it? A building that you can never open the doors to is admittedly quite safe, but not terribly useful, don't you think?