Do NOT simplify when you are explaining!

Or at least make it very clear when you do so! Simplifications and metaphors are dangerous because they hide details that the person using them deemed unimportant. But you can’t know in which context your explanation will be read (if you are posting it on the Internet), so be sure to always include a warning when you use such techniques.

What prompted this post was an article over at ITToolbox which claimed that files can be deleted under Linux while in use because, when Linux runs an executable, it loads the whole file into memory and accesses it from there. This is incorrect, and it was pointed out in many of the comments. However, what really made me write this post was the author's defense (posted as a comment):

I wrote this article to be a very simplistic model. I primarily
aimed this at people who have no understanding of the
technicalities of file systems. Before I wrote this I did a
quick google and found no comparable article so I hammered out a
few lines. While I agree that it is not to the letter correct
technically I believe it is correct in a general view.

Now there are two possibilities:

  • The author did not know the exact technical details of the mechanism and offered his understanding of it. This is perfectly fine, but when the more correct explanation was posted in the comments, he should have updated the post with it. There is no shame in not knowing something, only in trying to hide it!
  • The author did in fact know the technically correct explanation and, as he stated, was trying to offer a simplistic explanation for not-so-technical people. In this case, as I’ve said earlier, he should have indicated this clearly and maybe included the correct explanation in the post for the people who do want to know. By not doing so he misleads users who are technically savvy but have no experience with Linux into believing that Linux has an inferior swapping mechanism or that it consumes more memory, as shown in the following comment:

    I am a Windows user and suddenly started liking Linux flavours on
    its features.
    You say that ‘When linux runs an executable it loads the whole
    file into memory and accesses it from there. This means that
    there is no connection to the physical file on the disk drive.
    When the program is closed and all connections to the file are
    cut the file is deleted from memoy’ will this require a lot of
    memory when compared to a windows machine ??

BTW, here is the correct explanation, as written by Bley in the comments:

This is incorrect. Linux will *not* load a program into memory
on startup. It will load pages of the program from the disk
file into memory as needed, and discard them when they’re no
longer needed (or when the kernel needs memory for other
programs); that’s OK, as it can always recover them from the
disk file.

There is another (subtler) issue. Under Unix (and Linux), files
aren’t actually deleted until they’re no longer in use. So if
you have a program that keeps a file open and then you delete
it, you won’t be able to see the file in the directory listing,
but the file is still there; the running program will be able to
continue to see it as if it was still there. The file gets
deleted when it’s no longer in use by any programs. The same
applies to the kernel wrt executables: if you delete an
executable while it’s running, the kernel will defer the actual
deletion of the file until the program exits.

There are still issues that make updating more complicated than
it seems. For example, when updating a package, you need to
update multiple files: executables, libraries, configuration
files, data files, and so on. If the executable in the package
is running, it will continue to see the old version of the files
that it has open at the time of the update. That’s OK; it’s the
old version of the executable that’s running at that time. But
if it needs to open new files, it will open the newly updated
version, and this may cause unexpected results (crashes, etc)
because of version inconsistencies.
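Bley's point about deferred deletion is easy to observe from user space. Here is a minimal Python sketch (my own illustration, not from the post or the comments), assuming a POSIX system such as Linux: after `os.unlink`, the directory entry disappears immediately, but the data remains readable through the already-open handle until the last reference is closed.

```python
import os
import tempfile

# Minimal demo (assumes a POSIX system such as Linux; on Windows the
# unlink below would fail because the file is still open).
dirpath = tempfile.mkdtemp()
path = os.path.join(dirpath, "demo.txt")
with open(path, "w") as f:
    f.write("still here")

handle = open(path)   # keep the file open, like a running program would
os.unlink(path)       # "delete" it: only the directory entry goes away

gone = not os.path.exists(path)  # the file no longer shows up in a listing
data = handle.read()             # but the open handle still sees the data

print(gone, data)   # True still here
handle.close()      # last reference dropped: now the kernel frees the file
os.rmdir(dirpath)
```

The same mechanism is what lets a package manager replace a running executable: the old inode lives on, invisible, until the process exits.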

So, Digg – wisdom of the crowds? Not always…

Update: the author of the article included an update in it which clarifies the technical part.
