FTP - Untrustworthy? I Don't Think So!
Lately, as if writers all draw from the same shrinking paddling-pool of ideas, I've noticed a batch of stories about how unsafe, insecure and untrustworthy FTP is.
SC Magazine says so.
First it was an article in the print version of SC Magazine, sadly not repeated online, titled "2 Minutes On... FTP integrity challenged", by Jim Carr. I tried to reach Jim by email, but his bounce message tells me he doesn't work for SC Magazine any more.
This article was full of interesting quotes.
"8,700 FTP server credentials were being used to access and infect more than 2,000 legitimate websites in the US". The article goes on to quote Finjan's director of security research who says they were "most likely hijacked by malware" - since most malware can do keystroke logging for passwords, there's not much can be done at the protocol level to protect against this, so this isn't really an indictment of FTP so much as it is an indication of the value and ubiquity of FTP.
Then we get to a solid criticism of FTP: "The problem with FTP is it transfers data, including authorization credentials, in plain text rather than in encrypted form, says Jeff Debrosse, senior research analyst at security vendor ESET". Okay, that's true - but the same problem applies just as much to plain HTTP.
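To see what "plain text on the wire" actually means, here's a toy sketch of what a passive eavesdropper sees during an unencrypted FTP login - the captured bytes are fabricated for illustration, but this is the literal format of the FTP control channel:

```python
# Toy illustration: plain FTP sends USER and PASS literally on the wire,
# so anyone who can read the packets can read the credentials.
# This capture is fabricated, but matches the real control-channel format.
captured = (
    b"220 Service ready\r\n"
    b"USER alice\r\n"
    b"331 Password required\r\n"
    b"PASS hunter2\r\n"
    b"230 Logged in\r\n"
)

creds = {}
for line in captured.split(b"\r\n"):
    for cmd in (b"USER", b"PASS"):
        if line.startswith(cmd + b" "):
            creds[cmd.decode()] = line.split(b" ", 1)[1].decode()

print(creds)  # {'USER': 'alice', 'PASS': 'hunter2'}
```

No decryption, no cleverness - the credentials are just sitting there in the stream, which is exactly the criticism being made (and exactly the criticism that applies to HTTP, telnet, and every other plaintext protocol of that era).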
Towards the end of the article, we return to Finjan's assertion that malware can steal credentials for FTP sites - and as I've mentioned before, malware can get pretty much any user secret, so again, that's not a problem that a protocol such as FTP - or SFTP, HTTP, SSH, SCP, etc - can fix. There's a password or a secret key, and once malware is inside the system, it can get those credentials.
Fortunately, the article closes with a quote from Trent Henry, who says "That means FTP is not the real issue as much as it is a server-protection issue."
OK, but a ZDNet blogger says so, too.
Well, yeah - a recent ZDNet blog entry, on a blog about storage rather than networking or security (rather like getting security advice from Steve Gibson, a hard-drive expert), rants about how the author's web site got hacked (through WordPress, not FTP), and how, as a result, he's taken to heart a suggestion not to use FTP.
Such a non-sequitur just leaves me breathless. So here's my take:
FTP Has Been Secure for Years
But some people have just been too busy, or too devoted to other solutions, to take notice.
FTP first gained secure credentials with the addition of support for SASL and S/Key. These are mechanisms for authenticating users without passing a password or password-equivalent (by "password-equivalent", I include schemes where a hash is passed as proof that you hold the password - an attacker can simply copy and replay the hash instead of the password). These additional authentication methods give FTP the ability to check identity without jeopardising the security of the identified party. [Of course, prior to this, there were IPsec and SOCKS solutions that work outside the protocol.]
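The idea behind S/Key-style one-time passwords is worth a quick sketch. This is a simplified illustration of the hash-chain principle, not the actual RFC 1760 algorithm (which uses MD4 and a server-supplied seed); I've assumed SHA-256 here purely for clarity:

```python
import hashlib

def h(data: bytes) -> bytes:
    """One step of the hash chain."""
    return hashlib.sha256(data).digest()

def hash_chain(secret: bytes, n: int) -> bytes:
    """Apply the hash n times to the secret."""
    value = secret
    for _ in range(n):
        value = h(value)
    return value

# Setup: the server stores only h^N(secret); the secret itself
# never crosses the wire, at setup or at login.
N = 100
secret = b"correct horse battery staple"  # hypothetical user secret
server_stored = hash_chain(secret, N)

# Login: the client sends h^(N-1)(secret) as the one-time password.
otp = hash_chain(secret, N - 1)

# Verification: hashing the response once must reproduce the stored value.
assert h(otp) == server_stored

# The server then replaces its stored value with the response, so the
# same one-time password can never be replayed.
server_stored = otp
```

An eavesdropper who captures the one-time password learns nothing useful: deriving the *next* valid password would mean inverting the hash. That's the sense in which these schemes avoid passing a password-equivalent.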
OK, you might say, but that only protects the authentication - what about the data?
FTP under GSSAPI was defined in RFC 2228, which was published in October 1997 (the earliest draft copy I can find is from March 1995), from a draft developed over the preceding couple of years. What's GSSAPI? As far as anyone really needs to know, it's Kerberos.
This inspired the development of FTP over SSL in 1996, which became FTP over TLS, and which finally became RFC 4217. From 1997 to 2003, those of us in the FTPExt Working Group were wondering why the standard wasn't yet an RFC, as draft after draft was submitted with small changes, and then apparently sat on by the RFC Editor - during this time, several interoperable FTP clients, servers and proxies were produced that supported FTP over TLS (and/or SSL).
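Explicit FTP over TLS - RFC 4217's "AUTH TLS" mechanism - is readily available today; Python's standard library exposes it as `ftplib.FTP_TLS`. A minimal sketch, with the host name and credentials obviously placeholders:

```python
from ftplib import FTP_TLS

def secure_listing(host: str, user: str, password: str) -> list[str]:
    """Connect with explicit FTPS (RFC 4217) and list the home directory."""
    ftps = FTP_TLS()
    ftps.connect(host, 21)      # plain control connection first
    ftps.login(user, password)  # login() issues AUTH TLS before sending USER/PASS
    ftps.prot_p()               # "PROT P": encrypt the data channel as well
    try:
        return ftps.nlst()      # listing travels over the protected data channel
    finally:
        ftps.quit()

# Usage (host and credentials are hypothetical - substitute your own server):
# print(secure_listing("ftp.example.com", "user", "password"))
```

Note the two distinct steps: AUTH TLS protects the control connection (so credentials are never sent in the clear), and PROT P protects the data connections - which answers both the plaintext-credentials and the plaintext-data criticisms quoted earlier.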
Why so long from draft to publication?
One theory that was raised is that the IETF were trying to get SSH-based protocols such as SFTP out before FTP over TLS (which has become known as "FTPS", for FTP over SSL).
SFTP was abandoned after draft 13, which was made available in July 2006; RFC 4217 was published in October 2005. So it seems a little unlikely that this is the case.
The more likely theory is simply that the RFC Editor was overworked - the former RFC Editor, Jon Postel, died in 1998, and it's likely that it took some time for the new RFC Editor to sort all the competing drafts out, and give them his attention.
What did the FTPExt Working Group do while waiting?
While we were waiting for the RFC, we all built compatible implementations of the FTP over TLS standard.
One or two of us even tried to implement SFTP, but with the draft mutating rapidly, and internal discussion on the SFTP mailing list indicating that no-one yet knew quite what they wanted SFTP to be when it grew up, it was like nailing the proverbial jelly to a tree. Then the SFTP standardisation process ground to a halt, as everyone lost interest. This is why getting SFTP implementations to interoperate is sometimes so frustrating an experience.
FTPS, however - that was solidly defined, and remains a very compatible protocol with few relevant drawbacks. Sadly, even FTP under GSSAPI turned out to have some reliability issues (the data transfer and the control connection, though over different asynchronous channels, share the same encryption context, which means that the receiver must synchronise the two asynchronous channels exactly as the sender did, or face a loss of connection) - but FTP over TLS remains strong and reliable.
So, why does no-one know about FTPS?
Actually, there's lots of people that do - and many clients and servers, proxies and tunnels, exist in real life implementations. Compatibility issues are few, and generally revolve around how strict servers are about observing the niceties of the secure transaction.
Even a ZDNet blogger or two has come across FTPS, and recommends it, although of course he recommends the wrong server.
WFTPD Pro. Unequivocally. Because I know who wrote it, and I know what went into it. It's all good stuff.