Overkill for many sites
I always recommend using appropriate security, and most of the time you do not need sftp.
The Chromium team has finally done it – File Transfer Protocol (FTP) support is not just deprecated, but stripped from the codebase in the latest stable build of the Chrome browser, version 95. It has been a while coming. A lack of support for encrypted connections in Chrome's FTP implementation, coupled with a general …
You would generally use a file transfer system as part of a tool chain that is used in a process. Starting with a fundamentally unsecured system that can be readily exploited is fairly difficult to fix when security just happens to be required.
Make everything zero-trust. That way you avoid one system's failure becoming a Maginot Line failure.
Simple examples: power distribution systems used FTP and FTP-like protocols, and then they suddenly became distributed. You can't even change the password, because it was appropriate security 50 years ago. Router boxes are another example.
I wonder where FTP would be appropriate vs. SFTP. Just opening up the port is grounds for dismissal. I see small devices use it for firmware upgrades. Unsigned firmware upgrades, on top of that.
And thus it failed. The line was intended to be impassable, or at least very difficult to pass, and it probably would have been pretty good had someone tried to assault it directly. However, because it was possible to bypass it, it ended up not doing what it was designed for, and being effort wasted. Its only benefit was delaying troops by a few days, and it used resources which could have been used in making a more vigorous defense against them.
The German paratroopers also penetrated Fort Eben-Emael, which was disconcerting for the French and the Belgians because the Germans turned around and used it to control everything the fort was supposed to. For the wrong side.
Without zero trust and the current shift from perimeter based architecture to a mesh system, I can’t even imagine any justification for ftp.
FTP depends on everyone else doing the right thing.
"The line was intended to be impassable"
It was meant to be impassable on the German-French border. Belgium and the Netherlands were neutral, and France was still under the mistaken impression that other countries were playing by the informal rules of war that were basically centuries-old gentlemen's agreements. Hitler (there goes Godwin's law!) just ignored the neutrality and invaded Belgium and the Netherlands to get around the Maginot.
I guess that's enough off topic!
Godwin's "Law" isn't actually a law; it's an adage. It reads: "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1."
And that is all. Mike was just trying to get people to think before making daft comparisons to Hitler and the Nazis; he was not suggesting any mention should somehow automagically close down the conversation.
No, no, in the case of the Netherlands he didn't ignore neutrality, he simply made up some stuff about the Netherlands having violated its own neutrality by cooperating with British intelligence officers and German officers conspiring against his government, thus making his invasion of the Netherlands "totally legitimate" (see the Venlo incident: https://en.m.wikipedia.org/wiki/Venlo_incident. This incident also contributed to the forming of the SOE by Churchill, then later das Englandspiel and the general mistrust of Dutch resistance by the SOE, culminating in the failure of Operation Market Garden.)
The instance has a dedicated purpose, your payload is itself encrypted, and you're using both IP restrictions and a form of digest-based authentication. It's also worth keeping in mind that SFTP is not automatically better for security if we are talking OpenSSH vs. vsftpd: the latter has remained very static in feature set, has been widely audited, and thus has always had very few bugs.
Should you need TLS for your old FTP implementation, you always have the option of forcing FTPS in modern FTP daemons; or, if you're stuck with outdated crap, you can either IPsec your way around it or use stunnel.
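On the client side, explicit FTPS is already in Python's standard library. A minimal sketch (the host and credentials in the usage comment are made up):

```python
import ssl
from ftplib import FTP_TLS  # stdlib explicit-FTPS client


def make_ftps() -> FTP_TLS:
    # a default context verifies the server certificate
    # against the system CA store and checks the hostname
    return FTP_TLS(context=ssl.create_default_context())


# usage (not run here; host and credentials are placeholders):
# ftps = make_ftps()
# ftps.connect("ftp.example.com", 21)
# ftps.login("user", "password")
# ftps.prot_p()   # encrypt the data channel too, not just the control channel
```

The `prot_p()` call matters: without it many servers encrypt only the control connection, leaving file contents in the clear.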
"I wonder where FTP would be appropriate vs. SFTP"
Literally any anonymous download from the Internet. There is zero reason to encrypt publicly available information, and encryption does not help with file verification. People who understand the purpose of encryption can come up with many scenarios which fit your question, while a significant number of "security professionals" are clueless and talk about zero trust to distract from their lack of understanding.
"Literally any anonymous download from the Internet. There is zero reason to encrypt publicly available information,"
There are several. Here are a few of them:
1. "encryption does not help with file verification.": It does prevent a listener from modifying the content while keeping it valid. Most edits will break the decryption and alert the user rather than corrupting the file, and even if attackers can corrupt the file, they are unlikely to be able to inject meaningful new data into it.
2. Privacy: If I'm downloading a file which anyone can download, but it's encrypted, then an attacker doesn't know exactly which file I'm viewing. This may be of interest to me. The degree of privacy still depends on other factors, as they can usually get the domain I'm downloading from, but there are some plans to encrypt that as well.
3. It prevents meaningful injection of other data, which means that, for files which can't be verified (a standard page never comes with a SHA-1 hash), you're not getting an attacker's replacement instead.
4. If you do have a verifiable file, but the hashes are also retrieved unencrypted, the attacker could replace the file you're downloading and the hashes when you retrieve them so they do match.
If you need something which can talk to something old or something that really can't do encryption because it's so weak, FTP is tested as a protocol. Otherwise, there are reasons to want something that protects the user.
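Point 4 is easy to demonstrate: if the hash travels over the same unauthenticated channel as the file, an attacker who rewrites both leaves nothing for the client to notice. A sketch (the "installer" strings are stand-ins for real content):

```python
import hashlib


def verify(data: bytes, expected_hex: str) -> bool:
    # the standard "check the published hash" ritual
    return hashlib.sha256(data).hexdigest() == expected_hex


original = b"legitimate installer"  # placeholder content
published_hash = hashlib.sha256(original).hexdigest()

# an attacker on the path rewrites BOTH the download and the hash page
evil = b"backdoored installer"
evil_hash = hashlib.sha256(evil).hexdigest()

assert verify(evil, evil_hash)           # the check passes on the attacker's file
assert not verify(evil, published_hash)  # only an out-of-band hash would catch it
```

The hash only helps if it arrives over a channel the attacker can't rewrite, which is the point being made.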
You're confusing your use cases with every use case. None of your points makes any sense for 99% of FTP traffic on the Internet, and your ultra-paranoia that somehow there's a man-in-the-middle attack injecting dogs into your cat pictures is ridiculous security-industry bullshit.
If someone has the skill, and most importantly motivation to hijack one of the routers between an Internet server and an end user then it's pretty trivial to also insert their own TLS without that user noticing, making the extra layer pointless.
You are sort of correct that almost all FTP traffic in use doesn't come under the use case I suggest. However, that's most of the traffic using FTP for transfers between known machines for known purposes. Most traffic from browsers is users downloading stuff, where the risk of an attacker is larger. Since this article was about the inclusion of FTP in a browser, not an FTP client, I was talking about that use case most of all.
"If someone has the skill, and most importantly motivation to hijack one of the routers between an Internet server and an end user then it's pretty trivial to also insert their own TLS without that user noticing, making the extra layer pointless."
People do have the skill and motivation, observed in ISPs and dodgy public networks alike. And no, it's not always easy to inject TLS. TLS certs are verified against CAs and associated with specific names. Unless the attacker succeeds in redirecting the user to a different endpoint without their noticing, they will find impersonation a bit harder.
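Those two hurdles are exactly what a default TLS client enforces. As a rough illustration (not a full client), Python's stdlib ssl module shows the checks an injected certificate has to beat:

```python
import ssl

# this is what browsers and sane clients effectively do
ctx = ssl.create_default_context()

# an injected certificate must pass BOTH of these before any data flows:
# 1. the cert must name the host the user actually asked for
assert ctx.check_hostname is True
# 2. the cert must chain to a CA in the local trust store
assert ctx.verify_mode == ssl.CERT_REQUIRED
```

Defeating both means either compromising a CA or redirecting the user to a name the attacker legitimately controls, neither of which is "trivial" router hijacking.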
"Most traffic from browsers is users downloading stuff, where the risk of an attacker is larger."
No, it's not. It's just not. If you don't work for MI5 it's extremely unlikely that someone would do anything to your anonymous download from Tucows that hadn't already been done server side or client side. There is a vanishingly small possibility that someone would be able to intercept your traffic and modify it unless there were state level reasons to do so, and even then they'd probably have the help of the infrastructure providers.
I strongly suggest you stay away from security vendor marketing for a while, it's having a detrimental effect on your world view.
"the attacker could replace the file you're downloading and the hashes when you retrieve them so they do match"
What a palaver. Alternatively call up claiming to be XYZ Technical Support and, for far too many people, get a lot more useful information for a lot less effort.
Entirely true. Few attackers would go to that effort when they have other mechanisms. The only reason I brought it up is that it is a case where the data itself isn't sensitive but can still produce a dangerous result, and you'll note that it is only needed if two conditions which don't always hold are met; if either is not met, it's a lot easier to inject malicious data.
There is a case though where FTP makes sense: retrieving my own encrypted files stored in "the cloud".
I use encfs and the files are encrypted locally before storing. So, files at rest "on the cloud" are encrypted with my own key that is kept locally.
1. encfs is set up with a verification code, so that if the file was changed the read would fail.
2. Indeed, an attacker can see that I access my cloud provider and how much data. Encryption won't change that. To be precise, the FTP control flow is kept encrypted, of course; only the FTP data flow is unencrypted (the --ftp-ssl-control option for curl).
3. see 1.
4. No: the hashes are in the encfs blocks themselves, and short of sheer luck you can't tamper with that.
Adding encryption on top of already-encrypted data adds no protection, granted that the protocol used also does checksums.
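The "read fails if the file was changed" property comes from a MAC stored with each encrypted block. encfs's real on-disk format differs, but the idea can be sketched with stdlib hmac (the key is a made-up placeholder; the real one never leaves the local machine):

```python
import hashlib
import hmac

KEY = b"local-secret-key"  # placeholder for the locally kept key


def seal(block: bytes) -> bytes:
    # store a MAC alongside the (already-encrypted) block
    return hmac.new(KEY, block, hashlib.sha256).digest() + block


def open_block(sealed: bytes) -> bytes:
    mac, block = sealed[:32], sealed[32:]
    if not hmac.compare_digest(mac, hmac.new(KEY, block, hashlib.sha256).digest()):
        raise ValueError("block was tampered with at rest or in transit")
    return block


sealed = seal(b"ciphertext-block")
assert open_block(sealed) == b"ciphertext-block"

# flip one bit anywhere and the read fails instead of returning garbage
tampered = sealed[:-1] + bytes([sealed[-1] ^ 0x01])
```

Since the transport can't produce a block that verifies without the local key, TLS on the wire genuinely adds nothing here beyond hiding traffic volume, which it doesn't hide anyway.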
Yes, that use case works well. My comment was about the uses from a browser as that's what changed and what some here dislike, and you can't do any of those checks from a browser. You could of course download the file with a browser and see whether encfs likes it, but I'm guessing you're using an automatic system which does it more efficiently and therefore don't rely on the browser for any FTP tasks.
Of course they would - it's proprietary and capable of silent snooping.
SFTP is the fastest and simplest way to maintain a web site - far more convenient than the clunky file managers in abortions such as cPanel.
"SFTP is almost nothing like FTP and was never supported by Chrome or Firefox."
Yes. As the article stated "A lack of support for encrypted connections in Chrome's FTP implementation ..."
So rather than implement the secure protocol they abandon the facility entirely in favour of a certainly proprietary and almost certainly opaque tool.
"So rather than implement the secure protocol they abandon the facility entirely in favour of a certainly proprietary and almost certainly opaque tool."
You're telling me that you can't find a single open source SFTP client? I can. Lots of them. CLI or GUI. Linux, Windows, Mac OS, all included. They're not new either. Most seem to support unencrypted FTP if you need that still. They're not proprietary. They're not opaque as the standards are well defined. And, unlike browsers, they support uploads as well. Use them.
"has anyone actually seen FTPS in the wild?"
Yes, usually to the exclusion of sftp, and my first question is "Why the hell don't you support sftp?" I guess one argument in favor of ftp/s instead of sftp is that sftp by default uses port 22 (ssh), which might not be desirable.
I have and it was a pain to deploy and a pain to configure the clients, too. For whatever reason, FTP sessions can start in plaintext and then switch over to TLS mid-connection. It seems to break the whole purpose of TLS and I despised maintaining and supporting that mess.
SFTP is so much easier to deploy.
That question is an embarrassingly small one, smartass, and the answer is yes. Those of us who actually do this IT shit for a living still rely on the ability to transfer files between boxes in a secure manner. So we do it with both implicit and explicit FTPS in an opportunistic manner.
Yes, when you don't want to give SSH credentials to people who need to access a machine just to transfer files to/from certain directories. It can be simpler than having to neuter their SSH accounts to allow only file transfers.
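Neutering the accounts is mostly boilerplate these days; a typical sshd_config fragment for an sftp-only group (group name and path are examples) looks like:

```
Subsystem sftp internal-sftp

Match Group sftponly
    ChrootDirectory /srv/sftp/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

Members of that group get file transfer jailed to their own directory and nothing else - no shell, no tunnels.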
>"The big question is, has anyone actually seen FTPS in the wild?"
This question really needs to be caveated with:
In a use case that required the use of a web browser, rather than a functionally specific piece of software that can be more easily sandboxed by the OS.
About the only time I worry about file transfer protocols is when I have to reset a bricked appliance and so am using the factory reset TELNET/TFTP interface to load a clean firmware image.
I agree with you about sftp (and also scp and ssh when available). cPanel (and any equivalent) is one of those necessary evils for things like DNS or aliases or e-mail or otherwise managing services. But actual file management? No. Just No.
I smell a new plugin for things like ftp in Firefox. There WAS one for Gopher, but last I checked it is no more. Maybe a "legacy protocol" plugin, for educational purposes at the very least...
I hope KDE don't follow suit with kio-slaves*. I have an ancient but still functional Buffalo NAS device on my home LAN. My Brother all-in-one device saves scans to it. Older versions of KDE would let me open that via SMB, but the device was SMB2 and support for that was dropped. However, FTP access is still an option. I could add the functionality elsewhere, but it would be another make-work to replace something that is good enough in its context. If someone can eavesdrop on my home LAN, I have bigger problems than sending plaintext passwords to see my scans.
* Are they still allowed to call them that?
A certificate doesn't authenticate the source; it just means that someone bought a certificate covering their server from a supplier on your trusted list who wanted money. 99% of FTP use cases don't involve a server the end user knows the name of, and increasingly people download blindly from arbitrarily named Amazon endpoints which have certificates but whose owner you couldn't determine.
SSL certificates are blind trust for authentication, their only purpose in reality is to encrypt traffic for privacy. If you don't need privacy they achieve nothing at all.
As a software developer who had multiple titles available via anonymous FTP in the early years of FOSS, I can assure you that almost nobody reads the README file ... or any other text file distributed with the code. Gawd/ess knows why the kiddies were so excited to get access to the source. I remember one friend included the comment "If you read this, you've won $100! Contact the author directly to be sent a check." in a header file ... he got no takers for the ten years that he owned that email address.
Plenty of other choices for when you need to use it and on the plus side, less clutter in the browser.
Perhaps it might even reduce the amount of RAM that Chrome seems to want to suck ... well, one can hope.
Alternative FTP options include
Linux command-line ftp, and wget can do ftp too (and so can curl)
Windows command-line ftp (though no PASV support, IIRC?)
and a million other possibilities
When it comes to the thousands of public FTP servers out there that serve up everything from archived documents to publicly available software, it's always been convenient to just download these straight from the browser rather than having to whip out the ol' FTP client.
FTP has a massive benefit over HTTP here in that it is actually stateful, with a persistent connection, and resuming downloads simply bloody works, instead of HTTP(S)'s adorable habit of making you start a 1 GB download over from scratch at 97% because your connection had a bit of a wobble.
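Resume on FTP is a one-liner on the wire: the client sends REST with the size of what it already has before the RETR. A sketch using stdlib ftplib (the helper functions are mine; no server is contacted here):

```python
import os
from ftplib import FTP


def resume_offset(path: str) -> int:
    # how much we already have on disk is where the transfer restarts
    return os.path.getsize(path) if os.path.exists(path) else 0


def resume_download(ftp: FTP, remote: str, local: str) -> None:
    offset = resume_offset(local)
    with open(local, "ab") as f:
        # rest= makes ftplib send "REST <offset>" before the RETR command
        ftp.retrbinary(f"RETR {remote}", f.write, rest=offset)
```

Call resume_download again after a dropped connection and it appends from wherever the partial file left off.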
With the move to eliminate HTTP as well, I imagine that before long anything that isn't HTTPS (or HTTPA?) will be cordoned off except for those of us who still know our way around archaic tools that speak these obscure protocols.
if http disappears then a LOT of embedded devices may have trouble with web interfaces.
It is often VERY hard to implement SSL within the confines of a tiny CPU's NVRAM code space, even for ARM devices. Eliminating http support would add unnecessary cost factors for embedded devices, or require you to use "the cloud" or some proprietary protocol/app.
I have toyed with even an Arduino (with a network or wifi shield) serving up config pages. I even added graphics to it. Limited to ~30k of code space there's only so much you can do, but for a simple device it can "just work", maybe just to config the network interface, even.
using https for that - just NOT possible.
(and it becomes a road block for independent inventors and software developers)
[and do you REALLY want to be FORCED to get an SSL cert JUST to have a private web server?]
For embedded devices, please do not forget TFTP.
I successfully implemented parts of UDP protocol code in C inside a PIC16Fxxxx from Microchip without much trouble, even with limited stack and flash space.
Also, tftp is still available in both Windows / Linux / BSD by default.
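TFTP is about as small as a file transfer protocol gets: a read request is a single UDP datagram. A sketch of the RFC 1350 packet layout (the filename is just an example):

```python
import struct


def tftp_rrq(filename: str, mode: str = "octet") -> bytes:
    # RFC 1350 read request: opcode 1, filename, NUL, mode, NUL
    return (
        struct.pack("!H", 1)
        + filename.encode("ascii") + b"\x00"
        + mode.encode("ascii") + b"\x00"
    )


pkt = tftp_rrq("firmware.bin")
# send this via a plain UDP socket to port 69;
# the server replies with 512-byte DATA blocks, each ACKed by number
```

No handshake, no options, no crypto: that is why it fits in a PIC's flash, and also why it belongs nowhere near an untrusted network.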
@Smartypantz not every connection needs to be secure or authenticated. For FTP transfer of things in the public domain there is no justification for either since you're not trying to hide the data from anyone (it's already available) and you don't care who downloads it (it's freely available).
The level of security understanding on these comments is surprisingly poor for 2021.
Exactly. Chrome's HTTPS implementation is completely unusable in a lot of circumstances.
How do I access my private NAS over a private, airgapped LAN?
Self-signed certificates are the most secure method available, yet none of the major browsers support these in a useful way.
Firefox moans but lets me bypass - but not to say "this is the right cert, store it and never warn me about it ever again, but DO warn me if it changes"
Last time I tried, Chrome just refused entirely.
Yes, I could manually add the cert to the OS store, but as so many tools seem to ignore that...
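The "store it and warn me only if it changes" behaviour is just trust-on-first-use pinning, which is easy to sketch in stdlib Python. The DER bytes here are fake placeholders; a real client would get them from the socket via ssl's getpeercert(binary_form=True):

```python
import hashlib


def fingerprint(der_cert: bytes) -> str:
    # identify a certificate by the hash of its DER encoding
    return hashlib.sha256(der_cert).hexdigest()


# first connect: remember the cert (fake DER bytes for illustration)
first_seen = b"fake-der-bytes-of-my-nas-cert"
pinned = fingerprint(first_seen)


def cert_ok(der_cert: bytes, pinned: str) -> bool:
    # later connects: exactly this cert is fine; anything else is a loud warning
    return fingerprint(der_cert) == pinned
```

This is stronger than CA validation for a fixed private endpoint, since no one of the hundred-odd CAs in the trust store can mint a substitute that matches the pin.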
That whole blindly type "thisisunsafe" into the browser thing for self signed certs is the most fucking infuriating thing ever. No. No it's not. It is not unsafe. I am debugging a webserver! It is MY webserver! This is about the safest connection I ever will make; seeing as it's going to fucking localhost!
I understand your frustration with self-signed certificates being so difficult to accept in the majority of internet browsers. But there is a clear explanation for that: if it were easy for users to accept self-signed certificates, malware authors targeting banks could take advantage of it too. It is still easy to modify an internet browser's process to get bank credentials purely in RING3/usermode, but more debugging/decompiling and in-memory patching are necessary.
or require you to use "the cloud"
It would equally add cost to securely communicate with "the cloud", but of course it's cost you can recover through monetization, whether that be subscription fees or managing the lifetime of connected hardware to ensure continued upgrades.
When it comes to security, security of revenue will trump all other considerations.
It's also quite convenient to use wget from the command line to download a file instead of firing up megabytes of browser. Not everything HAS to be in the browser.
And it doesn't try to second-guess your URL and convert a reference to a local machine's filestore into some global link (which, of course, fails).
Yesterday, because i do this for a living and live in the real world.
Let me guess: You identify as a "developer"?
We can't all just sit around scratching our asses/navels and "use git" to load the tons of bullshit you need to "develop" the next agile version of "hello world".
Some of us have to get shit to work IRL.
I'd use command-line FTP almost daily doing my job, but that's almost always on the corporate WAN and not over the Internet. I never use git, but just because you don't use something doesn't mean that others don't, or that there isn't still a place for it. It doesn't really bother me that Chrome killed FTP support either, though; I did use it from time to time, but most public mirror FTP servers that I'd use it for also support http/https.
>the protocol is over 50 years old and comes from more innocent times, when authentication was not what it is today
Those days I didn't trust anyone on the street. I didn't trust stuff written in the Guarniad either, though the Times (pre-tits-days) was, of course, a source. Just that.
That was the time of Thatcher, a person without redemption, whose publicist BoJo perhaps changed the world for the worst [sic].
These days I'm almost like you guys; I almost trust no-one. Except FTP requires I trust myself to evaluate the stuff I receive without, in any way, trusting it.
This is, of course, Irony if you don't understand what I am saying, otherwise it is Sarcasm and this post should be immediately deleted lest innocents learn.
Being authenticated by SOMEONE ELSE'S SERVER should not make you trust that person. If they don't give a shit who you are, why would they authenticate you? And if there are a million downloads, will you be creating the accounts for authentication?
Think McFly, think!
I got rid of ftp years ago. Then I got a multi-function printer that used it to upload scans, so I found myself reinstalling an ftp server, strictly internal only of course.
The last time I used it on the internet was for uploading documents to a printing company. Anonymous ftp, just drop it in the right directory. I could also browse projects from their other clients. They didn't seem concerned by the security implications.
I still pull files off read-only FTP sites occasionally. Mostly drivers/shims for ancient hardware, zipped or tarred files containing technical documentation, photos of old boards and wiring, and that kind of thing.
My FTP server has family photos and the like uploaded by various family members to their own space. Other family members can browse the individual collections if they are logged onto the server. Security implications are minimal. We did have one teenager park some Pr0n on his space ... he's lucky I found it before his grandmother did.
FTP moves files around internally in my private USENET system, when I'm not using UUCP.
FTP in the browser? Never really used it.
FTP with specific clients and applications? Useful.
For example I have cameras that can upload photos while shooting via FTP (via Ethernet or WiFi). The advantage is the setup is simpler and faster than setting up a web server to perform the same task. Not something you may wish to use across the Internet (without a site-to-site VPN ), but very useful locally, even if it means uploading to a laptop.
Latest models can also transfer via HTTP(S), but they do it using a gateway managed by the camera maker: the camera uploads there, then you download from there. That saves them from having to deliver software to be installed on customers' web servers to manage the transfer, and from having to test it for different web servers and operating systems. You get the download utility, which is simpler to write. But sometimes you can't, or don't want to, go through an outside system.
Maybe they could have used WebDAV - but for some reasons it's a protocol that never became really commonplace.
Some are listed here:
Presumably somebody with an interest in such things has a more complete list posted somewhere online, but a casual DDG on the subject doesn't turn up anything useful.
Internet FTP, ie. FTP over TCP, was first defined in RFC765, published June 1980. The first RFC describing a file transfer protocol was RFC114, which was indeed about 50 years ago. However that was an ARPANET protocol and really quite different to the protocol described in RFC765.
Me 107-year-old Great Aunt, who has been using Gopher to publish her life's story for decades, thinks that it is evil to remove perfectly good tools from the hands of people who don't want to learn a new system TO DO THE EXACT SAME THING. Especially when those tools are, to all intents and purposes, finished, and thus require minimal upkeep.
Fortunately, she has me to run the server and keep her old tools running.
I may be a bit out of date, but I'm pretty sure that the same basic mechanisms are used by both FTP and HTTP (and by extension its variants). They both use TCP as the underlying transport, framing the transactions using a three-decimal-digit code to manage the request/authentication/transfer protocols. The only difference is that originally HTTP used a simple encoding scheme to transfer binary data as 'readable' data.
What I think we're seeing here is what we could call 'technology creep'. As successive generations of programmers work on the software, they layer new protocols on top of the originals, ultimately forgetting what the underlying mechanisms are and how they worked. This is the reason why generations of programmers have built message-oriented transactions on top of TCP -- they haven't a clue what it is or how it works, but know it's 'reliable'; all it is to them is a parameter they use when they open a socket. (If they thought about it a bit they should be basing the stuff on UDP but, "whatever", they're not to know about the decrease in reliability and the extra traffic and complexity caused by using TCP.) Then there's the constant reinvention of the wheel -- never use FTP (or even SFTP) when you can write some code to do practically the same thing. (I don't know what any new mechanism would be like, but it's probably built around numbered blocks and CRCs -- there really aren't a whole lot of ways to move data from 'A' to 'B'.)
It's unimportant to me. I don't use FTP inside a browser. I want to see what I'm doing.
In 1991 someone in the Math Dept. grabbed my elbow, pulled me to a DECstation and wanted me to show him the Internet. He said: type this in: ftp ftp.funet.fi. I said: what does this do? It connects to a server in Finland so you can download all their files. At that moment I decided to reduce my PhD efforts and concentrate on all aspects of the Internet. Did ftp.funet.fi have problems with ftp? No, of course not. Why does Google have problems with ftp today? Oh wait, it's Google who has problems with the entire world now. Obviously today Google is China-controlled, so who is Google to snuff out ftp?
Recently the big browsers like Google Chrome and now also Opera have switched ftp off by default, labeling it a "dangerous" experiment. If you know the dangers you can still use ftp:// from your browser. Here's how to enable old-style ftp inside Opera: https://crashrecovery.org/opera-ftp/ As of opera-stable_78.0.4093.147 the big browsers came to the conclusion that the ftp:// experiment had become too dangerous. If you know the dangers you can still obtain the older versions, available here: https://crashrecovery.org/opera/DEBS/