
Pardon my ignorance,
I use FTP all the time to upload/download from my server, on Android. What am I supposed to be using instead?
Mozilla developers have decided to block requests for File Transfer Protocol (FTP) subresources inside web pages. A bug report and Intent to implement notice suggest the change will land in Firefox 61. The browser’s currently at version 59, with 61 due in May 2018. The change will permit access to FTP resources in hyperlinks …
Scp or sftp; I'm sure there are Android clients for those. But you can continue to use the legacy and insecure (your password is sent over the wire unencrypted, as are all your files) ftp. This article only says that websites that pull in, let's say, their images over ftp (instead of http/https, which is used by 99.9999999999999999999999999% of sites) are going to break. This is independent of whether those images were uploaded over ftp or not.
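For what it's worth, here's a minimal sketch of the sftp equivalent in Python, using the third-party paramiko library; the hostname, credentials and paths are placeholders, not anything from the article:

import paramiko

# Connect over SSH; SFTP runs on top of this encrypted channel.
client = paramiko.SSHClient()
client.load_system_host_keys()  # trust hosts already in ~/.ssh/known_hosts
client.connect("example.com", username="me", password="secret")

sftp = client.open_sftp()
sftp.put("local.txt", "/home/me/local.txt")    # upload (FTP's STOR)
sftp.get("/home/me/remote.txt", "remote.txt")  # download (FTP's RETR)

sftp.close()
client.close()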
"The change will permit access to FTP resources in hyperlinks or when an FTP server’s address is entered into Firefox’s address bar, but the browser will no longer allow FTP resources to be summoned using the HTML src attribute."
Do you use the HTML src attribute for that? If not, nothing changes for you. If yes, why the hell?
"Do you use the HTML src attribute for that? If not, nothing changes for you. If yes, why the hell?"
well, if you're a MALWARE author, and you ftp'd your malware crap onto someone's server (after cracking it) and NOW have an FTP URL to it in an HTML element, then MAYBE your malware load will break.
Yeah, TOO! BAD!!!
Refusing to load http subresources from an https page makes sense, and it is logical to apply that rule to ftp subresources from an https page. Apparently this is already the case.
But I don't see the reason or benefit of blocking ftp subresources in general. Following links from the article, I see that removing FTP support is mentioned as a possible eventual goal, and I don't have a problem with that, but I don't see this change being a step in that direction.
So what?
Presumably an agile Web 3.0 node jquery Frankenstein-like monstrosity of a website won't be serving stuff from ftp, but there are sites that do.
The point of the web was to make data accessible, not to heroically re-create desktop software, fail, and prompt browser manufacturers to throw out everything to achieve that dubious aim. As long as a page served with an 's' protocol doesn't pull in stuff with non-'s' protocols, and a remote page doesn't pull in stuff from the local hard drive, what's the problem?
1. Of course it does binary.
2. It doesn't matter where you pull files in from or what protocol it uses (cross-origin and non-secure/secure protocol policies notwithstanding). That was the point of the web in the first place.
Can we file this one on the "Oh FFS Mozilla" pile?
@AC: No, HTML didn't and arguably still shouldn't favour any protocol. If you're saying HTML should favour HTTP(S), what happens when the next great protocol comes along?
@Steve: So a non-encrypted http page can't pull in stuff from non-encrypted ftp sites because security?
"@Steve: So a non-encrypted http page can't pull in stuff from non-encrypted ftp sites because security?"
I said nothing one way or the other about that. I wouldn't cite security, as such, as a reason to avoid it, since the browser will normally use "anonymous" FTP (username = "anonymous" or "ftp", password = email address) if it has to transfer anything over FTP.
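That anonymous convention is easy to see with Python's standard ftplib, for example (the server name is just a placeholder):

from ftplib import FTP

ftp = FTP("ftp.example.com")  # placeholder server
ftp.login()        # no arguments = user "anonymous", password "anonymous@"
print(ftp.nlst())  # list the files in the current directory
ftp.quit()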
However, there is a performance issue: if the "cited" item comes from the same server as the non-encrypted citing page, and both are fetched *by HTTP*, then the browser can (if using HTTP/1.1) reuse a connection; but if the cited item is FTP, it can't.
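That reuse is visible even with Python's standard http.client, for instance (host and paths are placeholders, and the server has to honour keep-alive):

import http.client

conn = http.client.HTTPConnection("example.com")  # speaks HTTP/1.1

conn.request("GET", "/page.html")
page = conn.getresponse().read()   # read fully before reusing the connection

conn.request("GET", "/image.png")  # same TCP connection, no new handshake
image = conn.getresponse().read()

conn.close()

An FTP subresource can never join that pool: it needs its own control (and data) connections.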
Well you did talk about encryption and I answered...
If performance is an argument for not allowing pages to pull in stuff from FTP, we might as well shut down HTTP/1.0 because it's too slow, too.
The important thing is to make the data accessible, but it seems lately that's deemed unimportant.
If you need to FTP files, then TFTP is much better!
At least then you know there is no security and no username/password, and thus (hopefully) you'll put a security wrapper around anything you transfer.
So, for example: an AES-256 password-encrypted ZIP file, sent over TFTP.
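A rough sketch of that wrapper, assuming the third-party pyzipper library (which does WinZip-style AES; the filename and password are made up):

import pyzipper

# Build an AES-256 password-protected ZIP before shipping it over an
# unauthenticated channel like TFTP.
with pyzipper.AESZipFile("payload.zip", "w",
                         compression=pyzipper.ZIP_DEFLATED,
                         encryption=pyzipper.WZ_AES) as zf:
    zf.setpassword(b"correct horse battery staple")
    zf.write("firmware.bin")  # the file you actually want to transfer

The transfer itself is then just a plain TFTP put of payload.zip; only the password, shared out of band, protects the contents.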
TFTP was designed with small networks in mind, for cheap devices on the same LAN to load what they needed at boot time. A student project I supervised found a previously unpublished buffer overflow vulnerability in a popular TFTP server. I'm glad to say the student published it as a Metasploit module so it's not a zero day any longer, but it could just as easily have become one.
FTP, by contrast, was designed for wide area networks, so it has been subject to much more thorough code review.
Also, TFTP is likely to live inside many embedded devices which don't get automatically updated when the developer (who probably no longer cares) learns about a vulnerability. So a hacker who has compromised a host inside the LAN and wants to compromise other hosts there is likely to be looking for unpatched TFTP hosts, as these deliver software to other hosts on the LAN, as well as being likely to offer other LAN services useful to them.
I suspect that an installation that does not have a chrooted TFTP server probably has bigger problems.
Not to mention that having an intruder on my LAN is probably of higher priority than protecting against someone who wants to _read_ a copy of the firmware for bespoke lab instruments.
I don't think you read the article properly... I'll summarise for you:
- EMBEDDED <img src="ftp://foo.bar.com/xxxx.jpg"> references to ftp:// resources are going to be disallowed, as this is not only a shoddy way of linking to images but also INSECURE (and rightly so).
- Typing ftp://username:password@foo.bar.com/ will continue to work perfectly, as it always has.
As an oldie I can confirm that FTP preceded the web. Time was, FTP site addresses were shared on email lists and Usenet. On the Mac we had Fetch, an FTP browser and agent: you pointed it at a site and the files available were listed for your perusal and download.
That was how shareware was distributed back in the day. I'm pretty sure I got Exile 1 via FTP (now it's Avernum, of course).
I first became aware of the existence of the Internet sometime in the mid eighties, when I looked at the /etc/hosts file on a Unix box we were using on our yellow coax Ethernet LAN and saw the pre-DNS mappings of a couple of thousand hostnames to IPv4 addresses. We were using telnet and FTP software over local TCP/IP then, nearly a decade before the first web server and HTTP clients were available.