So, avoiding public wifi and using your own 3G/4G connection is the order of the day - i.e. a connection that you know as much as possible about - preferably with a VPN back to a known endpoint (like back to your house).
Hacked in a public space? Thanks, HTTPS
Have you ever bothered to look at who your browser trusts? The padlock of a HTTPS connection doesn't mean anything if you can't trust the other end of the connection and its upstream signatories. Do you trust CNNIC (China Internet Network Information Centre)? What about TurkTrust or many other “who are they?” type …
COMMENTS
-
-
Friday 20th May 2016 15:28 GMT NoneSuch
Any American created communications security protocols and standards should be considered unusable after the Snowden revelations. They are designed with inherent flaws from day one and cannot be trusted.
Enigma in WW2 had (at best) 88 bit security which was broken by maths and mechanical machines in days. Today, some seventy years later, the best we are allowed to use (by US Commerce legislation) is 256 bit. And no one sees this as an issue.
Snowden only told us what he knew. I'll bet there is a lot he was never given access to. A universal review of all encryption methods needs to be done with the US barred from the room. Think of it like the metric system. The rest of the world can use it and the Yanks can keep their inches.
-
Friday 20th May 2016 18:09 GMT Anonymous Coward
"A universal review of all encryption methods needs to be done with the US barred from the room."
Trouble is, EVERYONE has the same idea because they're after the same thing: the key to the treasure vault. Fact is, NO government can be fully trusted with this issue since, as other documents have revealed, they're really no better. Which leaves the only alternative which is, unfortunately, anarchy. If you're not under a government's thumb, you're under a bully's thumb. That's life: dominate or be dominated. If you're neither, you've just been overlooked for the moment.
-
Saturday 21st May 2016 04:28 GMT Anonymous Coward
Well, what do you expect from a network designed by pot smoking libertarian hippies paranoid of their government, who designed everything while sucking at the DARPA funding tit as they worked for AT&T, et al.
In their wish for no control, we got an insecure by design but masterfully self-healing interconnection of all the computers in the world and have been applying bandaids and sticking plasters ever since in an attempt to provide security and privacy. Good luck with that!
-
Monday 23rd May 2016 06:30 GMT Anonymous Coward
"In their wish for no control, we got an insecure by design but masterfully self-healing interconnection of all the computers in the world and have been applying bandaids and sticking plasters ever since in an attempt to provide security and privacy. Good luck with that!"
Yep, that's about the size of it. But I'm not sure that it could ever have turned out differently even if anyone had wanted to.
The Internet has two conflicting requirements.
1) Allow people to transfer data, browse and do whatever they want, with no identity check for network access, control or prevention. (this is for the good guys).
2) Stop people transferring data, browsing and doing whatever they want, having established their identity and applying control and prevention (this is for the bad guys).
Contradictory, no?!
Technology fundamentally cannot resolve that contradiction because the Internet cannot distinguish between good guys and bad guys. Only humans can do that, but even then we almost always disagree on who is good and who is bad.
It's made harder because there's really no way of working out who is at the end of an IP address without going round there and knocking on the door. There is no technology we have that can irrefutably establish the identity of the person originating some network traffic. We can get close (passwords, biometrics, etc), but passwords get written down or lost/stolen, and biometrics are too easily fooled. Short of having some sort of ID chip implanted at birth somewhere no one would be prepared to have surgery at a later date (another contradiction), there's no irrefutable way of doing it.
And without that there's no way on the network of reliably telling who's who, and without that we're doomed to have a network pretty much like the Internet currently is.
-
Friday 27th May 2016 10:19 GMT Charles 9
"And without that there's no way on the network of reliably telling who's who, and without that we're doomed to have a network pretty much like the Internet currently is."
In other words, the Internet is going to become a doom zone no matter what because it can either be stateless (and eventually a zone of anarchy) or stateful (and eventually a police state). It's Pick Your Poison with no third option available because "they know you" and "they don't know you" is a strictly binary state.
-
-
-
Monday 23rd May 2016 08:43 GMT MacroRodent
Net history
DARPA didn't invent HTTP or the WWW. One guy at CERN did that.
I'm pretty sure the poster was talking about the underlying protocols (TCP/IP, UDP) whose development indeed was funded by DARPA. Actually ARPA at the time; the net used to be called the ARPANET, and the Internet happened when that was opened up to other users besides the U.S. military, its contractors, and academic institutions.
The great achievement of the "one guy at CERN" was making the data on the internet approachable by the average guy, and in a way that scaled up without central control. And of course making the idea and code available for all for free. Had this been a typical commercial effort, with everything patented, there would have been multiple incompatible and very expensive webs.
-
-
Tuesday 24th May 2016 18:15 GMT Michael Wojcik
Re: Net history
I'm pretty sure the poster was talking about the underlying protocols (TCP/IP, UDP) whose development indeed was funded by DARPA. Actually ARPA at the time;
The agency was renamed in 1972, shortly before the first TCP/IP specification was published. So if you want to be completely correct, TCP/IP development was funded by ARPA and then by DARPA.
The great achievement of the "one guy at CERN" was making the data on the internet approachable by the average guy, and in a way that scaled up without central control
Actually, we already pretty much had that. For the first couple of years HTTP and HTML didn't do much that Gopher / Veronica / WAIS / etc didn't also offer. HTTP and HTML succeeded for a few reasons: hypertext had better usability than separate documents and menu-style links; even with character-mode user agents HTML offered some basic presentation markup; HTTP (prior to the barbarism that is HTTP/2) is easy to drive by hand for experimentation and debugging.
Most importantly, though, the time was right. Graphical workstations were becoming common enough (helped by academic efforts like the Andrew Project and Project Athena) that it made sense to create graphical user agents. Not many people had NeXTstations, but quite a few had some sort of X11 box, so NCSA Mosaic (and to a lesser extent other early GUI browsers like Erwise, Spyglass Mosaic, and Viola) became a showpiece for the web. It wasn't much more functional than Gopher+WAIS, but it was prettier.
-
Wednesday 25th May 2016 20:43 GMT anonymous boring coward
Re: Net history
"I'm pretty sure the poster was talking about the underlying protocols (TCP/IP, UDP) whose development indeed was funded by DARPA."
I'm pretty sure he wasn't (or had no idea what he was talking about) as those underlying protocols have nothing at all to do with any security issues in HTTPS or other protocols used in the WWW.
Besides, HTTPS wasn't invented until later. Early WWW didn't have it. It didn't have any Javascript or Flash either.
-
Friday 27th May 2016 17:20 GMT patrickstar
Re: Net history
You could easily imagine a network where things like authentication and confidentiality are built right into the network itself.
At least this could be applied to the distribution of routing information to prevent things like, say, all traffic to Youtube suddenly going to Pakistan, or any more recent BGP hijacking incident.
-
Wednesday 1st June 2016 12:18 GMT anonymous boring coward
Re: Net history
"You could easily imagine a network where things like authentication and confidentiality are built right into the network itself."
What would be the point of merging several network layers into one?
Besides, always demanding authentication would probably dilute the value of it.
-
Friday 20th May 2016 19:10 GMT patrick allen
Snowden, the thief
Ahhh, Snowden, our fav pal whom we want to see come back to the good ole U.S. of A....
You must have meant "what he could steal" rather than "what he knew".... right?
But yes, he could only access so much, even with the CAC he borrowed from a mate who should not have loaned it to him (that is, if it was loaned and Snowden didn't snatch it).
-
-
-
Friday 20th May 2016 11:41 GMT Anonymous Coward
The traffic is going via the hacker's laptop, since they are masquerading as the router. This means they can establish the HTTPS connection between the user and their laptop. The user sees a secure connection, unaware that the traffic is being decrypted and re-encrypted by the hacker - hence "man in the middle" attack.
-
-
Friday 20th May 2016 12:06 GMT Lee D
It's not that simple.
As someone who deploys MITM for school web filters, the end devices HAVE to trust the root certificate used by the MITM. You can't just pretend to be google.com without the browser throwing a fit. In schools and companies, you do this by distributing your web filter's SSL root certificate into the device's local trusted certs.
Otherwise, Chrome etc. will throw a fit. iPads will stop updating as they detect that the apple.com cert they're using isn't signed by the right authorities. And no genuine CA will issue global wildcard certs (they have been issued for "their own webfilter devices", etc., but they catch an awful lot of flak and there's a massive backlash).
Additionally, modern browsers use certificate pinning and certificate transparency. If the claimed root isn't the one that Google actually bought their certificate from (e.g. Verisign instead of RapidSSL or whatever), the browser will throw a fit again. You will get interception warnings, red bars, and no security.
So the article is wrong, unless people are stupid enough to agree to install random certs into their browser (game over anyway). I have a device in school that intercepts all SSL, decrypts, analyses for keywords, and then SSL's it again to send upstream to website. But you can't do that without a lot of client interaction and basically control of the client machines. You don't have that on just a guest wifi (hence our guest wifi presents security errors for lots of sites, but it's free so what do you want?).
MITM is possible, but doing MITM without the browser detecting the situation is almost impossible these days. And as everything from Google's home page to iPad updates routinely use SSL, you can't just start faking certificates without breaking a lot of things. Hence you have to deploy an iPad "profile" with your webfilter certificate, a similar thing for Android, or put something into certmgr.msc on Windows PCs. Even then, some things don't like being subverted, and certain websites "know" they are being intercepted and refuse to secure themselves. Users on such networks just have to live with it, or you have to exclude them from HTTPS decryption.
It's NOT just a matter of sitting in between the connection and pushing even CA-signed fake certs back to someone's browser. You need their device, or their co-operation.
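To illustrate the point, here is a minimal sketch (Python), under the assumption that "filtered.example" is a placeholder for a site whose certificate has been re-signed by a web filter's own CA ("filter-ca.pem", also a placeholder). Default verification fails exactly the way a browser would, unless that CA has been explicitly added to the trust store.

```python
# Minimal sketch: connect with default verification, as a browser would.
# "filtered.example" and "filter-ca.pem" are placeholders, not real names.
import socket
import ssl

HOST = "filtered.example"

ctx = ssl.create_default_context()          # system trust store, hostname checks on
try:
    with socket.create_connection((HOST, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Trusted chain, issued by:", tls.getpeercert()["issuer"])
except ssl.SSLCertVerificationError as err:
    # This failure is what the user sees as a browser warning / broken padlock.
    print("Untrusted CA or interception:", err.verify_message)

# Only after explicitly trusting the filter's root does the warning go away:
# ctx.load_verify_locations("filter-ca.pem")
```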
-
Friday 20th May 2016 13:14 GMT Platypus
Thanks for clarifying that.
The one nugget of truth in the article is that the list of CAs built in to browsers etc. is ridiculous. I had occasion to look recently. I'll bet at least half of those organizations are corrupt or compromised enough that I wouldn't even trust them to hold my hat - let alone information I actually value. Anybody who wants a signing cert for MITM can surely get one. That really does cast doubt on whether HTTPS is really doing us all that much good, but it's important to understand exactly where the weak link in that chain is.
-
Friday 20th May 2016 13:51 GMT Anonymous Coward
It's not the list, per se, that's ridiculous. It's the concept: Arbitrary superplustrusty third parties... bestowed omniscience at the whim of myriad equally-arbitrary arbiters?!!!one Marvellous.
Still, how would Finfisher, Prism et al possibly do their thing if our interwebs were actually secure?
B0rken. By design.
-
Friday 20th May 2016 15:03 GMT Platypus
I'm not going to disagree with you, there. Centralized trust doesn't work any better than centralized anything else. The only thing I'll say is that the browser makers have made the whole thing even less secure than the design allows by shipping certs for all these shady companies - many of which are clearly just arms of equally shady governments in various forsaken parts of the world. A chain of trust can still be strong if the links are all strong. It's a problem that this becomes hard to guarantee as the chains get longer, but it's also a problem that the browser vendors *knowingly* include weak links in the bags they provide.
-
-
Friday 20th May 2016 18:17 GMT BillG
The one nugget of truth in the article is that the list of CAs built in to browsers etc. is ridiculous.
Wow, I just checked the list of trusted certificates on my work computer and it's almost 300. There is a scary one from my employer with the two purposes "All issuance policies" and "All application policies".
I remember when there used to be about a dozen trusted certificates and you could recognize the issuer of each, like "Verisign", "Thawte", or "Microsoft". Now, I've got a certificate issued by "TÜBİTAK UEKAE Kök Sertifika Hizmet Sağlayıcısı - Sürüm 3" (sic). Really???
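For anyone curious what that list looks like from code, a rough sketch using Python's ssl module. It loads whatever the OS exposes as the default trust store, which won't exactly match Firefox's bundled list, and on systems that use a capath directory some roots may not show up until they have been used.

```python
# Rough sketch: dump the root CAs Python picks up from the OS trust store.
# Counts will differ from Firefox, which ships its own CA bundle.
import ssl

ctx = ssl.create_default_context()
ctx.load_default_certs()

roots = ctx.get_ca_certs()
print(f"{len(roots)} trusted roots loaded")
for cert in roots:
    # 'subject' is a tuple of RDNs; pull out organisation / common name if present
    fields = dict(x[0] for x in cert["subject"])
    print(fields.get("organizationName", "?"), "-", fields.get("commonName", "?"))
```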
-
-
Monday 23rd May 2016 16:03 GMT Lee D
Windows and IE updates regularly add new ones and take old ones away.
Not to mention, everything from banking software to firewall software etc. will add their own.
Although a "this is what you should have" program is probably a good idea, actually it'll be just as much pain to maintain as the root certificate list itself, especially tracing the origin of a cert that someone else has that you don't, etc.
-
Friday 20th May 2016 14:02 GMT mathew42
Corporate networks decrypt SSL
> So the article is wrong, unless people are stupid enough to agree to install random certs into their browser (game over anyway).
Or installs some malware into the machine, or has to install a certificate to connect to a VPN, or ...
> I have a device in school that intercepts all SSL, decrypts, analyses for keywords, and then SSL's it again to send upstream to website. But you can't do that without a lot of client interaction and basically control of the client machines.
So do most corporate networks and almost all corporate PCs are connected to a domain which gives that level of control. In summary if your computer belongs to a domain you can assume that the corporate firewall is decrypting your traffic.
-
-
Friday 20th May 2016 20:23 GMT Mark 85
@sux2bu -- Re: Corporate networks decrypt SSL
Two things to remember. Never underestimate the power of a stupid voter and just as important, PEBKAC! (Problem Exists Between Keyboard And Chair)
Exactly. If you walked up to the average person sitting in a coffee shop with their laptop and asked them if they had the padlock on the URL bar, most will give you a blank look. They have no clue what it means, nor do they have a clue what a MitM attack is. I suspect that if the average user did have a clue, they wouldn't use their laptop in the coffee shop. Or they would be using a VPN, etc. Most don't even know the first thing about VPNs, like what the term means.
-
-
Sunday 22nd May 2016 02:21 GMT Vic
Re: Corporate networks decrypt SSL
Or installs some malware into the machine, or has to install a certificate to connect to a VPN, or ...
None of that is an SSL issue; they're all trust compromises.
In summary if your computer belongs to a domain you can assume that the corporate firewall is decrypting your traffic.
This is why I use an invalid certificate on my server - if I *don't* get a warning, I know someone is intercepting my traffic.
Vic.
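Vic's "canary certificate" trick can be scripted; a sketch under the assumption that "canary.example.net" is a placeholder for a personal server deliberately serving a self-signed certificate. Verification failing is the healthy state; if strict verification suddenly succeeds, something in the path has swapped in a certificate your machine trusts.

```python
# Sketch of the canary idea: a host that deliberately serves a self-signed cert.
import socket
import ssl

CANARY = "canary.example.net"   # placeholder for your own server

ctx = ssl.create_default_context()
try:
    with socket.create_connection((CANARY, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=CANARY):
            print("WARNING: canary validated cleanly - traffic is probably being intercepted")
except ssl.SSLCertVerificationError:
    print("Canary failed verification as expected - no interception detected on this path")
```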
-
Sunday 22nd May 2016 13:45 GMT Anonymous Coward
Re: Corporate networks decrypt SSL
This is why I use an invalid certificate on my server - if I *don't* get a warning, I know someone is intercepting my traffic.
Devious use of Canary Cert.
However, it would be better if J. Public could script special checks into the browser without extensive knowledge of plugin magic. Just open vi, tap some Lua, and additional checks have been implemented.
-
Monday 23rd May 2016 00:27 GMT Mark 65
Re: Corporate networks decrypt SSL
HTTPS Everywhere and SSL Observatory, courtesy of the EFF. Always be wary of free wifi - I certainly wouldn't use it unless I was using a laptop with a Live CD, due to the possibility of malware let alone MITM. Plenty are poorly set up, maintained and secured. Think of using it as being like having unprotected nooky. You may get away with it quite a lot, but then again....
-
Sunday 22nd May 2016 02:16 GMT Vic
So the article is wrong
The article is very wrong.
Take a look at Moxie Marlinspike's page on sslstrip. It doesn't do anything like what the article claims.
Really, this article is very poorly-researched. You might want to spike it...
Vic.
-
Friday 20th May 2016 12:07 GMT Preston Munchensonton
I understand what a man in the middle attack is but I don't understand why the user's browser would think it's receiving data over an SSL connection.
Because there's two separate HTTPS connections: one from the user to MITM and the other from MITM to the real destination. This is exactly how Bluecoat or Cisco recommend deploying their proxies, with an internal SSL CA providing the cover to prevent browser warnings to users.
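The shape of that is easy to sketch. A toy version in Python follows: the file names are placeholders, the upstream host is fixed for brevity, and the client only avoids a warning if it already trusts whatever CA signed proxy-cert.pem, which is exactly why corporate deployments push an internal root onto every machine.

```python
# Toy sketch of a TLS-terminating proxy: one TLS session towards the client
# (using the proxy's own certificate) and a second, independent TLS session to
# the real site, with plaintext visible in between.
import socket
import ssl
import threading

UPSTREAM = ("example.com", 443)   # the genuine destination (placeholder)

def pump(src, dst):
    """Copy bytes one way until either side closes."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)        # here the proxy could log or rewrite traffic
    except OSError:
        pass
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def handle(client_tls):
    # Second, independent TLS connection to the genuine server.
    upstream_ctx = ssl.create_default_context()
    raw = socket.create_connection(UPSTREAM)
    server_tls = upstream_ctx.wrap_socket(raw, server_hostname=UPSTREAM[0])
    threading.Thread(target=pump, args=(server_tls, client_tls), daemon=True).start()
    pump(client_tls, server_tls)

# TLS towards the client, using the proxy's own certificate (placeholder files).
listener_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
listener_ctx.load_cert_chain("proxy-cert.pem", "proxy-key.pem")

with socket.create_server(("0.0.0.0", 8443)) as srv:
    while True:
        conn, _ = srv.accept()
        try:
            tls = listener_ctx.wrap_socket(conn, server_side=True)
        except ssl.SSLError:
            conn.close()             # client rejected our certificate - the normal outcome
            continue
        threading.Thread(target=handle, args=(tls,), daemon=True).start()
```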
-
-
-
Friday 20th May 2016 12:21 GMT Anonymous Coward
sslstrip downgrades the connection, but also tries to give enough fake visual feedback to make the user believe the connection is secure. If you're skilled and cautious enough, you may catch it.
If the attacker is able to feed you a fake certificate it could be a little more difficult if you don't check the certificate and its chain. Extended Validation ones may be a little better, but it all comes down to trusting the allowed CAs...
-
Friday 20th May 2016 13:50 GMT Anonymous Coward
SSLstrip substitutes a fake "padlock" icon for the site's favicon. Crude but effective.
"SSL Inspection" proxies the victim through an actual HTTPS connection, so it's less obvious, but the attacker must install their own root cert on the victim's computer (corporate PC, or via malware, or via dumb PC manufacturers) - unless they've obtained the private key for a "real" root cert...
-
Friday 20th May 2016 14:12 GMT Dan 55
Somewhat ironically I think it's never been safer to rely on HTTPS. Browsers don't let any old thing pass any more.
Favicons only appear in browser tabs now. If I saw a padlock in my browser tab or HTTP for a HTTPS site like gmail, I would close the tab.
If you want to check the certificate authority you just click on the HTTPS and you get the certificate authority. If it's Turktrust or something strange then something's obviously wrong.
Having a MITM root cert on a company laptop is mitigated with Firefox, which doesn't use the OS's certificate store.
-
Tuesday 24th May 2016 18:19 GMT Michael Wojcik
the attacker must install their own root cert on the victim's computer (corporate PC, or via malware, or via dumb PC manufacturers) - unless they've obtained the private key for a "real" root cert...
It's enough simply to compromise a CA that's trusted by the user agent. You don't need the private key for one of the CA's roots or intermediaries (though that does the job). Get the CA to issue you a certificate for a well-known site, signed by a root/intermediate that's trusted by browsers, and you're home free.
And CAs have been compromised many times - that we know of. And those are just the major ones. Of all those little regional CAs in the browser trust list, how many even have auditing practices sufficient to have a decent chance of knowing whether they've been attacked?
-
-
-
Friday 20th May 2016 12:51 GMT Anonymous Coward
> I don't follow this. Surely if your traffic is being intercepted and redirected to HTTP you don't get the browser padlock?
Yes that's true - but most people are fooled if you simply replace the site's favicon.ico with a padlock image. Plus, browsers don't give any negative security feedback simply because you are accessing a site over HTTP.
The original presentation is worth reading:
http://www.blackhat.com/presentations/bh-dc-09/Marlinspike/BlackHat-DC-09-Marlinspike-Defeating-SSL.pdf
-
-
Friday 20th May 2016 18:32 GMT Anonymous Coward
> In 2016? It's been a few years since browsers were showing the site's icon in the same place as they would show the padlock icon.
Firefox and Chrome, yes. Palemoon still shows a favicon in the url bar - with red/green/blue colors for various levels of HTTPS. Not that the average hacking victim would notice.
Just to be clear: as far as your privacy/security is concerned, HTTPS is worthless.
-
-
Friday 20th May 2016 19:04 GMT Destroy All Monsters
Link to Moxie Marlinspike's talk at the Black Hat DC 2009 conference as given by AC above
New Techniques for Defeating SSL/TLS
This presentation will demonstrate some new tools and techniques that allow attackers to silently alter, inject, and log traffic intended for secure transmission by SSL/TLS in common web applications such as online banking or secure webmail logins. It builds off of the SSL exploit tools and research on the failure of browsers to validate BasicConstraints that I published in 2002, and will include demonstrations of a new tool for exploiting current use patterns as well as some data gathered from field testing in the real world.
Not sure how far we have come since 2009...
-
Friday 20th May 2016 21:37 GMT Anonymous Coward
Not sure how far we have come since 2009...
Yea, ARP poisoning is so last century. These techniques will only work if your client browser is configured to accept fake certs, such as is required for current SSL monitoring, which needs to decrypt SSL traffic in order to scan it for vulnerabilities, which defeats using SSL in the first place.
-
-
-
Friday 20th May 2016 15:37 GMT Anonymous Coward
I think Gibson Research has the best explanation of how HTTPS is intercepted. https://www.grc.com/fingerprints.htm
The basic trick relies on our trust in the certificates between us and the other side providing encryption. But in that model, a machine in the middle can spoof a higher authority and replace the original certificate.
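The check GRC describes boils down to comparing the fingerprint of the certificate you are actually shown with one obtained over a path you trust. A quick sketch; the hard-coded reference value and host are placeholders you would fetch out of band (e.g. from a different network).

```python
# Sketch: compare the SHA-256 fingerprint of the certificate presented to us
# with a known-good value obtained out of band. KNOWN_GOOD is a placeholder.
import hashlib
import ssl

HOST = "www.example.com"
KNOWN_GOOD = "replace-with-fingerprint-obtained-out-of-band"

pem = ssl.get_server_certificate((HOST, 443))   # grabs whatever cert is presented
der = ssl.PEM_cert_to_DER_cert(pem)
seen = hashlib.sha256(der).hexdigest()

if seen != KNOWN_GOOD.replace(":", "").lower():
    print("Fingerprint mismatch - possible SSL inspection between you and", HOST)
else:
    print("Certificate matches the reference fingerprint")
```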
-
-
-
Friday 20th May 2016 12:15 GMT Ben Tasker
Some go further than that and are included on a list pre-baked into the browsers. So on a virgin install of Chrome (for example), if you enter http://www.google.com it should change to HTTPS without bothering to try port 80.
Helps to remove the inherent risk of relying on HSTS alone for users who're visiting your site for the first time.
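A first visit is only protected if the site is on the preload list; after that, the Strict-Transport-Security response header does the work. A small sketch for checking what a site actually sends (the host is a placeholder):

```python
# Sketch: fetch a site's headers and see whether it sends HSTS, and for how long.
import http.client

HOST = "www.example.com"

conn = http.client.HTTPSConnection(HOST, timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
hsts = resp.getheader("Strict-Transport-Security")

if hsts:
    # Typical value: "max-age=31536000; includeSubDomains; preload"
    print("HSTS enabled:", hsts)
else:
    print("No HSTS header - a downgrade tool like sslstrip has more to work with")
conn.close()
```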
-
-
Friday 20th May 2016 11:39 GMT Anonymous Coward
"If a site provides only HTTPS then sslstrip would fail as it can't fall back to HTTP."
Why does it need to fallback to HTTP? Surely the hacker (or more accurately their software) can decrypt the request from the unsuspecting browser user, then use his or her own HTTPS connection to the website to forward the request and decrypt the response before re-encrypting it to send back to the user?
-
-
-
Friday 20th May 2016 17:11 GMT Ken Hagan
Re: if Google's private keys really WERE stolen...
It depends on how quickly Google learned about the theft and how quickly the revocation notice was picked up by the average Joe's machine. (I think browsers are better about this than they were ten years ago.) During that window, I think quite a lot of damage could be done. For starters, since Chrome is open source, it would presumably be possible to distribute a Trojan-ed version of Chrome that ignored the revocation notice.
-
-
-
Saturday 21st May 2016 15:52 GMT Havin_it
While preparing to answer this I realised I wasn't sure either, but I think it goes like this:
Browser generates a random temporary (symmetric) encryption key, encrypts it using the server's certificate (which contains its public key) and sends this as part of its request to the server.
Server decrypts the browser's key with the private key that only it has, and uses it to encrypt and decrypt everything from there on.
MitM can't communicate back to the browser because they can't decrypt and use the key that the browser is expecting them to use.
(Well they could try, but you'll know they're at it if there's an unusually large fruit-machine in the corner with "D-Wave" written on the side lol).
That was sort of an exercise to self, in case you couldn't guess ;)
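As a companion exercise, the RSA key-transport flow described above can be mimicked with the third-party cryptography package (an assumption: it's installed). Real TLS today mostly prefers ephemeral Diffie-Hellman instead, so treat this purely as an illustration of the old-style exchange.

```python
# Toy illustration of RSA key transport as described above, not real TLS.
# Requires the third-party 'cryptography' package.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Server side: a key pair whose public half would live in the certificate.
server_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
server_public = server_private.public_key()

# Browser side: invent a random symmetric key, encrypt it to the server's public key.
session_key = Fernet.generate_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped = server_public.encrypt(session_key, oaep)

# Server side: only the private-key holder can unwrap the session key...
unwrapped = server_private.decrypt(wrapped, oaep)

# ...after which both ends share a symmetric key for the rest of the session.
print(Fernet(unwrapped).decrypt(Fernet(session_key).encrypt(b"hello over TLS-ish")))
```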
-
-
Friday 20th May 2016 11:59 GMT Anonymous Coward
this is so old...
Marlinspike presented this back at Black Hat in '09.... it was great back then when no one used HSTS.... less useful now... I'm lazy and just redirect ports 443 and 80 to my burpsuite and activate the bypass on HTTPS connection failure feature... and you always get loads of traffic... most sites still don't have HTTPS, let alone HSTS... then you can just inject BEEF redirects into every HTML page you intercept so you start getting man-in-the-browser attacks going and an eventual meterpreter shell... you can always count on people just clicking through warnings of broken padlocks... works like a charm....
-
Friday 20th May 2016 13:07 GMT Anonymous Coward
Have I fallen into a timewarp?
Even at $ORK, we have looked at certificate pinning. (Even if we're currently saying "not yet", because we'd need to have a spare keypair around and always updated, and our upstream CA doesn't have a second keypair itself – we'd be fairly happy enough pinning to "our certificates are always issued by academentia-CA")
Also, not surprisingly, browser support seems to be "FF and Chrome".
-
Saturday 21st May 2016 03:45 GMT Adam 1
a couple of misleading statements in the article
Firstly, a MitM scenario is what we call "the norm". It is highly unlikely that you have a direct connection from your computer to the server; there are most likely a dozen networks that get traversed. It is not some afterthought that the people behind HTTPS failed to consider.
Being a MitM allows you to 1. observe and 2. manipulate any bytes traversing that link. For HTTP, that means that pages can be manipulated and any credentials can be easily obtained. Some popular IT news websites even fail to use HTTPS in their comments, if you can imagine that. Equally, mixed HTTPS content on an HTTP page is not safe.
But HTTPS is different. The design of HTTPS is that your browser demands the site prove that it owns the certificate by signing a random challenge issued by the client. The server's certificate carries its public key, which can be used to verify the response and reveal the original challenge, and the certificate is signed by a trusted authority, which hopefully means some diligence was done by the issuer. Without getting hold of the private key of a CA, or otherwise convincing one that your certificate should be signed, you will either have an invalid signature or a CA that your browser has never heard of. In both cases, your browser will make it known to you that it isn't satisfied.
The theory works, setting aside whether the CAs are trustworthy. The problems are in the implementations. The Apple GOTO fail bug was basically a failure to validate the signature on the certificate. POODLE works by interfering with the negotiation of what protocol versions the client and server have in common, basically tricking them into falling back to the old SSL 3.0 and then exploiting its weak CBC padding. That is easily mitigated by either the client or server having a somewhat recent security patch applied.
Sslstrip works by tricking the client into using plain old HTTP while it works as a proxy, talking HTTPS to the website (HTTPS validates the website identity, not the client identity, and you just gave your credentials to a proxy which is now emulating you). It's not magical. It is also not going to get past HSTS, so I seriously doubt a modern browser is going to leak Gmail over HTTP.
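On the downgrade point, a modern client can simply refuse the weak options outright. A sketch of setting the floor explicitly with Python's ssl module (recent Python/OpenSSL builds disable SSLv3 by default anyway, so this is belt and braces; the host is a placeholder):

```python
# Sketch: refuse anything older than TLS 1.2 so a POODLE-style downgrade to
# SSL 3.0 simply can't be negotiated.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

HOST = "www.example.com"
with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated:", tls.version())   # e.g. 'TLSv1.2' or 'TLSv1.3'
```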
-
Saturday 18th June 2016 22:05 GMT Anonymous Coward
Re: a couple of misleading statements in the article
> HTTPS validates the website identity, not the client identity
Good summary. Just to point out that HTTPS can actually validate both. This is used in some European countries to provide services to citizens (the browser uses a certificate either in its certificate store or in a smart card), and in many companies to authenticate users.
Still, it's not common, nor necessarily desirable, for the web at large, and since this "exploit" is well past its best-before date ... this is just another shot by this so-called "security researcher" to get himself some headlines. In reality the guy is frankly useless.
-
Saturday 21st May 2016 19:11 GMT Steve Davies 3
And then there is the cesspit of short URLs
An invitation to go to somewhere that is undesirable and possibly illegal.
Remember, just accidentally viewing kiddie porn is a criminal act here in the UK.
If it ain't one of these sites, how do you know that the short URL you just clicked on is not owned by a bad guy?
I won't click on any short URLs, QR codes or anything else that hides the real destination of the connection.
-
Saturday 18th June 2016 22:09 GMT Anonymous Coward
Re: And then there is the cesspit of short URLs
> I won't click on any short URLs, QR codes or anything else that hides the real destination of the connection.
What barcode scanner do you use? Mine does show you the URL and then you can choose what to do with it.
As for short URLs, you can always use your cURLy --head. :-)
-
-
Monday 23rd May 2016 08:50 GMT Amias
Good WiFi security beats this
If the wifi you are using is encrypted with WPA2 and has wireless isolation enabled (and working; lots say they can do it but they can't) then this is not possible without breaking into the router. Sadly a surprising number of places decide not to have a WiFi key for ease of connection. This is the case with the free WiFi on trains, and it seems to be related to a bad implementation of its useless registration system.
-
Monday 23rd May 2016 09:22 GMT Colin McKinnon
Must try harder
"you need to hack the coffee shop's router" - no you don't. You even explained why this not necessary in the preceding paragraph.
"Certificate pinning, though, is limited to Google sites at present" - no it's not
"Some browsers such as Chrome use a new technique called certificate pinning" - That would be Chrome, Midori, Firefox, Opera, but not MSIE/Edge currently (not aware of status of Safari).
-
Wednesday 25th May 2016 10:13 GMT Norman Nescio
CA list
I would love to be able to modify the browser's root CA list handling so that I could flag the majority of CAs as untrusted, and have the browser pop up a go/no-go message whenever an attempt was made to use an untrusted CA. There isn't even an easy (point'n'drool) way to edit current CA lists.
Even having the browser optionally show, for each web-page, which CA is being used would help, instead of having to hunt through the 'View certificate' process.
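Short of patching the browser, the "which CA signed this?" question can at least be answered from a terminal. A sketch (the host is a placeholder):

```python
# Sketch: print which CA actually signed the certificate a site presents,
# without clicking through the browser's 'View certificate' dialogs.
import socket
import ssl

HOST = "www.example.com"

ctx = ssl.create_default_context()
with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        issuer = dict(x[0] for x in tls.getpeercert()["issuer"])
        print("Issued to :", HOST)
        print("Issued by :", issuer.get("organizationName", "?"),
              "/", issuer.get("commonName", "?"))
```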
-
Saturday 18th June 2016 22:13 GMT Anonymous Coward
Re: CA list
> I would love to be able to modify the browsers root CA list handling so that I could flag the majority of CAs as untrusted
At least in Mozilla-based browsers and Android you can do this, no problem.
> There isn't even an easy (point'n'drool) way to edit current CA lists.
Yes there is. Admittedly, on Firefox it takes a few more clicks than I would like, but you only have to do this once (careful about upgrades though!)
> Even having the browser optionally show, for each web-page, which CA is being used would help, instead of having to hunt through the 'View certificate' process.
That won't help in most cases. Certificate pinning is a better option.
-
-
Tuesday 31st May 2016 22:17 GMT ma1010
We get snooped
I work at a court, and they snoop everything. They put a bogus certificate into the official browser (IE) so it never raises any problems. Nor does Chrome because it uses the Windows certificate store. However, Firefox raised alarms and let me know the day they started this. I deleted my IMAP off the work computer and changed all the passwords of any site I've ever accessed from work. I use my phone (NOT over wi-fi) for my personal email or anything else I don't want them reading.