Duh?
We have added security measures, including the introduction of encryption
You gotta be kidding.
Evaluating the security of IoT devices can be difficult, particularly if you're not adept at firmware binary analysis. An alternative approach would be just to assume IoT security is generally terrible, and a new study has shown that's probably a safe bet. In a paper distributed last week through preprint service ArXiv, …
Their primary concern is harvesting as much of your personal data as they possibly can.
I disagree; most of these are hardware manufacturers. Their priority is always (and only) speed to market and low cost. The thin line between success and failure is keeping R&D, product development and the bill of materials as low as possible. Designing for long life is rarely profitable, and creating high-quality software is likewise a cost that doesn't lead to better selling prices or greater sales volumes.
Software in particular is an unwanted complexity that manufacturers have to take on for these types of products, and software maintenance is something they might once have read about. As a hardware maker they'll struggle to attract top software talent, the time and budget for both development and testing are somewhere approaching nil, and there's a good chance they'll contract the software out to whoever promises to do it cheapest. The money they'll make from data is totally insignificant, and they know this. This is why software and UX/UI stink on most software-attached-to-hardware, from digital alarm clocks through TVs and white goods all the way to cars. The only real exceptions are where the software is done by a dedicated software house big enough to order the hardware makers around (e.g. Google).
Now for Google IoT devices, it would certainly be about grabbing as much personal data as possible, but that's a different case altogether.
That is why it's called Surveillance Capitalism.
Upvote for the use of Surveillance Capitalism.
I'm hoping my friends will pay attention when I point them in the direction of this book:
https://profilebooks.com/surveillance
rather than screaming at them 'YOU'RE DOOMED, ALL DOOMED!'
Having looked at the paper, things appear to be even worse than the summary in the article suggests. For example, having hardcoded encryption keys implies that there is actual encryption involved, and even the introduction in the paper makes some comments on how it might be possible to use clever techniques to try to figure out where a key might be held and how to reverse engineer it. They then go on to give a detailed analysis for the one app out of the four chosen for analysis that actually had any encryption at all... only to reveal that the "encryption" in question is actually just a Caesar shift and the "key" is simply the "x" in "ROT-x". Yes, a modern app with all the power of modern mathematics and computers to do encryption actually uses a technique that could be trivially broken by hand over 2000 years ago. With that level of cryptography in play, the fact that the key is hardcoded is far from the biggest problem.
So sure, this research could be seen in a good light as showing that 50% of IoT devices are actually somewhat secure. But don't count on them remaining secure if anyone born between Ancient Rome and now is able to look at them. Just because cryptography is technically present does not mean an app is in any way secure.
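To make the "trivially broken by hand" point concrete, here's a quick sketch (mine, not anything from the paper) that breaks a ROT-x ciphertext by simply trying all 26 possible shifts and letting you eyeball which one reads as English. The ciphertext is a made-up example.

```c
#include <ctype.h>
#include <stdio.h>

/* Apply a Caesar shift of 'shift' positions to letters only. */
static void rot(const char *in, int shift, char *out)
{
    for (; *in; in++, out++) {
        if (isupper((unsigned char)*in))
            *out = 'A' + (*in - 'A' + shift) % 26;
        else if (islower((unsigned char)*in))
            *out = 'a' + (*in - 'a' + shift) % 26;
        else
            *out = *in;
    }
    *out = '\0';
}

int main(void)
{
    const char *ciphertext = "Dro zkccgybn sc cgybnpscr"; /* made-up example */
    char guess[128];

    /* Only 26 possible keys: print every candidate and pick the readable one. */
    for (int shift = 0; shift < 26; shift++) {
        rot(ciphertext, shift, guess);
        printf("ROT-%2d: %s\n", shift, guess);
    }
    return 0;
}
```

Shift 16 turns that example back into plain English, and even doing this on paper only takes a few minutes.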
Cryptography is only there to tick a marketing requirement and allow them to print "Encrypted Communications: Military Grade Safety!" on the product's box.
The fact is, nobody can say (in court) there is no encryption, and the marketing blurb never said anything about effective encryption, so there is no actual problem for them to solve.
(Besides it won't bother the kind of innocents who buy this stuff: "Uh, encryption, isn't that what spies use? Why would I need it?")
yeah and ROT-13 if you want a symmetric "key"
Ancient code methods included tattooing a secret message on a slave's head, letting his hair grow, then sending him out to deliver the message. Early invisible ink made from lemon juice is another. But these are both examples of steganography, not actual crypto.
The wikipedia page on History of Cryptography mentions the 'Atbash cipher' from around 2500 years ago, where you take an alphabet and map it to its reverse [rather than ROT-x]. So A becomes Z, B becomes Y, etc. until M and N swap for one another. Apparently use of this cipher even made it into the Bible, where 'sheshach' is an 'atbash' of 'Babylon' in Hebrew (other examples also exist, and some are on the wikipedia page for 'Atbash cipher').
As for military grade ROT-x, that was the 'Caesar Cipher' mentioned earlier, that according to the wikipedia page, was used by Julius Caesar in his private correspondence.
(so yeah some references for all o' this, you're welcome, a few seconds in a search engine and some entertaining reading)
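And since the Atbash mapping is simple enough to show rather than describe, here it is in a few lines of C (my own illustration, nothing from the article or the paper):

```c
#include <ctype.h>

/* Atbash: map each letter to its mirror in the alphabet (A<->Z, B<->Y, ...).
 * The cipher is its own inverse, so the same function encodes and decodes. */
static char atbash(char c)
{
    if (isupper((unsigned char)c))
        return 'Z' - (c - 'A');
    if (islower((unsigned char)c))
        return 'z' - (c - 'a');
    return c;
}
```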
Although admittedly the Kasa encryption algorithm is rubbish, it's clearly not the Caesar cipher. The code illustrated shows that a starting key is XORed with the first data byte, the result is XORed with the next byte, and this is repeated iteratively to the end of the message. Consequently the key changes for each byte of the message. Not that this helps much, but it would be a little bit harder to crack unless the code were known.
For reference, the Caesar cipher implements a fixed shift throughout the message, so every character is substituted by the character n places distant. The only non-obvious element is the keyword (if used) which scrambles the letter order of part of the template alphabet.
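Based on that description (the starting key XORed with the first byte, and each result reused as the key for the next byte), a sketch of the scheme looks something like this. The initial key is just a parameter here; I'm not asserting anything about whatever constant the real firmware uses.

```c
#include <stddef.h>
#include <stdint.h>

/* Autokey XOR as described above: each ciphertext byte becomes the
 * "key" for the next plaintext byte. Encrypts in place. */
static void autokey_xor_encrypt(uint8_t *buf, size_t len, uint8_t key)
{
    for (size_t i = 0; i < len; i++) {
        buf[i] ^= key;     /* cipher byte = key XOR plain byte */
        key = buf[i];      /* next key = this cipher byte      */
    }
}

/* Matching decryption, also in place. */
static void autokey_xor_decrypt(uint8_t *buf, size_t len, uint8_t key)
{
    for (size_t i = 0; i < len; i++) {
        uint8_t c = buf[i];
        buf[i] ^= key;     /* plain byte = key XOR cipher byte   */
        key = c;           /* next key = the cipher byte we read */
    }
}
```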
The code illustrated shows that a starting key is XORed with the first data byte, the result is XORed with the next byte, and this is repeated iteratively to the end of the message.
So it's amateur-grade CBC with an 8-bit block. Trivially broken. In some ways this is worse; it smacks of a little knowledge, in the Popean sense.
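To spell out "trivially broken": in that construction the "key" for every byte after the first is the previous ciphertext byte, which any eavesdropper already has. Everything except the first byte decrypts with no key material at all, and the first byte falls to at most 256 guesses. Rough sketch:

```c
#include <stddef.h>
#include <stdint.h>

/* Recover the plaintext of an autokey-XOR message without knowing the key.
 * Byte i (for i > 0) is just cipher[i] ^ cipher[i-1]; byte 0 is left as a
 * placeholder, since it needs at most 256 guesses of the initial key. */
static void break_autokey_xor(const uint8_t *cipher, size_t len, uint8_t *plain)
{
    if (len == 0)
        return;
    plain[0] = '?';                        /* unknown: brute-force 256 keys */
    for (size_t i = 1; i < len; i++)
        plain[i] = cipher[i] ^ cipher[i - 1];
}
```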
End-to-end encryption in small-footprint devices may be more difficult than perceived...
Unless they have ARM processors instead of microcontrollers, with significant NVRAM space for encryption libs, and so on, the libraries just won't fit on the devices, and the algorithms are likely to run too slowly anyway.
What needs to happen for this to work is some help on the silicon end, like hardware networking with built-in SSL support. There is the ENC624J600 (which was around $4 on Digi-Key's site) for Ethernet, but I don't know of any WiFi bolt-on systems with similar built-in SSL capability. They're obviously NOT "the cheaper solution" on the hardware end. Supporting SSL and related protocols needs quite a bit of firmware.
As for phone applications and the cloudy IoT servers, someone should be PUNISHED if NOT using proper encryption and security at THAT level... it's too obvious of a need, and too easy to integrate. NO EXCUSE!
/me points out that Android has a free library or two (probably built in, probably uses webkit and/or OpenSSL) that handles 'https' requests, and I'd bet iOS does, too. For desktops there's OpenSSL.
I've seen work on tiny cryptography since the 90s (sci.crypt on USENET was very busy with it, for example), and I'd be surprised if all the work on it vanished in the mean time.
In fact, searching on "tinycrypt" found me Intel's contribution (github link here), and it seems that's not the only library that's used the tinycrypt name.
Now it could be argued that even adding this would be an intolerable burden to the manufacturer, but given all the features that a "smart" device is supposed to have, I think it would be worth adding this to the list of must-haves.
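For a rough sense of how little code is involved on the caller's side, here's what a single AES-128 block encryption looks like with Intel's tinycrypt. The function names are as I remember them from the library's headers, so check before relying on this, and the key and data below are made-up values purely for illustration.

```c
#include <stdint.h>
#include <stdio.h>
#include <tinycrypt/aes.h>   /* tc_aes128_set_encrypt_key, tc_aes_encrypt */

int main(void)
{
    /* Illustrative key and block only; a real device would provision a
     * per-device key, not a constant baked into the firmware. */
    const uint8_t key[16] = {
        0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
        0x08, 0x09, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f
    };
    const uint8_t plaintext[16] = "smart plug data";
    uint8_t ciphertext[16];
    struct tc_aes_key_sched_struct sched;

    if (tc_aes128_set_encrypt_key(&sched, key) == 0)
        return 1;                          /* key schedule failed */
    if (tc_aes_encrypt(ciphertext, plaintext, &sched) == 0)
        return 1;                          /* encryption failed   */

    for (int i = 0; i < 16; i++)
        printf("%02x", ciphertext[i]);
    printf("\n");
    return 0;
}
```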
> the libraries just won't fit on the devices, and the algorithms are likely to run too slowly anyway
These devices do WiFi, and the WiFi link needs something like AES. If the hardware is capable of doing that, plus all the protocol gubbins for DHCP, DNS etc. etc. then also encrypting the payload is a small additional overhead.
Hardware limitations are not an excuse for lack of crypto. I've gotten AES-128 running rather well on PIC and AVR micros. Even the venerable 8051 can have a go. For an efficient implementation there is no need for the performance of an ARM.
Please visit: https://cr.yp.to/mac/8051.html
Because it's a lot easier to reverse-engineer Android apps than iOS apps. This is a definite weakness when you're burying hard-coded encryption keys inside an app, but the implication is that the iOS versions of many of these apps use hard-coded keys as well.
Given how much IoTrash seems to be Internet enabled for nothing more than the novelty value, I wish I could say I'm surprised. Not much thought is going to be given to security if the product is expected to be chucked in a drawer and forgotten about after a couple months.
We can be matter-of-fact about it: IoT is in a race to the bottom, at the expense of security and everything else.