
ahh
the earlier story starts to make sense
Chinese drone maker DJI left the private key for its dot-com's HTTPS certificate exposed on GitHub for up to four years, according to a researcher who gave up with the biz's bug bounty process. DJI also exposed customers' personal information – from flight logs to copies of government ID cards – to the internet from …
Servercredentials.txt? Really!? You are just asking to be hacked. What you should do is call the file something more innocuous, like app.config, then further obscure the details by encoding them in XML.
Something like this is all you need.
<configuration>
  <connectionStrings>
    <add name="ProdDB" connectionString="Server=MyServer; Database=Prod; User Id=sa; Password=re@Lly5Af3" providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
Remember GitLab? That "we want to be like GitHub, but you'll have to pay us to keep your stuff safe" company which utilized 6 ("six"!) different backup strategies to keep your data safe, but never bothered to check on any of them, so in the end they found themselves empty-handed when they actually needed their precious backups?
I don't know about you, but all of a sudden they seem pretty harmless right now.
Because let's be honest: most of us have been there. The moment you notice that your backups are crap is the moment you actually need 'em.
But that really pales in comparison to what we've seen happening with AWS (and now GitHub) as of late. Don't the "IT professionals" these days understand the difference between public and private repositories anymore? Are they really so stupid that they don't realize that private keys – which are literally called that – should be kept private?
From the 'req' OpenSSL manual page:
-pubkey
outputs the public key.
-newkey arg
this option creates a new certificate request and a new private
key. The argument takes one of several forms. rsa:nbits, where
nbits is the number of bits, generates an RSA key nbits in size. If
nbits is omitted, i.e. -newkey rsa specified, the default key size,
specified in the configuration file is used.
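For what it's worth, the standard incantation generates the private key right on the box that needs it, so it never has to travel anywhere – let alone to a public repo. A minimal sketch, assuming a typical Linux box with OpenSSL installed (the hostname is obviously a placeholder):

```shell
# Generate a 4096-bit RSA private key plus a certificate signing request.
# -nodes skips passphrase encryption on the key; drop it if you want one.
openssl req -newkey rsa:4096 -nodes \
    -keyout server.key -out server.csr \
    -subj "/CN=example.com"

# Only the CSR goes to the CA. The private key stays put – lock it down:
chmod 600 server.key
```

The point being: at no step in that process does the private key need to exist anywhere a `git add .` could ever find it.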
How much more obvious does this information have to be?
Oh wait... do these guys actually read manual pages, or have they become too "special" for that?
And on that subject: do you really have nothing to hide anymore? If "IT professionals" are this careless with their own data, what do you think they'd do with data that doesn't really matter much to them? Yours, for example?
I may be a neophyte CA admin (I'm good with basic care and feeding and whatnot, but for anything super complex I call in someone who wears that hat day in, day out) and even *I* know that you guard your private certificate keys heavily, and restrict who has access to them.
And the NDA shenanigans? That's not surprising at all, given a few assumptions. (The small chunks of it that were in the PDF look a *lot* like 'schmuck bait' to me.)
Selling Boeing-botherers to the "up close and personal" wing of plane spotters has only so much potential; getting robotics out to militaries and governments has much greater data-gathering scope and a more reliable revenue stream.
Also saves having to WEEE-recycle unsold inventory – just flog it to a government (after the approvals process completes, obv.).
But a mistaken git add . is quickly done. Are there any good automation strategies to ensure that secrets don't get uploaded by mistake?
I know one approach is to keep passwords and credentials out of code and pull them from environment variables or special vaults. That's more of an if-you-code-correctly-then-it-won't-happen safeguard, but what about something that is automatically paranoid about what does end up in the uploads? Or watches over local repositories that have external remotes?
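One answer is a pre-commit hook that refuses the commit if the staged changes look like they contain a secret. Dedicated tools (gitleaks, or AWS Labs' git-secrets) do this far more thoroughly; below is only a sketch of the idea, and the patterns are illustrative rather than exhaustive:

```shell
#!/bin/sh
# Save as .git/hooks/pre-commit and make it executable.
# Scans only the STAGED diff, so noise already in history won't trigger it.
if git diff --cached -U0 | grep -niE \
    'password\s*=|BEGIN (RSA|EC|OPENSSH) PRIVATE KEY|aws_secret_access_key'
then
    echo "Possible secret in staged changes - commit aborted." >&2
    echo "If this is a false positive: git commit --no-verify" >&2
    exit 1
fi
exit 0
```

It won't catch everything (nothing pattern-based will), but it does turn "oops, pushed the key" into "the commit never happened", which is the failure mode you actually want.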