I'd be tempted to make a tiny mobile app with Firebase just to receive notifications, though I'm not sure how you'd get it past Apple reviewers.
Sending a broadcast message with GCM or APNs is free. Sending SMS is relatively expensive.
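To make the cost point concrete, a broadcast to every subscribed device is a single API call. Here's a rough sketch against FCM's legacy HTTP API (FCM being GCM's successor); the topic name and server key are placeholders, not anything Bugalert actually uses:

```python
import json

# Endpoint for FCM's legacy HTTP API (successor to GCM).
FCM_ENDPOINT = "https://fcm.googleapis.com/fcm/send"

def build_broadcast(title: str, body: str, topic: str = "bugalerts") -> dict:
    """Build one message targeting every device subscribed to a topic."""
    return {
        "to": f"/topics/{topic}",          # topic fan-out: one POST, all devices
        "notification": {"title": title, "body": body},
        "priority": "high",                # ask the OS to wake the device
    }

payload = build_broadcast("Critical vuln", "Drop everything and patch")
# Actually sending it is one authenticated POST, e.g. with requests:
#   requests.post(FCM_ENDPOINT, json=payload,
#                 headers={"Authorization": "key=<server key>"})
print(json.dumps(payload))
```

One POST reaching an unlimited audience for free is why push beats SMS here: Twilio-style SMS pricing is per message, per recipient.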
An infosec pro fed up with having to follow tedious Twitter accounts to stay on top of cybersecurity developments has set up a website that phones you if there's a new vuln you really need to know about. Bugalert, founded by product manager Matt Sullivan, is a crowdsourced venture that he hopes will take the pain out of trying …
I used to be "On Call" in a big multinational telecom company some years ago.
They would give you bonus + expensive equipment to receive the calls and do the job.
Unfortunately one of the engineers once left the company and took the equipment and tools to the adversary.
Management got really angry.
Now they need to make an app that will read all these notifications and "take care" of them.
What usually happens is that when there are too many of those and most of them are irrelevant, the engineer sets up a filter and gets them to land in a folder, bypassing the main mailbox. Ignorance is bliss.
When something hits the fan? "We get so many alerts, we must have missed that one".
Exactly this :(
This morning my 'Alerts' folder had 261 alerts, ranging from server down, system down, HA unavailable, latest vulnerability patch from Aruba, cert expiration warnings and so on and so forth.
Ops doesn't get a say in 50% of how these alerts are set up. Random project team sets up a new system, creates alerts... not sure which ones they should use... *shrug*... select all, assign to Ops team distribution list...
Been a sys eng for 18 years now and I just let alerts wash over me like a wave. No chance to review them, definitely no chance to deal with them, no chance the management layer will agree to more sys engs on my team..
So I just add a line to our Risk register every 3 months saying 'not enough resources, can't cover all vulnerabilities'.
Management reviews the register every 3 months and I never hear a thing back.
Oh well..
That's being set up for failure and low morale. Nobody should be put in that position.
Treat yourself better than they do; offer a solution (staffing) with a "there aren't worse jobs out there" or "I can't watch the place burn down anymore, it's too hot in here" type notice. You deserve better.
One important piece of context missing from the article is that alerts are only sent for vulnerabilities bad enough that you should literally be waking up out of bed to handle them. We're talking one to two alerts a year. More info: https://mattslifebytes.com/2022/01/04/bugalert-org/
I understand the intent, but the issue is determining which vulnerabilities merit inclusion.
The recent Patch Tuesday summary included the following number of issues rated critical:
Microsoft - 9
SAP - 1
Adobe - 16
Mozilla - 9
Android - 1
Which of these critical issues are severe enough to wake people up, and how will that be determined?
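Whatever the answer, it has to be more than a raw CVSS cutoff, or you'd be paging people dozens of times a month. A minimal sketch of one possible policy (the threshold, the exploitation flag, and the non-Log4Shell entry are made-up illustrations, not Bugalert's actual criteria):

```python
# Hypothetical paging policy: near-max CVSS score AND active exploitation.
# The 9.8 cutoff and the "example-medium" advisory are invented for
# illustration; CVE-2021-44228 (Log4Shell) really was a 10.0 exploited
# in the wild, which is the sort of event Bugalert describes.
PAGE_THRESHOLD = 9.8

advisories = [
    {"id": "CVE-2021-44228", "cvss": 10.0, "exploited_in_wild": True},
    {"id": "example-medium", "cvss": 7.5, "exploited_in_wild": False},
]

def should_wake_someone(adv: dict) -> bool:
    # Score alone is too noisy (see the Patch Tuesday counts above);
    # evidence of in-the-wild exploitation is the usual tiebreaker.
    return adv["cvss"] >= PAGE_THRESHOLD and adv["exploited_in_wild"]

pages = [a["id"] for a in advisories if should_wake_someone(a)]
print(pages)
```

Even a rule this strict still leaves a human judgment call for the score and the exploitation evidence, which is presumably where the crowdsourcing comes in.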
I wrote a large web/db program that emailed devs when bugs/problems were found. When the 'too many bugs emailed' bug was found, we designed a dynamic filter for it so only the important bugs got emailed. The bugs/problems were all stored in the DB (unless the DB was the problem, in which case the emails went out!) and made fascinating reading and analysis, and really helped improve the way we approached almost anything.
Tom's first law of computing: laziness is the mother of invention. If two days of coding can save you 3 days of shit, do it. 'Cos that 3 days is going to become 3 days a week soon.