Pupil mental health monitor promises app rewrite after hardcoded login creds discovered

A Brit biz whose mobile apps monitor the mental state of 35,000 British schoolchildren is having to rewrite them after researchers found hardcoded login credentials within. "Tracking steering biases is a pioneering technique developed by STEER using AI to identify patterns of bias linked to mental health risks in 10,000 test …

  1. deive

    This is getting beyond contemptible now.

    This should be programming 101.

    It reflects so badly on the rest of us who dedicate our time to learning and doing things correctly.

    1. macjules

      Sloppy coding and deployments are run of the mill now, despite there being an abundance of security checklists for such common requirements as how to harden response headers for Varnish or Cloudflare caching, or how to set up SSL certificates. Lots of companies (TCS and Accenture, to name just two) often fail dismally at checking whether a deployment from Test/UAT to production has removed development credentials etc.

      This is usually coupled with a reluctance to pay the extortionate fees required for full penetration testing.

      1. steviebuk Silver badge

        And because the managers just "want stuff done now" for marketing and/or release, "we have to hit this made-up target", despite being told time and time again it won't look good if there is then a security issue. "Fuck security. Just get it done. I want my bonus for hitting bullshit targets". And because those bonuses aren't then clawed back when a massive security issue is revealed, it will continue.

    2. GnuTzu

      That company needs a program to monitor its own mental health.

      Social services should declare them unfit for looking after children.

      Anyway, if nurturing is going to be done by computers in the future, what future is there--especially when those computers are likely to be IoT with apps?

    3. JimboSmith Silver badge

      I had to explain to somebody recently that it didn't matter how good the encryption was if the password was easy to guess, e.g. "password1", or hard-coded into the system. It took a while but eventually they got it.
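The point above can be sketched in a few lines. Hardcoded credentials are just printable bytes in the shipped binary; the filename, username and password below are invented for illustration, but the extraction mirrors what running `strings app.apk | grep pass` would do to the real thing.

```python
# Simulated compiled app containing an embedded password
# (all values here are made up for illustration):
app_bytes = b"GET /login\x00user=steer_admin\x00pass=Hunter2!\x00"

# Recovering it needs no cryptanalysis at all - just split on the
# non-printable separators and look for interesting strings, which
# is roughly what the `strings` tool does:
found = [s.decode() for s in app_bytes.split(b"\x00") if b"pass" in s.lower()]
print(found)  # -> ['pass=Hunter2!']
```

No amount of transport encryption helps here: the secret ships to every device in the clear.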

  2. Wellyboot Silver badge

    spouting the obvious here

    >>>we started an investigation together with our third-party developers.<<<

    Said third party being the cheapest they could find?

    1. katrinab Silver badge

      Re: spouting the obvious here

      Or the one that provided the best fringe benefits?

      For example, elsewhere on El Reg there are articles about The Incredible Sulk's choice of IT consultant.

      1. Mephistro

        Re: spouting the obvious here

        I wouldn't be too surprised if the "third party developers" were the ones giving money to Steer.

        I mean, there's a smorgasbord of monetizable data, either by selling it to governments, or to criminals, or to multinationals. For some unknown reason the separation between these three categories seems to blur noticeably every month. Sigh...

    2. Anonymous Coward
      Anonymous Coward

      Re: spouting the obvious here

      Evaluation of third party:

      Category (Weighting)

      Functionality (5%)

      Compliance with legal requirements (1%)

      Audit (0.5%)

      Support (0.5%)

      Cost (93%)

  3. David Bird

    Every company after a data leak

    Profitability and bonu ... sorry, we mean data privacy and security are our top priorities.

    1. MarkItZer0

      Re: Every company after a data leak

      If privacy and security were absolute priorities, then this wouldn't have happened.

      I'm a pentester and it reminds me of a client who believed it wasn't possible to develop secure software, and that the only possible method of making something secure was to give it to the pentesters once complete. He was moaning about the fact that we kept finding things on each retest, so we should only test the exact things found last time, so that they could get a clean report.

      1. o p

        Re: Every company after a data leak

        And YES, if professional pentesters reveal only a fraction of the issues at a time, so that they are certain to find and bill for new ones at the next audit, it does not help to establish trust and respect for the profession, and it certainly discourages business.

        1. MarkItZer0

          Re: Every company after a data leak

          Of course never on purpose... However, you will probably know that pentesting engagements are extremely limited in time and scope. Therefore, if we find, say, XSS in a few locations, we report those with examples, but won't go around the site finding every single instance. The client, with access to their own code base, is better positioned to do that.

          Time is better spent finding other vulnerabilities imo. Better a report that finds 10 different things than 1 thing but highlights every instance. Often these vulnerabilities are found manually rather than by automated scanners.

          Pentesting engagements are on average 4 days including report, so not everything can be uncovered in that time, especially on a system resembling Swiss Cheese. We don't know how much we'll find until the testing window starts you see.

          1. Joe Montana

            Re: Every company after a data leak

            The problem is that if you report systemic XSS and give one or two examples, the client typically fixes the examples and ignores the wording telling them to check the rest of their code and implement something robust. In many cases the fixes will also be very poor - for instance, I've seen a report where the example was a typical alert box containing the string "XSS", and their "solution" was to check for that exact injection string.

            Plus you get other "fixes" where people completely fail to understand basic security concepts, so you find a bug like xss or whatever - their "solution" is to encrypt the form data in javascript first because encryption is the answer to everything... Nevermind that the attacker controls the client and can therefore encrypt whatever payload they want too.
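The "check for the exact injection string" anti-pattern above is easy to demonstrate. A minimal sketch with a deliberately naive toy filter (invented for illustration, not any real product's code):

```python
import html

# Toy "fix" that blocks only the exact payload from the pentest report:
def naive_filter(value: str) -> str:
    return value.replace("<script>alert('XSS')</script>", "")

# The reported payload is blocked...
assert naive_filter("<script>alert('XSS')</script>") == ""

# ...but the most trivial variation sails straight through:
payload = '<script>alert("XSS")</script>'  # double quotes instead of single
assert naive_filter(payload) == payload

# The robust fix is to escape on output, not blocklist on input:
print(html.escape(payload))  # -> &lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;
```

Escaping at the output boundary handles every variant, which is why reports say "fix the class of bug", not "block this string".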

  4. Chris G

    Our third party developers

    Meaning 'we don't really know what's in there, but can we sell it?'

    One would think that schools like Charterhouse and Fettes could afford a pupil counsellor.

    1. Anonymous Coward
      Anonymous Coward

      Pupil counsellor

      I'm sure they do. Even state-funded secondary schools have some arrangement in place. But sometimes children won't go to see them (perceived stigma with their peers, or they don't get on with the person available), and sometimes the school counsellor will sign them off but then some weeks later there is a problem again and nobody knew about it. But there is no reason anybody should suspect what's going on if a child is fiddling with a phone. An app would provide another way of alerting somebody about a child that's struggling.

      A quick look at the STEER web site suggests a school makes the app available to all pupils and can then identify those that may be at risk of mental health problems. From experience I know that the sooner things like eating disorders and self-harm can be treated, the easier they are to deal with. When my daughter was being treated for an eating disorder and we informed school they were very surprised that she was having a problem - teenagers (and, it seems, particularly girls) can be very good at masking these things.

      Anon for what I hope are obvious reasons

  5. Anonymous Coward
    Anonymous Coward

    Insert psychobabble

    “Cognitive-affective heuristic biasing contributes to successful navigation of epistemically varied tasks in secondary school”

    “Implications are posited for how we understand the relationship between errorful knowing and wise action”

    “Containing the emotional dysregulation”

  6. Pascal Monett Silver badge

    As usual, once they are found out, they take security seriously

    "Data privacy and security are Steer's absolute priority"

    Yes, now it is, because you realize just how much your reputation is fucked. That said, you apparently only had the realization after El Reg had to shove it up your nose.

    There is no excuse for hard-coding credentials in an application, and I don't care that the account has been disabled. Those credentials should never have been coded in the first place.

    I will STEER well away from your applications in the future.

  7. Claptrap314 Silver badge

    Won't someone

    think of the children?


  8. Anonymous Coward
    Anonymous Coward

    Leave it in

    Am I really the only one who likes the idea of a login that can trigger the "everything from this IP needs detailed logging ", direct it to the honeypot, send out the message "live security training, bring snacks"
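For what it's worth, the idea sketches easily: treat the leaked credentials as a canary. Everything below (the credential values, the logging hook, the "honeypot" routing) is invented for illustration, not anything Steer actually does.

```python
import logging

# Canary: the known-leaked credentials, left on file deliberately.
CANARY_CREDS = {("hardcoded_user", "hardcoded_pass")}

def check_login(username: str, password: str, source_ip: str) -> str:
    # Nobody legitimate uses the leaked pair, so any hit is an attacker:
    if (username, password) in CANARY_CREDS:
        logging.warning("canary creds from %s - enable detailed logging", source_ip)
        return "honeypot"  # route them to the decoy, send the "bring snacks" memo
    # ... real authentication would happen here ...
    return "denied"

print(check_login("hardcoded_user", "hardcoded_pass", "203.0.113.5"))  # -> honeypot
```

The trick only works if the canary account is fully severed from real data, of course, which is rather the point of the rewrite.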

  9. liz_b

    Rated 1.7

    With lots of comments on the failed security, I took a minimal look at the information on the program itself. It advertises itself as being able to track the social environment of the entire school and identify at a glance pupils who need help. Users mostly rated this app with one star on the proprietary marketplace for Android, and the sole five star review is obviously written by a mature adult, as opposed to a user.

    I've spent a lifetime in "health", from before questionnaires for mental health became fashionable, and I am well aware of the failures of the format when trying to assess mental health.

    I'm still tempted to download it and trial those passwords and see what's inside, because it sounds like "1984".

  10. 7teven 4ect

    Another win for Shi$iTeth7ae 7esting

  11. Anonymous Coward
    Anonymous Coward


    Strong password. Stored in plaintext in the app.

  12. pavel.petrman

    Re "the security needs to be as robust as the science"

    I wouldn't bet on the latter either. Just because even fewer people understand it than understand basic sanity in software development doesn't mean there is some inherent robustness to it.

    "Data require (...) algorithm to interpret it" - not this again. Something like my XLS files needing a separately stored Excel to interpret them?

  13. Tom 7

    Good security is an anathema to management practices.

    I did try to implement a 'security from the bottom up' system. You couldn't access any data, or any app that could access that data, without permission, the permissions being devised by the owners of the data etc.

    Turned out management didn't actually want any security really - they were more interested in finding out what other parts of the company were doing well so they could absorb them and claim their commission cut.
