US small biz loan system bans software robots. The lesson? Make sure IT knows about any automation projects

There are times when more automation is not necessarily a good thing. Just ask the US Treasury or the country's Small Business Administration (SBA), which banned a few software bots last week. The two government agencies have been working on a "Paycheck Protection Program" to hand loans out to smaller American businesses to …

  1. Boothy Silver badge
    Mushroom

    Always involve IT, even if it's just as an FYI at the start of a project

    Quote: "The lesson for anyone thinking of deploying RPA is that they must involve IT in projects early on. Business teams thinking they can use RPA to get automation without troubling IT will find it is a false economy, said Neil Ward-Dutton, VP of AI and automation practices at IDC."

    This goes for many other things, not just for RPA.

    Oops, this ended up longer than expected! Sorry.

    Not RPA, but in a similar vein. A good few years ago (early 2000s), the company I worked for did a project without our (IT) knowledge. I was one of the techies looking after what was basically an Integration platform. We took in customer data in various formats (CSV, EDIFACT, XML, TRADACOMS) that came in via Internet FTPS or dial-up (UUCP and Kermit!), and converted all this to formats the internal systems could handle: XML for the newer stuff (at the time), good old fixed-width for the IBM Mainframe we had.
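
    (To give a flavour of the sort of conversion involved, here's a toy sketch; it's nothing like the real platform, and the field layout and names are invented:)

        import csv
        import io

        def csv_to_fixed_width(csv_text):
            """Toy example: turn CSV order lines into the kind of
            fixed-width record a mainframe feed expects (made-up layout:
            10-char account, 8-char part, 5-digit zero-padded quantity)."""
            records = []
            for row in csv.DictReader(io.StringIO(csv_text)):
                records.append(
                    f"{row['account']:<10.10}"
                    f"{row['part']:<8.8}"
                    f"{int(row['qty']):05d}"
                )
            return "\n".join(records)

        print(csv_to_fixed_width("account,part,qty\nACME01,WIDGET,42\n"))
        # -> "ACME01    WIDGET  00042"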

    The peak started mid afternoon and ended late evening (end-of-day stuff), as the majority of the clients (with a very few low-volume users as the exception) did everything as batches at the end of the day, hence the peak being when it was. So we'd typically get just one or two files from each customer, but each file would contain hundreds to hundreds of thousands of data items.

    As such the platform was tuned for batch processing, and all the internal transfers and back-end systems were set up the same way, the expectation being a few files per customer, but with many records in each file.

    Anyway, my employer had outsourced some of its IT to a certain company whose name begins with the letter 'F', and they'd designed a web app a year or so earlier that customers could use instead of building their own system (aimed at small and mid-sized customers). This system simply appeared as just another customer to us; the fact there were hundreds, or thousands, of customers behind it didn't really matter to us. They batched up all the customer data together every 30 minutes or so, and sent that through to us. All was happy with the world.

    The 'Business' decided they didn't like the added latency from the batching being done on F's web platform (as it delayed when the data turned up on back-end systems), so they asked them to change it to near-real-time processing instead. This they implemented one weekend, without informing anyone in IT or any service managers or owners.

    Everything was fine till about 10:30 on Monday morning, when one of my colleagues noticed there was some lag between data arriving on our system and when it hit any back-end system, and this lag was getting gradually worse.

    I jumped onto the UNIX platform where it lived, had a look around, and found that a working directory used during the batching-up process had something like 1,000,000 tiny files in it, when we'd only expect to see a few hundred larger files, at most.

    We eventually figured out the web interface created by F had been updated to generate a 'batch' file for every single item of data being generated (millions a day).

    Worth noting at this point that each batch file had two headers and a trailer. So, as an example, whereas a single batch of 1,000 items of data would have had 1,003 records in total in one file, this now meant we had 1,000 individual files, each with 4 records, as each needed its own headers and trailer! Resulting in 4,000 records to process, rather than the original 1,003. (This also broke the Mainframe, as it created jobs based on headers, so 1 old job became 1,000 new jobs; the MF team were also not happy!)
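
    For anyone who wants the overhead spelled out, here's a quick back-of-the-envelope sketch (the two-headers-plus-one-trailer framing is as described above; the function itself is just mine, for illustration):

        import math

        def records_needed(items, batch_size):
            """Total records = data items, plus 3 framing records
            (2 headers + 1 trailer) per batch file."""
            files = math.ceil(items / batch_size)
            return items + 3 * files

        print(records_needed(1000, 1000))  # one big batch file -> 1003
        print(records_needed(1000, 1))     # one file per item  -> 4000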

    Basically we ended up (in total, with other customer data) with something like 100 times more files than expected, plus around a threefold increase in overall volume of records. The Integration platform was already running at around 95% utilisation during peak hours (about 4 hours a day); the poor system didn't stand a chance. (It was still working, just not fast enough to keep up, and the more the backlog increased, the slower it got!)

    The only initial workaround was to close down the feed from F's system, move all the backlogged data out of the way, and allow new stuff to come through at its regular speed. We also manually pushed through all the data from other customers (as this was still in batches), so at least it only impacted this one source (although this one source accounted for a large portion of daily volumes). It was late afternoon by the time we got this done.

    The 'Business' had to go back to F and get them to back out the change, as none of the systems could cope with it. Massive egg on their faces, at least internally, as it turned out they'd been selling this near-real-time service for a while and this was their grand launch! No doubt we (IT) probably got blamed behind our backs for it not working!

    All they needed to do was ask one person in my team, "What would happen if we changed the batches to this?", and anyone on the team could have predicted the outcome with ease, saved everyone all the wasted time, effort, lost revenue etc., and perhaps even come up with a solution for their business need!

    About a month later the change was re-implemented, this time after engaging with our team, where we developed a solution and even tested it before go-live! Worked perfectly the second time round. Go figure!

  2. d3vy Silver badge

    I was working on an RPA project a few years ago where the target of the automation got wind of what we were doing and said that it was against their TOS.

    We pointed out to them that our options were RPA with about 25 concurrent sessions hitting their service 24 hours a day, or we were going to have to supplement the team we had doing the task manually (85 people at the time) with a few hundred offshore workers, which would mean 300+ concurrent sessions.

    A balance was struck where we implemented RPA, but with some small delays built into the process to make it more manageable from their side, while still being faster and cheaper for my client than hiring a bunch of humans to do the job.
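
    For the curious, the throttling was nothing fancier in spirit than something like this (a hypothetical sketch, not our actual code; the pause lengths are made-up numbers):

        import random
        import time

        def run_throttled(step, min_pause=2.0, max_pause=5.0):
            """Run one automation step, then sleep a randomised
            interval so the target service sees a steadier, gentler
            request rate than a flat-out bot would produce."""
            result = step()
            time.sleep(random.uniform(min_pause, max_pause))
            return result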

    The headline holds true - speak to IT before implementing anything!

    1. Boothy Silver badge

      Out of curiosity, did the target organisation not provide an API? Seems odd (to me) that if they were expecting this level of user input, they wouldn't provide an API.

      Or if they did provide an API, why ever use RPA in the first place?

      My general thought would be: always use an API; no API, we take our business elsewhere. But of course context is everything!

      1. d3vy Silver badge

        "Out of curiosity, did the target organisation not provide an API? Seems odd (to me) that if they were expecting this level of user input, that they'd provide an API instead."

        That's the beauty of the sector my client works in... the systems they needed to interface with are basically monopolies (regional utility companies).

        They created the software years ago (in one case I was using RPA to work with a site that claimed to require IE5.5).

        These companies don't make a profit from this software; they have to provide it for regulatory reasons. That doesn't mean they need to provide an API, and an API would cost them money, so the majority just don't provide anything other than a half-arsed web page.

        "Or if they did provide an API, why ever use RPA in the first place?"

        Quite. The system that we created was capable of interacting with APIs where available, but out of the 20+ utility companies that we were interacting with, only one had an API, and even then it was in beta and not suitable for actual usage.

        So, yeah, if there were a competitor available with an API that would be great; the reality is there were no competitors that we could have used. Still, can't complain: it's quite a niche area and it pays quite well when the work comes around :)

    2. Dvon of Edzore

      Did something similar, with the same answer. Needed to query a supplier's massive order system to see if a preferred product was now available to replace less-preferred items in a parts order. In theory there was a built-in function to do this, but it was run during off hours, so items might sell out before the batch ran. Had to write a terminal emulator and screen scraper to handle odd formats built for human use, matching up available inventory with a list of preferred and unacceptable choices. Fullscreen pages were often drawn somewhat randomly, not line by line, so I had to maintain a virtual screen in order to read the result. (Before you ask, the only API was for their suppliers, not customers, and it expected to talk mainframe to mainframe. Everyone else pretended to be, or actually was, a teletype.) Successfully simulated the fingers on the keyboard and received no complaints from the big supplier. The program ground away happily until a merger with their biggest rival ended a lot of things.
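
      The virtual-screen trick boils down to something like this (a minimal sketch in modern Python rather than whatever I actually wrote back then, assuming the cursor-addressing escape sequences have already been decoded; the field positions are invented):

          ROWS, COLS = 24, 80

          class VirtualScreen:
              """In-memory copy of the terminal display, so fields can be
              read after the host has painted the page in whatever order
              it likes."""

              def __init__(self):
                  self.grid = [[" "] * COLS for _ in range(ROWS)]

              def write_at(self, row, col, text):
                  # Apply one cursor-positioned write to the virtual screen.
                  for i, ch in enumerate(text):
                      if col + i < COLS:
                          self.grid[row][col + i] = ch

              def read_field(self, row, col, length):
                  # Scrape a fixed-position field once the page settles.
                  return "".join(self.grid[row][col:col + length]).strip()

          screen = VirtualScreen()
          screen.write_at(3, 10, "QTY ON HAND: 42")
          print(screen.read_field(3, 23, 5))  # -> "42"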

      1. d3vy Silver badge

        @DVON

        Yeah, I've worked with quite a few "green screen" systems. Slap a web front end on it to let the users carry out the most common functions and you save a fortune in training them how to use a terminal.

  3. HildyJ Silver badge
    Flame

    Greed

    The RPA services are being used by banks to bypass SBA rules.

    SBA designed their system (quite rightly) for a truly small business with no legal or IT staff. It was intended to be used by a small business owner.

    Banks and corporations saw a money making opportunity with the COVID-19 relief bills and flooded the system with applications.

    If I were the SBA, I'd limit input to 60 characters per minute.

    1. Jim Mitchell
      FAIL

      Re: Greed

      The PPP was set up such that banks were the gatekeepers. Businesses could not apply directly, but had to go through a bank.

  4. Robert Not a Robot

    The Real Problem

    There are three actual issues that are not addressed here:

    1. Visibility and communication throughout the entire end-to-end process. While Neil is correct that you should and must involve IT to be successful, even if the lenders had spoken to their own IT departments (which the story doesn't say one way or the other), the issue is the overload of the SBA system, which is not actually part of the lending firm. This issue was created because they did not communicate across the entire supply chain, not because they didn't involve IT.

    2. Governance and Controls - Just because you can automate something does not mean that you should. You need to understand how, where, and why any form of automation is being performed within your process architecture, and what its impact will be on the process both upstream and downstream. Not doing so is tantamount to Robotic Process Anarchy, not Automation.

    3. Contradictions - It is interesting that all the RPA vendors essentially put the blame on the SBA. Again, Neil is correct that you should not implement RPA without engaging IT. However, that is a direct contradiction of how the RPA vendors typically sell their solutions: they promote the "citizen developer" and how little IT involvement is necessary. To start saying you need IT would slow their implementations and, more importantly, their sales cycles. In this case those lenders, like so many others, just bought into that line.

    This sounds like my next blog post...
