How much effort does it take to copy an SQLite database file?
A program we had written by contractors (of course) has data in an SQLite file on an embedded system; this needs to be transferred, in its entirety, over a LAN to a desktop PC. Instead of using anything tricky, like (T)FTP, it made far more sense to them to:
SELECT every record, extracting the fields one by one into temporary variables; convert some to strings; concatenate these strings together; copy the lot onto the end of a temporary buffer (re-allocating said buffer every time it fills); pass it to a home-grown network layer that copies the data into yet more new buffers; rinse and repeat. Finally, a lump of data is sent over the LAN for the receiver to reverse the whole process and create a shiny new SQLite file.
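For contrast, SQLite ships an online backup API that produces a consistent, page-by-page copy of the whole database file in a handful of lines, after which any boring old file-transfer mechanism (scp, TFTP, HTTP, ...) will do. A minimal sketch in Python's stdlib `sqlite3` module (the table and file names here are made up for illustration):

```python
import sqlite3

# Stand-in for the real database on the embedded system.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
src.execute("INSERT INTO readings (value) VALUES (42.0)")
src.commit()

# One call copies the entire database, page by page, into a new
# file -- consistently, even if other connections are writing.
dst = sqlite3.connect("copy.db")
src.backup(dst)
dst.close()
src.close()
# "copy.db" is now a complete SQLite file, ready to be shipped
# over the LAN as-is.
```

The C equivalent (`sqlite3_backup_init` / `sqlite3_backup_step` / `sqlite3_backup_finish`) is barely longer, and on an idle database a plain file copy works too.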
Cold-reading the code and estimating the amount of data being manipulated in the sender, I gave up when approximately 100 KiB of SQLite file had already turned into multiple TiB of memcpy()s. I thought I'd made some obvious mistake in the cold reading, but running this, you are shown a "Loading..." message for a minute or so, in the middle of which is a very brief burst of LAN traffic...
To be fair, I haven't checked some of the finer details; perhaps the sender and receiver use different SQL schemas and this seemed like a good way to get the data re-arranged...