bash: Vi: command not found
DNS (or the Devil's Naming Service as we've heard it called) takes centre stage in this week's tale from the Who, Me? vaults: a warning of the terrors of the forgotten typo. A Register reader, "Hugo", shared today's story, which takes us back to the late 1990s, when the commercial internet was an optimistic glimpse of the …
There's nowt wrong with either Vim or nvi. Install both. And stevie. And elvis. Comes in handy when you have a need to automate stuff on various platforms. For example, Vim on Apple, stevie on Windows ... The basics work across all of them, but the details vary if you're trying to do anything out of the mainstream.
case-sensitive is the only true way
Sure, I can see why you might want Capitalised_Filenames, but why would you want both "file" and "File" to exist as separate files in the same directory? Surely that's just asking for confusion?
And even if you did*, why not use the Amiga way, of allowing file/program names to be case insensitive unless you have ambiguous options (eg the
f/F from above)?
(I bet I get more downvotes than sensible answers)
* and if so, I hope I never need to look for a particular file on one of your systems.
Of course I locked down services; it's just a pity I missed a ] in one edit. Took about 20 minutes to work out why the whole network was down, as it had taken a month for the error to hit. Took 5 to get it sorted, and I just held up my hands and put similar validation in place. God, I hate networking
I learned the hard way not to use local drive mappings in Robocopy scripts. I remapped the destination folder's drive letter to another server later in the day without thinking, and I also ran the Robocopy job as a scheduled task out of hours with no monitoring or oversight, and so didn't spot the mistake whilst it was running. Oops.
The new destination server filled up to the point of failure "only" about half an hour before I got back into work the next morning, so I was able to bring it back up relatively quickly. It still left egg on my face, though.
This is why my robocopy .CMD files are on the DESTINATION, usually in a "pull data from server" style instead of "let server push data".
This way I can use:
robocopy "\\Source-Server\some\thing" "%~dp0\backup" /e /purge
Even when two or three backup drives change their drive letter, the script will always copy the right source to the right destination since the drive letter doesn't matter any more.
And if the drive letter is totally out of whack you'll get a ".cmd not found" error - if you set up your task properly.
I designed an interface (web/serial) to a custom, secure, smart router product that checked everything at input time to stop any contention or misconfiguration and reported very clear help messages to the user (in this case probably a defense operative - think black suit and shades, or camouflage fatigues.)
The PHB (development) decided that the company was going to move to a Linux based alternative with dropping to root and hand editing ASCII files as the configuration system!
I got out of that company quick, but not before I had a call from one customer saying their red <-> black network 'firewall' (inter VLAN routing, plus a few ACLs, etc. IIRC) had stopped whilst in the middle of a crisis. I took great pleasure in providing the PHB's home phone number to the customer. On Monday morning I was hauled up in front of the PHB and the IT boss and the VP. I reminded them all of the email they had received from me predicting that the failure would happen soon after the first installation and be catastrophic. The VP blamed the IT boss and said that he shouldn't have to read such emails. The IT boss passed that straight down to the PHB who was in charge of development ("IT is not development") and he had nowhere to pass it down to, with me having written the email.
I received a mild reprimand from the VP for smirking, "just a little too much!"
Also a nice demonstration to PHBs why they should never make major changes to an existing system just because "I think it's a good idea" - every time I have heard those words, a catastrophe has always followed.
Saying that, I've had CYA strategies in place for decades now, and had to use them. One strategy is to keep an absolutely straight face during the inevitable blame-fest and shouting match.
On the bright side, if you broke it - you generally know how to fix it.
In which case you fix it, gather all the credit you can for doing so, and obfuscate on the cause of things breaking.
"We'll probably never know what happened, probably an intern, but I've got things all under control now."
There is so much truth to that comment. Thank you!
There is nothing quite like that feeling of ..... desperation.... when you realise that the new widget is busted, and it's *your* problem to fix it....... And you know that because of a recent upgrade, and because the developers are special, Everything Is Wrong And In A Different Place, and heck, even the *(&^(*&^ scripting language has been changed.......
"even the *(&^(*&^ scripting language has been changed"
I've been running an old MUD for years, just copying it from one server to another as I replaced them. (And doing the occasional bit of coding and improvements.) Moved it from one server to another, and suddenly the output to the player was total garbage. Huh. Ok, blow away object files and recompile. Same problem. Ok, recompile on previous server - works perfectly.
After much, MUCH digging, I finally found where the output-to-player code used strcpy... on overlapping source and destination addresses. Worked perfectly fine on one machine, but was horribly broken on another. Still don't know how that was possible. One quick copy to a temp address solved that.
One of my previous PHBs had the invaluable talent of always finding an explanation that did not blame anyone actually present in the room.
It was occasionally quite wonderful to watch his 'logic' unfold. It also meant, once you were aware of it, that everyone would turn up to any meeting he attended after a crisis.
If they really had a "red <-> black network 'firewall' ", then they should have failed their next accreditation review.
Automated data flows only ever go from black/low -> red/high
Any data export from high -> low must go through a two-person manual release process to ensure that the exported information is 'compatible' with the low system. We always used CD-R or DVD-R and shredded them 24 hours after use, the gap being enough time to ensure the data was backed up on the low side
Most people manually edit the zone files.
But also most do a simple check before pushing it out.
Remember most enterprises can control their zone files without too much help.
Zones tend to be relatively static. (How many companies are dropping 100's of servers a week on a regular basis? ) [Free clue. Proper planning will have DNS updated even before the servers hit the rack]
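That "simple check before pushing it out" can be as little as a short script. A sketch (the zone name and records are invented; named-checkzone is BIND's own zone parser, used when it happens to be installed):

```shell
#!/bin/sh
# Sanity-check a hand-edited zone file before pushing it out.
# The zone and its contents here are invented for the example.
file=db.example.com.tmp
cat > "$file" <<'EOF'
$TTL 86400
@    IN SOA ns1.example.com. hostmaster.example.com. (
         2021060101 ; serial
         3600 900 1209600 300 )
@    IN NS  ns1.example.com.
ns1  IN A   192.0.2.1
www  IN A   192.0.2.10
EOF

# Best check: BIND's own parser, when it's installed.
if command -v named-checkzone >/dev/null 2>&1; then
    named-checkzone example.com "$file"
fi

# Cheap fallbacks for classic fat-finger errors:
if grep -q '#' "$file"; then           # '#' typed where ';' was meant
    echo "ERROR: '#' is not a zone-file comment character" >&2
    exit 1
fi
opens=$(tr -cd '(' < "$file" | wc -c)  # unbalanced SOA parentheses
closes=$(tr -cd ')' < "$file" | wc -c)
if [ "$opens" -ne "$closes" ]; then
    echo "ERROR: unbalanced parentheses" >&2
    exit 1
fi
echo "OK: basic checks passed"
rm -f "$file"
```

The cheap checks only catch the classic typos, of course; the real win is letting the actual name server software parse the file before it goes live.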
'vi' is a powerful and simple editor that every linux/unix admin should know by heart.
Emacs? Very powerful, but could be problematic with a security hole, depending on how it was installed.
Mine's the one with a microfiche of Stallman's paper on Emacs from MIT, in the pocket. ;-)
I have always been fascinated that the DNS environment required multiple files to host really similar data: you have forward files and reverse files, and they have to match.
Reminds me of the famous line in 10 Things I Hate About You:
"Chastity: I know you can be underwhelmed, and you can be overwhelmed, but can you ever just be, like, whelmed?
Bianca: I think you can in Europe."
Forward to reverse DNS is not a one-to-one mapping, and the One Chosen reverse entry may not even have or need a forward one, if it's just some in-band documentation. It'd be a scaling nightmare if you started with one file that pointed x.com and y.com at 220.127.116.11 and set a reverse for 18.104.22.168 just as host73.examplehosting.com. Then any other names in y.com go in this file, and x.com, and all the other reverses for 1.2.3.*, and also any other reverses needed for other names under x.com and y.com, plus of course everything else related to examplehosting.com. You'd just end up with your nameserver taking this sprawling mess then internally creating and segregating each hierarchy. DNS doesn't require forward and reverse to match, though many services may place value on them corresponding.
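Laid out as zone fragments, that example looks like this: the two forward names live in their own zones, while the single chosen PTR lives in the reverse tree and matches neither of them.

```
; zone file for x.com
www.x.com.   IN A    22.214.171.124

; zone file for y.com
www.y.com.   IN A    126.96.36.199

; zone file for 3.2.1.in-addr.arpa (the one chosen PTR, matching neither)
4            IN PTR  host73.examplehosting.com.
```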
On one of my old customer boxes, bind's validator borked and the box was nearing EOL... I never fixed it. Just a shame the box was multiple domains' primary NS. A few nail-biting occasions where I realized just in time I'd syntax errored - I'm now exceptionally methodical with zonefiles. And everything gets bonus rsnapshotted daily!
A global ISP handling their domains for thousands of users with scripts and vi?
Sorry, but that's atrocious no matter what generation that was.
And the unforgivable bit: no backup or history? It sounds like they just jumped into a single live config and went poking for the solution. Not even a "restore from yesterday", even if that wouldn't have worked? Because even "Restore from last week? No? Restore from last month? That's working? Cool!" would have got them back up without that kind of manual intervention, and then they could go comparing files to actually see what changed.
If this was the 70's or something, and the tools simply not available, it's still not great. But the 90's?
As it seems to have been a Unix environment, the tools were there long ago.
Back in the 80s I used to create scripts that invoked various bits of SCCS (together with vi) to provide change control on pretty much every config file across umpteen systems.
With makefiles to look after testing, rebuilding etc. for the more complex ones.
Fiddly to set up, but life is calmer once it's done.
Bollocks! In the right hands and with the right tooling, scripts and the occasional bit of hand-editing is just fine for managing Big DNS. That's how it still gets done. The registrars and ISPs who manage zillions of domains use a back-end database and write their own SQL scripts or whatever to generate their zone files and name server configurations. Most TLD registries do this too.
You can arrange for those tools and scripts to perfectly fit the organisation's IT operations, processes and procedures - trouble ticketing, change control, backups, testing, support handling, etc. That isn't possible with bloatware enterprise DNS "solutions" and crudware IPAM systems. With these you change your processes to fit what these piles of shit offer. Which usually isn't much - apart from a glitzy UI which impresses the IT directors who will never have to use it.
Another problem with enterprise DNS and IPAM systems is you end up with someone who knows how to drive these heaps of cruft - if you're lucky - but knows fuck all about DNS. Or how to configure and troubleshoot a name server.
BTW, DNS wasn't around in the 1970s. And neither was vi or emacs. The 70s and early 80s Arpanet used the hosts.txt file.
"DNS wasn't around in the 1970s. And neither was vi or emacs. The 70s and early 80s Arpanet used the hosts.txt file."
Not as DNS, no (that was 1983) ... But there were (localish) implementations of name to numbers translation that mostly worked.
I think you'll find vi & EMACS were both born in the same year, that year being 1976. I remember swearing at Bill Joy in person over vi, and at rms via email over EMACS, both well before 1980.
The first copy of Jake Feinler's file that I received from Jon was called HOSTS.TXT ... yes, all caps. It was renamed to hosts a couple of years later. The Host Name Registry started in 1972.
"A global ISP handling their domains for thousands of users with scripts and vi?
Sorry, but that's atrocious no matter what generation that was."
One of the points of unix is to use scripts & amend config files to get stuff done.
I do prefer a fancy GUI, but mainly so I can see what other options are available when I'm trying to do something. Once I have my workflow I could just as easily do repeatable work via a script.
Restore from yesterday would not have been that helpful in this case for obvious reasons. Restore from 2 weeks ago might well have clobbered other good changes and introduced further problems. You should absolutely have backups but they would have been of limited help in this situation.
As stated by others. Checking config files into a source code control system might have been useful to identify changes. I'm still surprised how badly this is done today. Back in the 90s, I doubt many were doing it.
Source control for config files goes way way back. I was doing it in the 90's.
Just because git is only a youngster and it's the devops culture now, doesn't mean other solutions weren't around long ago. My favorite for the period was RCS (released in 1982), and I still to this day have some config management solutions written around RCS (exclusive locking for the win for system configuration files, not so great for source code).
SCCS was around earlier (1972), but not as widespread as RCS and later CVS got.
People just reinvent the same thing over and over again, shift it around, and call it the all new revolutionary way.
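For the curious, the RCS workflow being described is only a couple of commands: check a file in with `ci -l` (commit a revision and keep it locked for further editing), and review history with `rlog`. A sketch (file name and contents invented; guarded so it only runs where the RCS tools are installed):

```shell
#!/bin/sh
# Put a config file under RCS change control: commit a revision while
# retaining the exclusive lock, then show the revision history.
f=named.conf.demo
echo 'options { directory "/var/named"; };' > "$f"

if command -v ci >/dev/null 2>&1; then
    ci -l -t-"name server config" -m"initial import" "$f"
    rlog "$f"
else
    echo "RCS not installed; skipping demo" >&2
fi
rm -f "$f" "$f,v"
```

The exclusive lock is the point for config files: only one admin can have the live file checked out for editing at a time.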
Things were changing so fast back then that it would have been near impossible for a software based solution to keep up. Remember, small garage-based Mom&Pop ISPs were going from startup to "global" in a period of months. Business evolution was out-stripping the technology. Manually adding what was necessary worked (mostly), so that was how it was done. It wasn't right or wrong, it just was.
We've moved on. I'm not all that certain things are better now.
During the years I was programming in Coral 66 I had several comment problems.
'comment' comments must end with a semicolon;
But it was easy to omit that terminator, especially if you were used to other programming languages. The result was that the next statement was treated as a comment until its final semicolon. I.e. it was effectively absent. This could cause mysterious problems.
I had a pretty-printer program for Coral that I had written, and it was the pretty-printer which finally revealed the mistake: showing the statement as run-on text at the end of the unterminated comment.
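A sketch of the trap (Coral 66 from memory, so treat the syntax as approximate):

```
COMMENT set up the channel table
INITIALISE(TABLE);
PROCESS(TABLE);
```

With the terminating semicolon missing after the comment text, the comment runs on to the next semicolon, so INITIALISE(TABLE) is silently swallowed and never executed; PROCESS(TABLE) is the first statement the compiler actually sees.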
For a couple of years I was employed to go into C programming tutorials and say "have you tried putting a semicolon at the end of that line" at students when the Borland C compiler barfed with a "missing semicolon at line...." error because they'd missed out the semicolon. One week I covered for a mate that was doing ML tutorials, the first three students were really confused.
What with variously using C, fortran, R, python, bash, matlab, and a few others, I now have almost no reliable memory of any computer language syntax at all. Is it "for" or "do"? Where are the brackets? Are there brackets? Is it "in" or "="? Wait, what sort of brackets? Etc, etc.
Exaggerated for comic effect. Honest! :-)
The joys of statement terminators
*the classic bug is where you have "return<newline><somethingVerboseLikeAnObjectLiteral>;<newline>". It will return after the first newline. In the absence of proper function signatures, this will only be manifest if the calling code fails in consequence of the undefined value returned.
....... and results can be brutal.
Mindful of his own brush with a pink slip, Hugo upped the paranoia of the tool from merely warning of errors to issuing a full-on stop when validation failed.
Such considerations have moved on and into other fields of endeavour peddling and pimping/pumping and dumping content to/of the masses. And for probably pretty much very similar reasons .... maintenance of status quo conditions, no matter what the perilous state it may be in. But it is no fix, nor even an effective tool whenever wielded, for it then has invariably revealed a systemic weakness which the program/platform has scant defence against?
The following is a current example of the tool type/program gripe ......
We are unable to post your comment because you have been banned by Slugger O'Toole. Find out more.
I had a similar experience. I was looking at a config file, with a line I wanted to comment out temporarily, but not sure whether this format supports comments and if it does how to use them. My basic thought process went like this:
It looks like XML, so maybe I can just enclose the line in <!-- ... -->. That definitely makes the most sense. Except it doesn't start with an XML tag. Also, after a few levels of XML-style tags, it starts listing key value pairs without tags, so it's probably not XML. I'm also pretty sure that it takes multiple values separated by semicolons, so it can't be that. That only leaves #. Well, I think it uses semicolons for the multiple value separator but I can't check because this file doesn't have any sets of multiple values. In the end, I just copied the file and deleted the line. It was the fastest way to be sure.
Just to add to the hilarity, I recently did some work on code that handles Scala tuning and keymap files. The start of line comment character for that is a "!". Unusual enough to start with, but with the low res monitor I had to use at the time it looked like one of these "|"... much hilarity - Grrrr.
I remember a corporate edict that said all programs we ship must compile/assemble with return code 0. No warnings. There were all the usual moans about wasting time etc. We fixed all the warnings, and it was surprising the number of little/intermittent problems that just disappeared.
As one developer said "I was always suspicious of one bit of code, but never had time to fix it. I was glad when my manager came and told me to fix it"
I had a junior programmer whose take on compiler warnings was to disable them. Needless to say the quality and stability of his code was appalling.
It took me a while to have him fix all of his code, listening to all his (stupid) complaints like, for example, whether it matters or not if a variable was initialised before use as long as the code (sort of) works.
What was his name? Schrödinger?
(the variable remains both initialised and uninitialised until the value has been observed).
You were waaaaaaay too soft on him. Sloppy behaviour like that is selfish - you're only passing the problems on to others. I would have verbally, possibly even physically, kicked his foolish arse.
Code that emits warnings is, to borrow from John Cleese in Clockwise, a discourtesy to others.
When people build your code, they are going to see every warning it generates; and if they downloaded your program as opposed to writing one themselves, they probably are not going to be as aware as you are of which of those warnings are safe to ignore and which ones mean anything.
For instance, if you've got a variable that apparently is defined once and never accessed again, another programmer might reasonably suppose that has something to do with a build-time option they didn't enable; but then, why didn't you just move it inside the relevant #ifdef where it belonged?
Have you so little hubris, you don't even care about your build process looking ugly?
Some time ago (prior to the turn of the century) I was writing a lot of diagnostics for a product (using an SGI Indy - fun times)
One fine day, I checked out the latest branch to address a requested feature but when I tried to compile, I got many screens worth of errors.
Turns out that the previous checkout (by users unknown) had loaded the makefile into their editor which converted tabs to spaces.
In that version of make, tabs vs. spaces had meaning so I had to get an older version of the makefile and check it back in.
That made for an interesting morning.
> Turns out that the previous checkout (by users unknown) had loaded the makefile into their editor which converted tabs to spaces.
In a past life, I was working in a team where some people used tabs and some people used spaces[*]; one developer in particular made a big fuss about sticking with their preferred standard.
The repository manager dealt with this in a suitably elegant way; they wrote a script to convert between tabs and spaces, and set up the CMS to run this script whenever this developer pushed or pulled code.
Kept everyone happy :)
[*] I can't actually remember which side of the argument I was on. I think at the time, I was using tabs and vertically aligned braces; a decade or two later, I generally favor spaces and diagonally aligned braces...
If it's a Makefile, it's got to be tabs. Spaces for leading indentation just won't work, so it's easy to see which side to come down on.
Almost all modern editors support editorconfig either directly or with a plugin, which is a good way of expressing these rules in a repo wide context that will apply whatever editor the user prefers.
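For instance, a minimal .editorconfig that keeps Makefiles on tabs while the rest of a tree uses spaces could look like this (the indent size is just an example):

```
root = true

[*]
indent_style = space
indent_size = 4

[Makefile]
indent_style = tab
```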
Admit it, you've still got a running terminal window somewhere because you can't quite figure out how to close it.
(For the record, I prefer nano. It just works, though I refer to myself as a developer not a programmer so I'm not sure if I fall into the 'real programmer' camp or not)
It's worth learning basic use of the common options - unfortunately some editors that "just work" in /that/ way don't offer robust crash recovery (TBF I can drive nano if I have to and I'm not aware of scare stories with it, but there are definitely others).
I see http://heather.cs.ucdavis.edu/~matloff/UnixAndC/Editors/ViIntro.html recommends 'ZZ' here, which I was warned isn't good to teach newbies as it invites ^Z^Z by mistake, which risks giving the impression of having worked but without having actually exited *or* saved. Chaos also ensues :/
...I was pointed at a 'learn' tutorial and produced a "concepts and controls" crib sheet for friends at least once. Top tip probably "don't think of Esc as 'enters command mode', think 'Esc ends an edit'" ["How do you know you're in a lab full of unix newbies?" *beep* *beep* *beep*]
> Top tip probably "don't think of Esc as 'enters command mode', think 'Esc ends an edit'"
It was a few years after I learned how to use it that I saw someone's comment about Vi, to the effect that you don't edit text with Vi, but instead send a command to Vi, and it does the editing for you.
E.g. "delete three words", "copy the current line", or "insert the following text at the cursor: once upon a time in a galaxy far far away...".
I do wish someone had said that to me at the time when I first encountered The One True Editor, as it would have made things a lot easier to understand!
(Though modern Vim on Linux is a lot more user-friendly than Vi on SunOS/Irix was back in the day, and it does act more like a realtime editor!)
The first time I encountered vim I found it was lying to me which I thought the exact opposite of user friendly. It had been set up to hide Microsoft line ending ^Ms. I'd just fired it up specifically to edit those out of a file. I've never trusted it since.
> If it's driving you too batty, :set noerrorbells.
Useful that you can, but I like the error bell (as opposed to the visual equivalent I saw some people favour).
In a lab full of newbies you can't turn off the other people's beeps, of course. I did find out you could change the pitch, and did so to make my warnings (from whatever source) stand out to me that way.
"In a lab full of newbies you can't turn off the other people's beeps, of course."
The hell I can't. I'm the teacher, and I have root.
The easiest way is to add the command to the global rc file.
If you don't want to do that, you can make a default framework for the students' ~/rc that reflects the command (you are the teacher, so having this set up before they get there is a no-brainer).
Note that the name of the rc file varies with the version of vi that you are running. But you are the teacher, so you already know this, right? If you don't, try reading TFM page. It's part of your job description to know these things.
Suggestion: Do this on the second or third day of keyboard time, after allowing them an hour or two of beeping, which helps train the fingers. Tell the students what you did, why you did it, and how they can change it back for their personal login, should they wish.
> For the record, I prefer nano. It just works
BURN THE HERETIC!!!
TBF, I used Nano (or more precisely, Pico, back when it was still the built-in editor for Pine) back when I was but a fledgling. After all, it's pretty much the *nix equivalent of Windows Notepad.
But when I got my head around Vi[*], I never looked back!
[*] Admittedly, I did this by printing off a page of commands and spending a day forcing myself to use it; the air turned blue and the Esc key on my keyboard got hammered into near oblivion as a result. But once it clicked...
Pine Is Not Elm
I'd been using a Unix box which had the Rand editor but no vi before I found myself in my first real IT job. At the end of the first day I rushed off to Dillons. I've still got the book in my bookshelves but it hasn't been opened in the last couple of decades or more.
Ahh, e. These discussion brought me back to IBM, circa 2003. 600Mb log file, I needed to do a search. At the time, I was of the view that emacs was cool. OM. Okay, vi. OM. ??? ed. OM. ... Wait, wasn't there some editor called, "e"? Yep. And it was the only one to only page in what it needed of the document being viewed, apparently.
Silent minus-to-emdash translation by $EMAILAPP recently got us when emailing code snippets.
The option to mail in plain text is buried deep in the menu system (and seems to change location over the [many, frequent, likely pointless] updates) and it has to be off by default because it messes up the corporate email chains.
>  Yes, was. It's been redone from scratch since then. Still the same project, though.
I've replaced the broom handle twice and the bristles a dozen times, but it's still my broom...
(All respect to those who maintain and improve things like this though - it's a fantastic effort!)
As a naive young System Administrator in the 19somethings I once made a minor, I mean really minor, soo trivial, could hardly see it, honestly it was a minuscule not quite correct edit of a line on my work Unix system.
The result was the server (including the then 'Yellow Pages' master server) would not boot. So I had to edit a line using Edln in real time while it was being overwritten by the system. I had to set Edln to the line preceding the one I needed to edit, and could not see the edit as it happened. I just had to get it right.
That was a tad stressful for a Sunday.
Good old Ed.
I once used vi, but we had Sun workstations so didn't really need it.
Showing my age but reminds me of the days when getting new games for the BBC Micro meant a trip to John Menzies, the purchase of a magazine, hours of typing and many more hours of swearing and using a magnifying glass to confirm if it was a comma or a full-stop in the line that causes the game to crash.
Our first home computer was a Sinclair ZX81, which had tokenised input, meaning that each keyword was a secondary function of a single key. I vividly remember a program not working, in spite of looking identical to what was in the magazine. It turned out that I had entered "<>" (not equals) with two tokens, having not noticed it was available as a single one.
The worst part? It was my mum who figured it out! The embarrassment!
I think that one of the biggest Clues that an experienced coder can give to a neophyte is "Ask for help proofreading your own code!" ... Most people see what they intended to write, not what they actually wrote.
As a side-note, back in the day my Wife, who is not a coder, not by any stretch, could often spot syntax errors in my code because it looked "out of balance" (her words).
Hours of typing? I can go one better. In NZ when I were a lad, early on a Sunday morning they used to broadcast a program over AM radio. Hit Play+Record when the silence started, put it in the cassette drive once complete, and watch the parity errors shred your dreams... er, enjoy the free software. I had an MSX machine and we only got favoured every couple of months; normally it was Speccys, C64s or BBCs.
I had a friend with one of those, I was always a bit jealous of it. Mind you, I started out on a hand-me-down zx81, obtained when some relatives upgraded to a BBC B; my eventual setup was a B with a 6502 co-processor. I don't recall whether I ever tried the radio recording thing, although it does sound familiar.
I tried firing up the beeb a few years ago for old times sake ... and then "Bang!" went a capacitor.
Spectravideo 328 it was. Great BASIC interpreter, but no games. Every time I came back from my mates place with his C64 I'd sit down and try and write them myself. I turned out a reasonable Summer Games as I recall, reasonable enough to break a few joysticks on the 100m sprint. 30 years on and am still coding daily - I suspect I have the lack-of-games to thank for that!
Have had some brushes with typos in text files that are automagically imported.
Though not the vital service killing kind.
Once upon a time (early noughties) I was working on / inherited a web service to present financial data. This data was entered using a DOS "application", exported to a CSV-like file using | as the only delimiter, passed on to my system over the internet, and then imported into my system. The closest thing this had to any kind of validation was the db schema.
Every now and again, someone would mistype 1 as |. And this would not be overly visible in the DOS application. Export would not be troubled by this at all, and the file transfer would naturally not notice. The import would fail with a note to the logs, and if I didn't notice the logs, the client would complain about stale data after a couple of days.
When the logs were showing an error at line 3'254 I could easily use a text editor and fix it easily enough. However, when it got to complaining about line 325'400, I learned to love sed.
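A check that would have caught the "1 typed as |" problem before import is simply counting delimiters per line. A sketch (file name, sample data and the expected field count are all invented):

```shell
#!/bin/sh
# Flag lines of a pipe-delimited feed whose field count is wrong,
# e.g. where a '1' was typed as '|'. Expecting 5 fields per line here;
# on line 2 "1250.00" was mistyped as "|250.00", giving 6 fields.
cat > feed.txt <<'EOF'
ACME|2001-03-05|1250.00|GBP|settled
ACME|2001-03-06||250.00|GBP|settled
EOF

awk -F'|' 'NF != 5 { printf "line %d: %d fields (expected 5)\n", NR, NF }' feed.txt
# prints: line 2: 6 fields (expected 5)
rm -f feed.txt
```

Where a stray empty field is otherwise impossible in the data, a matching sed one-liner (something like s/||/|1/) can then repair the known pattern in bulk; with legitimate empty fields you'd need the surrounding context in the pattern too.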
I once worked on a system which originally didn't report the error line when an import failed. So one of my colleagues added an error handler to take a file which failed to load and do a binary chop to load smaller and smaller chunks of it to eventually isolate the line with the problem.
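That binary chop is simple to script. A sketch, assuming the failure is deterministic and caused by a single bad line (the "loader" here is simulated by an awk field-count check; names and data are invented):

```shell
#!/bin/sh
# Find the first line whose inclusion makes a load fail, by bisection:
# O(log n) trial loads instead of loading the file line by line.

loads_ok() {    # do the first $1 lines of file $2 load cleanly?
    head -n "$1" "$2" | awk -F'|' 'NF != 3 { exit 1 }'
}

find_bad_line() {
    file=$1
    lo=1
    hi=$(wc -l < "$file")
    while [ "$lo" -lt "$hi" ]; do
        mid=$(( (lo + hi) / 2 ))
        if loads_ok "$mid" "$file"; then
            lo=$((mid + 1))     # bad line is after mid
        else
            hi=$mid             # bad line is at or before mid
        fi
    done
    echo "$lo"
}

# Example: line 3 below has an extra '|' (4 fields instead of 3).
printf 'a|b|c\nd|e|f\ng||h|i\nj|k|l\n' > sample.txt
find_bad_line sample.txt    # prints 3
rm -f sample.txt
```

If the file has no bad line at all, this sketch would report the last line, so a final single-line check is worth adding in real use.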
I think that may be a bit exaggerated. If there was no automation, that means that there was a lot of work and probably not enough people to do it, ergo the ones doing the work were stressed and had many things on their plate at the same time. On top of that, there was apparently more than one person with authority to change critical files - a sure way to accumulate big problem-making errors.
It's easy to make mistakes in an environment like that.
Yep, Perl has Net::DNS. Allow your admin workstation IPs to do dynamic DNS registrations in bind, then use Net::DNS::Update to do 99% of what you need. Which is typically just adding or deleting the same old handful of RR types.
Since you're sending/receiving actual DNS packets on the protocol layer, you don't get syntax problems in your hosts file - a "bad" record simply won't get committed. If it's successful, you get a reply packet you can (should) check for. The zone gets incremented etc just like any other dynamic DNS registration.
Obviously you can write script wrappers to do your typical things, like do a bit of syntax-checking of the input. Or create a PTR every time you create an A record in [zone], as long as a PTR doesn't already exist. Or, if there is a PTR, display what's there, prompt for a change, etc etc. If you have a split-brain zone or multiple name servers, a script can update them all at the same time.
The rest of Net::DNS will do general DNS stuff like various RR queries (with a baked-in method of sorting multiple results if you choose, sensibly according to RR type or a custom sort), zone increment, zone transfer, and so on.
Saved my bacon from the late 90s up until less than a decade ago, when I stopped working with bind, across multiple orgs and countries. I think I directly edited a zone file less than half a dozen times (after configuring the environment so I could use my scripts) during that time.
One place in particular could take days to turn around DNS updates (very complex environment) and had errors on at least a monthly basis - with decent scripts, all those problems went away. Certainly no more "fat finger" incidents corrupting the zone files.
Incorrect records being added/deleted, well, you can't entirely fix the GIGO problem - but throwing errors if some idiot tried to enter an IP for an invalid range was certainly helpful.
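The same pattern works without Perl too: BIND ships an nsupdate utility that speaks the dynamic update protocol, so a wrapper only has to emit its little command language. A sketch (server, zone and host names are invented; this version just prints the update script rather than sending it):

```shell
#!/bin/sh
# Emit an nsupdate script that adds an A record together with its PTR,
# mirroring the "create a PTR every time you create an A record" wrapper.
add_host() {
    name=$1; ip=$2    # e.g. host73.example.com 192.0.2.73
    rev=$(echo "$ip" | awk -F. '{ print $4"."$3"."$2"."$1".in-addr.arpa" }')
    cat <<EOF
server ns1.example.com
update add $name. 3600 IN A $ip
update add $rev. 3600 IN PTR $name.
send
EOF
}

add_host host73.example.com 192.0.2.73
# Pipe the output into 'nsupdate -k key.file' to actually commit it;
# a malformed update is rejected by the server instead of corrupting
# the zone file on disk.
```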
The job-destroying typo was a simple rushed-through edit in Notepad.
Then loaded the robot with the resulting code.
And hit the go button thinking it was all good
2.3 seconds later it punched the arm through the splash window
Damned Xs and Ys all look the same to me (typed in X+250 instead of Y+250)
Many years ago the network team where I worked had a database of assigned IP addresses and hostnames. So it was a simple matter to write scripts to turn those into DNS database files (aliases were done by hand, but changed much less frequently). The script was run when required, showed the changes and asked for permission to put them into practice.
It clearly worked too well. No problems for a year and then one day half the data was missing!
Turned out that one site's data file was missing, which led to a lot of changes, but most had scrolled off the top of the user's screen, so they just let it go, as it had always worked in the past.
The script was quickly changed to display the number of changes and require an explicit unusual input if there were more than ~50 changes. The missing data was restored even more quickly.
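That kind of safety catch is only a few lines. A sketch (the threshold, file names and contents are all invented for the example):

```shell
#!/bin/sh
# Count how many lines differ between the current and regenerated files,
# and refuse to proceed automatically when the change looks suspiciously
# large, in the spirit of the fix described above.

guard_push() {    # usage: guard_push <current> <generated> <threshold>
    # '|| true' tolerates grep's non-zero exit when nothing changed
    changes=$(diff "$1" "$2" | grep -c '^[<>]' || true)
    echo "$changes changed line(s)"
    if [ "$changes" -gt "$3" ]; then
        echo "Refusing to push automatically; confirm by hand" >&2
        return 1
    fi
}

# Example: one edited line (diff reports it as one removal plus one
# addition), comfortably under a threshold of 50, so the push proceeds.
printf 'a\nb\nc\n' > zone.current
printf 'a\nB\nc\n' > zone.generated
guard_push zone.current zone.generated 50 && echo "pushed"
rm -f zone.current zone.generated
```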
Oh yes, I have seen this many times before! During a critical update that had been delegated to someone in the 2nd line support team, they had updated the DNS zone file and accidentally added a hash instead of a semicolon. Cue the phone call to myself at silly o'clock advising the deployment is not working and they are not sure what the problem is.
Eventually, I awoke and identified that there was a hash on the prior record, which they had wanted to comment out. However, in this case there were also hidden characters, as it had been amended via Windows, so I also had to run dos2unix on the file and then re-update it. I don't miss being woken up at silly o'clock for these types of things :D!
Some years ago I worked with a telco that was rolling out a single rack unit that would replace a whole cabinet full of modems. This was specifically aimed at the large ISP's who had rooms full of racks, each one stacked with 9600 baud modems.
As we were talking to them a network engineer came in to the room and calmly powered off a whole cabinet, waited 30 seconds and powered it back on. When we innocently asked what he was up to, fearing our kit would have a hissy fit if he did that to it, he explained. One of the modems had got stuck and wasn't connecting properly. It was too hard to figure out which one it was, so he narrowed it down to the cabinet and simply power cycled all of them.
And all the other people connected? Don't they get disconnected? The reply was priceless: "Well, what do they expect for just a tenner a month?"
Biting the hand that feeds IT © 1998–2021