
Recurse!
If the drafter is just a smaller model, could they have an even smaller drafter for the main drafter?
It's turtles all the way down...
411 publicly visible posts • joined 8 Jul 2009
What the article said is that the communication is only between the controlling locomotive and the FRED. The FRED is only on the last car of the train. So, only that one radio link needs to be updated. What I don't know is how the FRED actually controls all of the freight cars ahead of it. My guess is that it's the same way as the brakeman in the caboose/brake-van did it - open the valve on the air line, letting all the air spew out. With no air pressure, all the brakes in the train will activate - eventually.
My understanding of this comes from being a model railroad fan, and so watching lots of YouTube videos. At least here in North America big trucks are the same setup - let the air out and the brakes all activate. You have to connect the air hoses before the brakes on a trailer will release.
The days of manual brake wheels on top of boxcars, etc. are long gone.
Good progress. Even better would be more progress.
Also, I've thought for years that direct thermal power generation would be a godsend. It already exists, but not for practical use. The article here mentions that the "CHESS" stuff works for it too. Here's hoping that that aspect can be made practical.
I like the step counter and the heart-rate monitor. But you can get step counters right on your phone, I think. So, I'd say it is mostly psychological - having a thing on your wrist that you know is tracking stuff could be a subconscious prod to *do* something.
Also, you might be a weirdo who likes to change the watch-face every day.... :-)
I've stuck with the main face on my current watch, since it has the most info right there, plus a tap to bring up the phone call stuff, and one to bring up heart-rate monitoring right away. Darn thing often gets the outdoor temperature wrong though - that's the fault of the app on the phone, however. Oh yes, they all have different control apps. :-(
The things are cheap enough on Amazon that you can afford to do just that. You'll have to fiddle with your phone settings to switch to a different watch, however. Just go with some generic Chinese brand - you can get smartwatches for as little as $15, and the quality seems to be pretty good. Just think about your requirements - for example, do you require it to be able to answer/make phone calls through your smartphone? The $15 one doesn't have a mic/speaker so can't. Slightly higher priced ones can. All do heart-rate tracking, step counting, blood oxygen level. All of the ones I've encountered have far more exercise/fitness stuff than anyone needs. The $15 one tries to do blood pressure monitoring, but I think it was reading high for me.
Ah yes. Long, long ago I took a somewhat-longer-than-usual Christmas vacation with my parents. They of course had a windows computer, not a proper Linux computer. My email then was on my own domain on my home computer (gee, just like now!), and I didn't attempt to allow email access remotely (POP/IMAP). So, I exported my usual session over X. I put a Linux distribution on a USB stick and booted the parents' machine from that. Slow, sure, but access to the X-session at home was all I needed it for. Made my stay much pleasanter. And yes, I only did it a couple of hours a day.
Where do I upvote the article? :-)
A bit off-topic, but triggered by Liam's discussion of menus, etc.
Gnome has no title-bars? Yikes! No wonder I expunged it long ago.
I'm a person with a horrible memory for raw facts. The main reason I like menus (of any kind) is that they LIST OPTIONS. That can readily remind me what the names of the tools I need are. I'm afraid this is especially useful on Linux where geek programmers invent weird and hard-to-remember names for their programs/tools.
I spend the vast majority of non-browser computer time with a shell window and an emacs window. The "search tools" that various systems have are often useless to me since I don't remember the name of the tool I recall using - and I also might not recall the proper search terms to find the tool by function.
I've used Firefox for a long time. Something different back in my Amiga days, but I think that's about it. I'm not a heavy browser user, although it is open for a good percent of the day. The one significant plugin I use is "NoScript". I block things like tags.google.com, and anything from a social media site. Google searching works fine, as does YouTube - I even have a little-used channel there. I normally have only one tab open (two now - one for the comments, one for main El Reg) - speed has never been a problem for me.
I hit the website incompatibility issue with the government's healthcare portal which I needed to use. That caused me to buy a Win-11 laptop so I could access it. Also need the damn Teams viewer tool occasionally.
Thanks for the link to your older article, Liam. I saw the x86-64-v3 and had no idea what it meant. A Google search was not enlightening. The lscpu output on my old system includes
CPU family: 21
Model: 1
Model name: AMD FX(tm)-8150 Eight-Core Processor
Stepping: 2
none of which is a "version". I did push harder on Google search and it popped up an AI reply that said the V3 comes from the Ryzen 3rd generation cores. So, my perfectly good computer is not new enough to run these Linux versions - FX family is too old.
I'm with you on that! I was in the middle of software development when they went under - working on an A3000. I thought about it a bit and then went and bought an A4000T. Worked with that for several years before getting a PC and Red Hat Linux - I *really* didn't want to have to change at that point, and I was employed, so could sort of afford it.
OK, you've prodded me enough.
The keyboard I'm using now says "Logitech K200" and it has worked fine for me for several years. The main problem is that some of the legend paint has worn off. I'm a touch typist (staring at the screen now), but sometimes the XCV bunch gets me.
I bought a mechanical keyboard a few years back at a local computer-ish place. MSI CK S??? (argh - the label is badly damaged). It was 5% off since it was opened, but looks fine. Says it has Cherry Red switches. I'll try it later today. If it fries my computer it's your fault!
This would be perfect for me if I didn't have real work to do! Before Linux I was heavily into Amigas (1000, 2000/2500, 3000, 4000T) for both games (not seriously - I'm terrible at all of them) and development. I settled on Mate after Gnome went bonkers because I want just one tool, etc. bar that I can put on the right side of my portrait orientation monitor. And, it has everything via menus, so I don't have to attempt to remember function keys, swipes, etc. A release or two back on my current system I had UAE up and running so that I could re-play my AmigaMUD and Explore/Amelon games. Pathetic, I know.
I wish!
I've been a Linux user since way back to the original Red Hat. The local government has medical information online now, and I wanted to be able to access it. After going back and forth with their tech support, it was determined that their site does not support browsing from Linux. I could probably find out how (if it's still possible) to make my browser fake the identification string. Instead, I bought a fairly cheap laptop with Windows 11 (I can't count how many times I yelled at it for popping up stuff that wanted me to spend more money! Or just didn't seem to want to work reasonably (by my standards)). Also I hate the chicklet keyboard. The only good thing about it is that after I put WSL2 on it, I have another test bed for my compiler. :-)
Way kewl! Brings back memories of playing with old Teletype machines, etc.
That Flexowriter is much smaller than the big desk teletype we got from a seller (government surplus??). That one came on its own steel desk that you could park a car on. The solenoid for the paper tape punch could probably have punched a hole through your hand if you could get it in the wrong place! The reader was a whole separate unit that you could slide out.
My friend spent hours tracing the circuits in the beast.
The mechanicals in those old Baudot (sp?) machines were magical to watch.
I for one quite like the concept.
The current stuff is way too complex for me - with my poor memory for pure facts, I have a hard time finding things not my own, or knowing how to do stuff that folks like Liam can instantly access. For example, when an update to Apache comes along that changes the config files, I refer to a piece of paper beside my monitor that says "systemctl stop apache2" to stop the server so I can then go undo any config file changes I don't like (I run an absolutely minimum configuration). Similar for the mail server.
Having fewer directories, with meaningful names, could make things a lot easier. Do they also do it for things like the maze of C header files? As a programmer, I curse every time I have to actually find an active definition for something in, say, the "bits" universe. I sort-of understand how the arrangement came to be, but I certainly don't like it.
I first hit UNIX systems on a PDP-11, then SunOS, so I do know the basic structure, but I'm waaay out of date, and always will be.
Looks like pretty good code to me. I have familiarity with 8080 code from decades ago, and that helps. Figuring out the meaning of stuff in the assembler, and twisting my brain to octal (been a hex guy for decades) takes some effort. Lots of comments in the code, which is good - way too many "programmers" nowadays either don't comment, or have terrible comments. I haven't spent more than a few minutes, but it looks like the whole thing uses lots of lookups in tables. Uses CPU status flags as character classification - nice!
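The table-driven style described above can be sketched in C. This is just an illustration of the technique (hypothetical table and names, not the actual 8080 assembler code): classification becomes a single indexed load instead of a chain of comparisons.

```c
#include <stdint.h>

/* Hypothetical character classes, in the spirit of the table-driven
   code described above. */
enum cclass { CC_OTHER = 0, CC_DIGIT, CC_UPPER, CC_LOWER, CC_SPACE };

static uint8_t cc_table[256];

/* Build the table once; thereafter every classification is one lookup. */
void cc_init(void)
{
    for (int c = '0'; c <= '9'; c++) cc_table[c] = CC_DIGIT;
    for (int c = 'A'; c <= 'Z'; c++) cc_table[c] = CC_UPPER;
    for (int c = 'a'; c <= 'z'; c++) cc_table[c] = CC_LOWER;
    cc_table[' '] = cc_table['\t'] = cc_table['\n'] = CC_SPACE;
}

int cc_classify(unsigned char c) { return cc_table[c]; }
```

On an 8080 the equivalent is even tighter - the table entry can land in a register and set the CPU flags directly, which is presumably the trick admired above.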
As a computer *user*, I'd say that desktops, GUI's etc. *should* be boring. They are tools that should help me get my work done. One of the big things there is that they should be predictable, and not depend on precise muscle memory for easy use.
That is of course the complete opposite of modern Windows and Google-branded Android (haven't played with a Samsung Android, but I do recall that the older versions were quite usable).
This article does indeed bring back some memories!
I don't think any of the UofA lab PDP's had floppy drives. That top unit is a fancier version of what I had on my first/second home machines - dual 8-inch floppies. Clunk! Whirrr! Clunk!
I *think* the only DEC machines I used were a PDP-11/45, a PDP-11/60, a PDP-9 and a fairly large VAX setup.
The predecessor of Draco was developed on the 11/60, until I found that the beast wouldn't execute some instructions properly (it did not like to be treated as a stack machine!), so I was let onto the 11/45.
I sympathize with the C/C++ folks, I really do. Even though I've designed and implemented a couple of replacement languages (Draco and Zed). If someone is really good at something, they are likely happy with that. Any change is going to take away from that. Those at the very top of the game will be impacted the most. Those just learning may not much care what language they use. If you know the syntax, semantics and rules of several languages, you probably aren't *really* good with any of them. "Jack of all trades, master of none." The human brain can only do so much.
So, forcing C/C++ wizards to change to another language is going to cause them a lot of pain, and reduce their productivity. So, adding rules, etc. to the language they are a wizard in is perhaps less painful for them. BUT. It *must* be enforced. Expediency or a bit of efficiency cannot be used as an excuse to violate the new rules. And yes, management must be in agreement on this, because it will affect their plans and schedules, and *those* cannot be allowed to force rule violation.
If someone comes up with a new language that can solve the problems, I believe it should not attempt to look too much like C/C++. The reason is that the similarity can trick the experienced programmers into doing things that the new rules will prevent. => frustration. But, I believe the "style" of the new language syntax should be similar to what the programmers are used to. Changing the style of function headers, declarations, etc. shouldn't be done just because you can. There must be real advantages. For example, I really don't like the "." cascades that Rust seems to encourage - the code is ugly to my eye.
Most folks know that C/C++ declarations are awful. The "cdecl" program that translates between C syntax and English was written long ago, so the need isn't a new one. Swapping things like which side the "*" goes on makes a big difference - far fewer parentheses needed.
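The sort of thing cdecl exists to untangle can be shown in a few lines of valid C (illustrative names only). The two declarations differ only in where the parentheses sit, yet mean opposite things:

```c
#include <stddef.h>

int *ap[3];            /* ap: array of 3 pointers to int      */
int arr3[3];
int (*pa)[3] = &arr3;  /* pa: pointer to an array of 3 ints   */

/* cdecl would read a worse one, char *(*fp)(int), as:
   "fp is a pointer to a function taking int and returning
   pointer to char". */
char *(*fp)(int);

size_t ap_elems(void) { return sizeof ap / sizeof ap[0]; }     /* 3 pointers */
size_t pa_elems(void) { return sizeof *pa / sizeof (*pa)[0]; } /* 3 ints     */
```

A postfix "*" with declaration order reversed from usage order, as suggested above, would remove exactly this ambiguity.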
Nice picture, except...
Bad choice for a landing spot. A couple meters further "back" and the landing could have failed, or the thing could have tipped over.
Looks like they put in blast marks for the landing engines, but I don't see any footprints - how did the astronauts get there?
My experience here is from long ago, but...
It may matter as to how long the polling takes. If control registers/whatever have to be used to enable access to whatever can report that work is ready, you have to include that cost (plus the cost of putting the system back to normal operating state, even if that ends up being done in a quite different context) in the cost of polling.
In a very low-end old-style system, you could have a UART/whatever that makes its status available to the CPU with one access (likely a longer access than to RAM, but still). Polling will be cheap.
On the other end of the scale, perhaps your device is smart enough that you can give it a bunch of memory resources that it can fill up without needing the host CPU to do anything other than handle whatever is ready, when asked. Streaming right along with few needed interrupts and little overhead. I thought ethernet chips started doing this decades ago. What am I missing?
Stuff in the middle might need more host attention, so the tradeoffs will differ.
Given the reported improvements, clearly I'm missing something.
My direct experience was made a bit simpler by having a CPU with only one core, so full/empty flags in memory were enough. Incoming messages went directly into buffers, all within one interrupt (the hardware wasn't quite smart enough to properly clear the "empty" flag, if I recall correctly). The further-processing thread could take buffers that the interrupt code had marked ready, and could return buffers to that code similarly, all without anything waiting. Only when there was no more work to do in the in-memory buffers from the interrupt code would the thread go to sleep (after setting the "wake me up" flag of course!) Another variant for different hardware had a Linux device driver and my code in our hardware use similar conventions between them. These are essentially combinations of interrupts and polling, which is of course what you likely want.
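The flag-per-buffer handoff described above can be sketched in C. This is a simplified single-producer/single-consumer ring (hypothetical names; on the one-core machine described, plain flags suffice - on modern SMP you would need atomics and barriers):

```c
#include <stdbool.h>

#define NBUF  4
#define BUFSZ 64

/* Each slot carries its own "ready" flag, so neither side ever
   blocks the other. */
struct slot {
    volatile bool ready;   /* set by producer, cleared by consumer */
    char data[BUFSZ];
    int len;
};

static struct slot ring[NBUF];
static int prod_i, cons_i;

/* Producer side (e.g. the interrupt handler): claim next slot if free.
   len is assumed <= BUFSZ. */
bool produce(const char *msg, int len)
{
    struct slot *s = &ring[prod_i];
    if (s->ready) return false;          /* ring full: drop or retry */
    for (int i = 0; i < len; i++) s->data[i] = msg[i];
    s->len = len;
    s->ready = true;                     /* publish last */
    prod_i = (prod_i + 1) % NBUF;
    return true;
}

/* Consumer side (the worker thread): take next ready slot, if any. */
int consume(char *out)
{
    struct slot *s = &ring[cons_i];
    if (!s->ready) return -1;            /* nothing ready: time to sleep */
    int len = s->len;
    for (int i = 0; i < len; i++) out[i] = s->data[i];
    s->ready = false;                    /* return slot to producer */
    cons_i = (cons_i + 1) % NBUF;
    return len;
}
```

The -1 return is where the "set the wake-me-up flag, then sleep" step from the comment above would go.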
Well, the orange-haired one has done lots of things to move the USA towards fascism, and I think I've seen comments on El Reg expressing that opinion, so it must be true. A fascist state wants control of everything. It can't control Linux, so Linux must be banned. Facebook is only the first...
A lot of those names/events are ringing bells for me. One of the members of the Algol68 committee, Barry Mailloux, spent time at the University of Alberta. *My* first programming language was Algol-W on an IBM mainframe. I wrote my first compiler in it.
Two guys from Saskatchewan wrote an Algol68 compiler in IBM 360 assembler. Large card decks - I think I saw it once. Their product went by a couple of names, one was FLACC (Full Language Algol68 Checkout Compiler). After that wound down somewhat, they moved into parallel computers - they were two of the founders of Myrias Research Corporation, which built and sold parallel computer systems based on first the Motorola 68000, then the 68020, then the 68040. Customers included various government organizations, including in the US.
There was a grad student there who wrote a PL/360 compiler, and I'm pretty sure he invented the concept. Name James Heifetz (sp?). The idea was that it was similar to PL/1, but simpler, and aimed at producing low-level (only IBM 360?) code.
The complexity of the Algol68 Report was beyond me. Heavy use of a W-Grammar. It came out as one issue of SIGPLAN Notices. My copy was loaned out and lost long ago. But, being in that environment, you can see why I put some reversed keywords into Draco, my AmigaMUD language and now Zed!
Amusing personal note: At one point I moved to a condo in a building overlooking the river. I then found out that one of my immediate next-door neighbours was Prof. Mailloux's widow. Got to know her somewhat in the years I was there.
You would need a lot of previous declarations to understand that. What is happening is that Algol68 allows spaces in identifiers. That includes named infix operators. I *think* the capitalization indicates operators. A few of those early languages made case significant.
Yeesh. I don't remember enough of the base words to figure that out...
I did a bit of programming in Forth, way back. The lack of an easy way to store the program is what killed it for me.
I had read somewhere (USENET?) that the inner kernel of Forth could be one instruction on a DEC PDP-11. So, I gave it a try. You can, but you are better off not doing that since it makes it too hard to access stuff on the stack (or something like that - it's been a long time).
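For anyone who hasn't met a Forth inner interpreter: the "kernel" really is tiny. Here's a token-threaded sketch in C (hypothetical opcodes - the PDP-11 one-instruction trick uses indirect threading through registers, which C can't express directly):

```c
#include <stddef.h>

enum op { OP_PUSH, OP_ADD, OP_MUL, OP_HALT };

/* The entire inner kernel is the fetch-dispatch loop: fetch the next
   token, jump to its implementation, repeat. */
long run(const long *code)
{
    long stack[32];
    size_t sp = 0;

    for (;;) {
        switch ((enum op)*code++) {
        case OP_PUSH: stack[sp++] = *code++; break;
        case OP_ADD:  sp--; stack[sp-1] += stack[sp]; break;
        case OP_MUL:  sp--; stack[sp-1] *= stack[sp]; break;
        case OP_HALT: return stack[sp-1];
        }
    }
}
```

A program is just a flat array of tokens, e.g. push 2, push 3, add, push 4, mul - the stack-machine style that made the PDP-11 (mostly) such a good fit.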
I'd have to Google or something to know what "Jazz Woodbine" is.... :-)
The concept of being able to execute expressions, etc. right at the usual command prompt is indeed useful. It's one of the first things I put into my AmigaMUD system (late 80's?). In systems that allow significant syntax for non-expression command lines you usually want a prefix character to select expression-evaluation mode to make it unambiguous. Argghh - my old brain can't remember, but there was an early system that used a close parenthesis (')') at the beginning of a line as such an escape.
Are files really an abstraction? They are usually implemented via a name lookup table (file names) pointing at a description of the storage that contain the file's data. Given that an externally visible representation of that storage could get real ugly, and almost no-one would want to type it directly, I don't see this as an abstraction situation. It's more of a "We need names for such things!".
Also, the generalization from just file names to full file paths (and the concept of current working directory in a hierarchy) quickly became mandatory - very quickly the ability to organize your names (and re-use names in different contexts) made things much easier, even though the concept of a name hierarchy added complexity. The lack of a file hierarchy on early systems like CP/M worked when all you had was floppy disks with little capacity. Even an early 5MB hard drive got annoying - you had to remember the (numeric!) USER groups to go beyond the one set of names. And, given the filename length limitations, creating your own personal hierarchy within the names you used wouldn't be very satisfactory.
As for using numbers of varying lengths as a kind of hierarchy, I know *I* would have had to have a piece of paper beside the teletype (hey, we are going back as far as we can, but I'll ignore front-panel input switches on computers) telling me what the various numbers were supposed to mean. If I was paranoid enough, there would have been copies of that in several places since it would have been crucial to the use of the computer. Nope, give me file names any day!
You on pot when writing this article Liam?
Historical stuff is interesting, but the part about line numbers and no files is bonkers.
Actually, I think this is an issue that comes up more often than folks think. Some people, like Liam, appear to have excellent memories for bare facts. He's good with administering tons of Linux variants (lots and lots of weird command names and syntaxes). Now he wants folks to remember functionality by number! Maybe Liam can do that. Lots of folks can't. Using files lets one use *names*. Most folks can remember names much better than numbers.
You see the same sort of effect in programming languages. Some use a small number of words and lots of punctuation and other syntax. C++ is an example. Others use lots of reserved words - e.g. Pascal. Which kind you prefer (all other aspects being roughly equal!) may well depend on your ability to remember bare facts.
As for GOTO's... Many organizations and programming standards forbid them. One of the main reasons is not the GOTO itself, it is the place that it transfers you to. You see the label, so you know that there is probably one or more GOTO's that land there. What you can't readily know is what conditions exist (variables, etc.) when landing there. Again, a very good memory for facts can let you know that with some accuracy. A poor memory could require you to scan for that label and examine the conditions active at all of those GOTO's.
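The "what conditions exist when landing here?" problem is exactly why the one widely tolerated use of goto in C orders its labels to mirror acquisition order, so the state at each label is knowable. A minimal sketch (hypothetical function):

```c
#include <stdlib.h>

/* At each label, exactly the resources acquired before the jump are
   live - the labels unwind in reverse order of acquisition, so a
   reader can tell the state at the landing point without tracing
   every goto. */
int setup(void)
{
    char *a = NULL, *b = NULL;
    int rc = -1;

    a = malloc(16);
    if (!a) goto out;        /* nothing to free yet */
    b = malloc(16);
    if (!b) goto free_a;     /* only 'a' is live here */

    rc = 0;                  /* success path */
    free(b);
free_a:
    free(a);
out:
    return rc;
}
```

Compare that with a goto landing in the middle of a function from three different places: there the live-state question genuinely does require the good memory (or the scan) described above.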
A friend is doing some temp work at an Apple store and wanted another friend and I to go in and try them. He is *very* nearsighted, so their set of pre-made lenses couldn't match him and so it wasn't super good for him, although he was impressed. I've recently come down with Diplopia (eyes don't align), so there is no way they could have worked for me. C'est la vie! We were assured that if we bought our own sets that custom lenses would work for us. Neither of us was at all convinced.
Yes, an odd kind of thing for El Reg. That said, I wanted some new noise-cancelling headphones for when some parkade work, including jackhammering, was happening for a month or so in my building. I got a pair from Amazon. The brand is "runolim". CDN $50. Runtime is listed as 70 hours. When connected to a phone, you can use buttons on them to control the playback - e.g. forward and backward in your playlist. I doubt these would completely block out carol singers, but then that's not a priority...
"Rust's decision to call its tagged unions "enums"? That's standard in the academic/functional world where they're known as "data-bearing enums" because, in the reverse of a C union, the discriminant is mandatory but the payload is optional."
Interesting - I hadn't thought of them that way. In Zed I can have a 'case' field in a record (my mind was tracking Pascal variant records), and enums, so it's explicit. In Rust, can you use one of the "enum" types as an array bound type, thus requiring all indexes to be "enum" members? I've found that to be a useful concept.
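For comparison, the "mandatory discriminant, optional payload" shape from the quote above is what C programmers hand-roll as a tagged union (illustrative types, not from Rust or Zed):

```c
/* What Rust calls an enum, as an explicit C tagged union: the tag is
   the mandatory discriminant, the union carries the per-tag payload. */
enum shape_tag { SHAPE_CIRCLE, SHAPE_RECT };

struct shape {
    enum shape_tag tag;
    union {
        struct { double r; } circle;
        struct { double w, h; } rect;
    } u;
};

/* Always dispatch on the tag; C won't stop you reading the wrong
   union member, which is the safety gap Rust closes. */
double area(const struct shape *s)
{
    switch (s->tag) {
    case SHAPE_CIRCLE: return 3.14159265 * s->u.circle.r * s->u.circle.r;
    case SHAPE_RECT:   return s->u.rect.w * s->u.rect.h;
    }
    return 0.0;
}
```

Rust's match forces the tag check and scopes the payload to the matching arm, so the "read the wrong member" bug can't compile.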
Wasn't most of the Burroughs mainframe OS written in Burroughs Algol? I'm guessing the compiler hosted itself as soon as possible.
I imagine they had Fortran and Cobol (maybe even RPG) compilers as well, likely written in their Algol. And no, I have no direct experience.
I first learned C with the real K & R C. C has become universal, because you can do whatever you need (or want!) in it, because it is fairly easy to write an acceptable compiler for it, and because it was (not is!) fairly easy to master. C++ extends and generalizes C. It is not easy to write a C++ compiler, nor is it easy to master C++. (Quick: explain up-casts and down-casts; explain how you specify a specific allocator in a constructor; explain *exactly* how identical method names are disambiguated; templates; ...) I will admit that I don't really know the answers to any of those, since I'm not a C++ guru. I understand basically *what* C++ does, but the syntax details and detailed rules, no.
There are a couple of problems with C that most folks acknowledge, but can't be changed. One is the confusion between pointers and arrays - how many bugs has that caused? Another is declaration syntax - making the declaration of a variable be the same as the use of it was a fine idea, but it led directly to the "cdecl" program. In Zed, my answer to that one is twofold: make "*" postfix, not prefix; and make the order in declarations be the *reverse* of the order in usage.
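One concrete instance of the pointer/array confusion, sketched in C (hypothetical function names): an "array" parameter silently decays to a pointer, so sizeof stops meaning what it looks like it means.

```c
#include <stddef.h>

/* Looks like it receives an array of 16, but the parameter decays
   to 'int *a', so sizeof gives the pointer size - NOT 16 elements. */
size_t bad_count(int a[16])
{
    return sizeof a / sizeof a[0];
}

/* A pointer to the whole array keeps the size in the type. */
size_t good_count(int (*a)[16])
{
    return sizeof *a / sizeof (*a)[0];   /* really 16 */
}
```

Modern compilers warn about the first form, but the language still permits it - which is the "can't be changed" part.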
Anyway, all of these "safe" C variants are constrained by the way the language(s) can be used. Many of those issues go away with fairly small changes in the languages themselves. But, you can't change the languages. Automatically compiling a modified C/C++ syntax into the one compilers accept has been done, I believe. But, they didn't gain a lot of use, as I recall. I imagine that's partly because they didn't really solve any of the root issues in how the languages can be (ab)used.
So, we start afresh with a new language. But, I believe that significantly changing the syntax style of programs (which I see in Rust, and in Zig), only makes it harder for programmers to adapt, which inevitably lowers the adoption rate. Make the syntax too punctuation dependent and you make the poor typists happy, but you make the language harder to dabble in. Make the language too verbose and you make readers happy, but poor typists unhappy. You have to find the right balance.
What good is a new language when there is sooooo much existing C/C++ code to deal with? You handle it by writing large chunks of new or replacement stuff in the new language, and by making calling conventions match so you can plug the new stuff in. Your entire program/system is not "safe" until all of it is changed over, but you are able to make continual progress.
And there is more to it than just memory safety. C enums should be "Enumerations", not just a way to define some names. If you need the latter ability occasionally, have a different kind of type for them, which is hopefully used infrequently. (In Zed, I have 'enum' types and 'oneof' types which match C "enum"s.) What about arithmetic over/under-flow? Both can lead to what seem to be memory problems, but aren't. In Zed, I run-time check 'enum' and integral operations - if you don't want them then use e.g. 'bits8'. Many of the checks can be omitted by code generators based on knowledge from the common semantic code. C 'for' loops should be replaced by proper loop constructs which do what is really needed in 99.99% of cases. For the rest, use "while". Fallthrough in "switch" statements should be explicit, with the default being to not fall through. Then you can get rid of "break", and the semantics of your code become clearer. If you need to get out of loops prematurely, then put the loop in a function and use "return". Semantics are much clearer.
Etc.
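The fallthrough default mentioned above is worth a concrete example (hypothetical function): in C the missing "break" is silent, and case 1 quietly runs case 2's code as well.

```c
/* The classic silent-fallthrough bug: nothing in the source marks
   that case 1 continues into case 2. */
int classify(int c)
{
    int score = 0;
    switch (c) {
    case 1:
        score += 1;
        /* missing break: control falls into case 2 */
    case 2:
        score += 2;
        break;
    default:
        score = -1;
        break;
    }
    return score;
}
```

So classify(1) yields 3, not 1. A language where fallthrough must be written explicitly (as argued above, and as C23's [[fallthrough]] attribute gestures toward) turns this from a runtime surprise into a compile error.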
Poor formatting inside that box - can El Reg make the box wider to get rid of the line wrapping?
All 3 examples are pointer incrementing. In all 3 the compiler can know what chunk of memory the pointer is pointing at. And it still has to add run-time checks on the incrementing. What happens if you pass a pointer to a function and increment it inside the function? How does the runtime know when the increment is bad? The typical solution there is to add more information to pointers, so that their upper and lower bounds are known. Lots of extra cost there.
If you want to use a pointer to scan along a known-size chunk of memory, then require the use of a "for" loop where the iterator variable cannot be changed by user code. Then you can avoid all runtime cost in that fairly common case.
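The "add more information to pointers" cost mentioned above is the fat-pointer approach. A sketch in C (hypothetical names; real checked-C systems bake this into the compiler rather than a struct): three words where a raw pointer uses one, plus a branch on every step.

```c
#include <stdbool.h>
#include <stddef.h>

/* A "fat pointer": position plus both bounds, so every increment and
   dereference can be validated. */
struct fatptr {
    int *p;    /* current position  */
    int *lo;   /* lowest valid slot */
    int *hi;   /* one past the end  */
};

struct fatptr fat_make(int *base, size_t n)
{
    struct fatptr f = { base, base, base + n };
    return f;
}

bool fat_inc(struct fatptr *f)               /* checked ++ */
{
    if (f->p + 1 > f->hi) return false;      /* one-past-end is legal */
    f->p++;
    return true;
}

bool fat_read(const struct fatptr *f, int *out)  /* checked deref */
{
    if (f->p < f->lo || f->p >= f->hi) return false;
    *out = *f->p;
    return true;
}
```

The checked for-loop idea in the previous paragraph wins precisely because it proves the bounds once, letting the compiler drop all of these per-access checks.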
I'll wait to see what comes out, but so far I don't see much of interest.
Nonsense. Don't try to be definitive about things you know nothing about.
The first version of my Zed language started in C (actually it started in my Draco language since I started with my AmigaMUD language). Stayed that way for a couple of years as the language evolved. Eventually I added dependency on compile-time operations, and they were only in Zed itself. The standalone "zedc" compiler (Zed => X86-64 .o files) is about 99.99% Zed. It'll be 100% once I get around to translating a few small run-time routines. It does not translate to C or even to assembler code - it goes from source to .o files.
My first compiler was in fact written in AlgolW. C has never been big on mainframes. My second compiler was written in that first one.
Just scanned the Wikipedia entry for Zig. Will have to look more closely.
Just from the summary, I'd say Zed and Zig are semantically similar, but Zed uses reserved words where Zig uses punctuation.
I initially planned to use compile-time procs to implement generics, but it got icky(tm) and I made them part of the language. I don't think I regret that choice.
Oh, on the issue of polymorphism, Zed has "capsules" (single-inheritance classes) and "interfaces" much like Java ones.
Zed and C mix well - in the initial implementation its a mixture of the two. Currently, my Sys and RunTime libraries are C code, but bigger stuff like Fmt, Elf, Terminal are Zed code. The current "zedc" compiler is all Zed except for that runtime stuff.
Thank you.
There have been attempts in the past to make a "safe C", but I haven't been tracking them. My guess is that they don't have answers for some of the "creative" things that C programmers know how to do, and will continue to do until stopped. :/)
In many ways, my Zed language is a follow-on to C. Most of the things you can do in C, a privileged Zed programmer can do. But you can also do safe things with slightly different techniques. For example my '@' values are much like C++ refs, but are explicit, not implicit, so programmers are less likely to mess things up (although many will complain about the extra character).
Similarly, I use the 'nonNil' concept to let many runtime checks for NULL be omitted. (Note: there was a previous language that used the word "nonNil" for something quite different.) This also clarifies the requirements on parameters, etc.
As for bounds checking, I've got some simple "constraints" that the semantic code can identify and thus inform code generators that some runtime bounds checks are not needed. Part of doing that is that the iterator variable of a 'for' loop is 'con', i.e. not writeable. If your C "for" loop is actually a "while" loop, then just rewrite it as a 'while' loop.
Also, Zed 'enum' values are guaranteed to be in-range, so their use in 'case' constructs, as array indexes, etc. does not require extra checking (checking is done when arithmetic is done on them). Zed has 'oneof' types that are like C "enum" types.
Switching to using this stuff, and more, does take effort - I've done some of it when I translated my initial C parser to Zed, but if management demands it, it's not going to kill you, so long as time is budgeted. You can even do a bunch of prep work in C (like the 'for' loop stuff), checking that your program keeps working. For 'nonNil', you could add asserts at the start of functions where you want one or more parameters to always be non-NULL.
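That assert-based prep work is cheap to start on today. A minimal sketch (hypothetical function, not from any real codebase): the assert documents and enforces the 'nonNil' contract in debug builds, so the later migration is just a notation change.

```c
#include <assert.h>
#include <stddef.h>

/* 'nonNil' contract stated up front: s must not be NULL.
   Debug builds enforce it; a migrated language would prove it. */
size_t count_char(const char *s, char c)
{
    assert(s != NULL);
    size_t n = 0;
    for (; *s; s++)
        if (*s == c)
            n++;
    return n;
}
```

Once every caller survives testing with the asserts in place, you've effectively verified the non-NULL property the new language's type system would then guarantee for free.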
Can someone give me a summary of just what about Rust is causing the translation problems? Likely there are multiple reasons. I don't think I've seen a summary of what the issues are. Yes, Rust has a different syntax. But seriously, that can be overcome. Is it the semantics of borrowing?
I've written stuff in several different assemblers (thousands of lines in IBM-370), C, Lisp, Basic, Forth, AlgolW, Pascal, Fortran, a bunch of my own languages, and likely some I've forgotten. Many of those don't have pointer-like things so you can't do stuff you'd want. Is that the main issue? Is it how borrowing works?
(I've written a memory safe compiler with restricted pointers, structs, safe unions, allocated records, arrays, etc. etc. If I can get it to full maturity, would something like that work? See http://www.graysage.com/Zed/New . I'm close to finished doing all the hacks in Elf .o files to match gcc/ld's desires.)