Re: Choose your paranoia level:-
He says, after expressing his superior attitude...
692 publicly visible posts • joined 3 Aug 2020
We agree. My point was, modern programming involves a lot of creating something approximate, pushing the button, and sorting out what's wrong with it. Having a separate typist would mean looking over somebody's shoulder, rather than having them input your written code.
Then somebody is going to say "the old way was better, we had more correct code!" to which I respond, "no, there was still lots of incorrect code, but it took 100 times longer to create and was much harder to modify".
It is worth remembering that unions didn't come into existence for nothing:
https://en.wikipedia.org/wiki/Triangle_Shirtwaist_Factory_fire
And in today's Tesla thread, somebody called out a very nice quote about how capitalism rarely produces safety; regulation does that.
"I can't really see why this shouldn't still be the case."
This is a joke, right? I can understand how this would make sense in the Cretaceous era, when everybody was timesharing and it was very important to ensure that your (presumably very short) code was formally correct before it got anywhere near a real computer. That's not how things work now, though. If somebody had to do my typing for me, it would literally be a matter of me sitting next to them saying "put a curly-brace, then this variable name... actually, I don't like the readability of that, let's move it down a line and put the declaration here. OK, press F5 and let's see what happens. Oh, the compiler wants an explicit type-cast here, let's just fix that."
Introducing another person between keyboard and chair would massively reduce programming productivity.
When I look at this kind of thing, my first thought is usually "these people are nuts, there are always lots of things that need to be done. It doesn't matter if you get rid of one task".
I have to remind myself that not all jobs are like that. We are fortunate to have such flexibility in engineering type roles. Further down the food chain, people's livelihoods really do depend on a very narrow set of tasks they have been contracted to do (to move to another task, they would have to re-interview for a new job). While I don't support avoiding increases in efficiency/quality just to artificially keep people in work, I do understand the paranoia, and I can't blame them for not really caring that they are making things worse.
In contrast, I am actively trying to impart certain skills I have to co-workers so that they don't need to come to me all the time. Completely different mindset.
We had some intern resumes recently that listed "navigated the internet with ease", "Skilled at navigate[sic] the internet" and "Ability to use social media such as LinkedIn".
Under normal circumstances that would be bad. We are a high technology company hiring SE and CS students. Yeah, you had better be able to use the internet. And given that this is a writing position, you shouldn't have typos in your resume.
People often have different expectations for how they will be treated in a corporate environment than in their personal lives. Especially on their first day. You should probably default to not asking the women at work to show you their tits. That is also a good rule for real life.
Not really a joke along the same lines, but my home PC has the C64 start screen as a background. Of course, to end a teams call, I have to back out from RDP to my local computer. Several times when I have been screen sharing, I have heard somebody laughing just as the call ends and then gotten a message saying "Sorry, I was laughing at your background".
Yes, and therein lies the problem. When it comes to an always-on computing appliance, too much is too much. I keep my NAS in a closet. The power brick is rated for less than my normal computer draws at idle.
A dual-i7 machine needs to be properly cooled, will make more noise unless you invest a lot of money in big coolers, will draw lots of power, and will be much larger than a toaster (a 4-bay NAS isn't much larger than if you took 4 HDDs and put some thick paint on them). Also, if you don't already have such a machine, it will cost you more than you first expect to buy it.
The advantage of running programs on a NAS is that it is a small, low-powered, always-on appliance. For example, I run SVN, MySQL, and a UT99 server off my NAS. Otherwise, I would have to run all these things off my PC (which uses more power and is therefore not always-on) or set up yet another box (which uses more power and will need to find a home somewhere in the apartment).
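To put rough numbers on the "low-powered, always-on" point, here is a back-of-the-envelope sketch. The wattages and electricity price are my own illustrative assumptions, not figures from the post:

```python
# Rough annual running-cost comparison for an always-on box.
# All figures below are illustrative assumptions, not measurements.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Convert a constant power draw in watts to kWh per year."""
    return watts * HOURS_PER_YEAR / 1000

nas_kwh = annual_kwh(20)      # small 4-bay NAS, assumed ~20 W average
desktop_kwh = annual_kwh(80)  # desktop PC at idle, assumed ~80 W

price_per_kwh = 0.30  # assumed electricity price per kWh
print(f"NAS:     {nas_kwh:.0f} kWh/yr (~{nas_kwh * price_per_kwh:.0f}/yr)")
print(f"Desktop: {desktop_kwh:.0f} kWh/yr (~{desktop_kwh * price_per_kwh:.0f}/yr)")
```

Even with generous assumptions, leaving a desktop on 24/7 to host the same services costs several times what the appliance does.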
This became real for me when I was required to take a business writing course. For one exercise, we had to respond to somebody we were giving bad news to. I had marks deducted for NOT using the passive voice.
This is merely a microcosm of the things that disgusted me about that course. There were many times that we were given advice which, in my subsequent role as a technical writer, would have gotten me into another round of editing.
I feel it is important, now that I have had my coffee and bothered to double-check, to come back and correct myself. That song was sung by Freda Payne, and therefore the premise of my joke was incorrect. We regret the error. Please accept the following, revised joke:
Are you suggesting that The Supremes should have kept them hanging on?
At a Greek restaurant we rather like, their website tells you that using the UberEats link they provide gives Uber a much lower cut than if you use the UberEats website or the app. I presume this is because if you go to the restaurant's website, you are no longer using the UberEats discovery service (basically, the restaurant is now selling ME to Uber instead of Uber selling ME to the restaurant).
It occurs to me that somebody could build a tool that aggregates these links as an "open" discovery service. Then consumers get a convenient interface and restaurants get paid.
Obviously the problem with this is that Uber et al. would do everything they could to break it, all day, every day.
No no. If I do that, then every computer problem she ever has will become my problem. I have yet to see her touch a computing device she couldn't screw up. No no, she is sticking with the thing that she knows best because I am NOT spending the rest of my life being blamed for the fact that she lost all of her photos for the fifth, sixth, and seventh time. No no, she is sticking with something that she at least kind of understands and which my dad can mostly hack.
I set up a simple NAS for her to back things up to. All she has to do is run FreeFileSync, which requires pressing an entire 3 buttons. She mentioned today that she never uses it because it's not user-friendly enough. I asked what would make it more user-friendly and she said it was just about "spending time to learn a new thing".
You do not have my mother, destroyer of devices, breaker of software, loser of files.
No. I am not setting up Slackware for her. I will not like it.
I try to explain this exact thing to people, and it usually doesn't go down well.
I recall a couple years ago, a 50-something long-time Mac user (a technical guy, but not a computer expert) staring me in the face and saying in a very serious voice "Windows is crap, only Macs are good". He was sure to say it very slowly so I would understand. It was like some kind of holy scripture.
With that lesson learned, I went back to my Linux/Windows dual-booting Dell laptop with the C64 background to continue what I was doing.
I ask you to consider the Xbox power button, which is inexplicably NOT the huge button with the ring around it which is OBVIOUSLY a power button. Instead, it is the small button underneath which is OBVIOUSLY an eject or reset button.
And then there is the thing on the center of the controller, which is so OBVIOUSLY a button that they decided to actually make it one for the next iteration of the system :p
I agree with you in principle, but AFAIK in the specific case of medical equipment: the reason things don't get updated is that equipment is qualified as appropriate for medical use as a whole. If you swap out a part, you need to replace it with an identical part (or sometimes an approved replacement).
This is AFAIK. I am not an expert in this.
That's all well and good, but the fact that memory access is so slow is a problem for writing efficient code.
Yesterday I was discussing with a colleague the idea of implementing a large data set as a linked list so it would be very efficient to break off chunks of it and move them to other objects. He pointed out that the disadvantages of memory fragmentation were so significant (largely because the data won't be loaded into cache together) that it was usually better to just put the data in an array and accept the cost of copying when it happens.
Yes, there is a lot of inefficient gibberish in the world (interpreted languages still exist), but even when you are trying, the unbalanced nature of modern hardware performance heavily favors some practices that should logically be slower.
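A minimal toy sketch of that tradeoff (my own illustration, not the code from that discussion; Python's per-object overhead exaggerates the pointer-chasing cost, but the direction matches what happens in cache-aware languages):

```python
# Traverse the same values stored two ways: a contiguous array
# versus a pointer-chasing singly linked list.
import time

class Node:
    __slots__ = ("value", "next")  # keep each node as small as possible
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

N = 1_000_000
array = list(range(N))

# Build the equivalent singly linked list (each node a separate allocation).
head = None
for v in reversed(array):
    head = Node(v, head)

def sum_array(arr):
    total = 0
    for v in arr:
        total += v
    return total

def sum_list(node):
    total = 0
    while node is not None:
        total += node.value
        node = node.next
    return total

for name, fn, arg in [("array", sum_array, array), ("linked", sum_list, head)]:
    t0 = time.perf_counter()
    result = fn(arg)
    print(f"{name}: sum={result} in {time.perf_counter() - t0:.3f}s")
```

The contiguous traversal generally wins: every linked node is a separate heap allocation, so the hardware prefetcher can't stream the data the way it can with an array, exactly the effect he was describing.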
I'm not sure how long ago this was, but depending on the decade a female programmer might have found it more difficult to change jobs (especially if she didn't get a good reference). I also wonder if that exacerbated the embarrassment of the "slighted" employee.