Re: corporate employee or fulfillment center associate
Read The News. The government explicitly wants to roll back environmental standards and remove food origin information so that shitty US food can be introduced in a trade deal.
No need for business intelligence software: run a report each week that outputs all customers whose refund amount or number of refunds exceeds a given boundary in a recent timescale. Review it manually; the report shouldn't be that long, even with a large number of customers.
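A minimal sketch of such a report, assuming a hypothetical `refunds` table with customer, amount and date columns (all names and thresholds are illustrative):

```python
import sqlite3

# Hypothetical schema: one row per refund.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE refunds (customer TEXT, amount REAL, refunded_on TEXT)")
con.executemany("INSERT INTO refunds VALUES (?, ?, ?)", [
    ("alice", 15.00, "2020-03-01"),
    ("bob",   90.00, "2020-03-02"),
    ("bob",   80.00, "2020-03-05"),
    ("carol",  5.00, "2020-01-10"),   # outside the recent window
])

# Flag customers whose refund total or refund count in the last 30 days
# exceeds a chosen boundary. A fixed 'as of' date keeps the example
# reproducible; in practice you'd use date('now').
rows = con.execute("""
    SELECT customer, COUNT(*) AS n, SUM(amount) AS total
    FROM refunds
    WHERE refunded_on >= date('2020-03-31', '-30 days')
    GROUP BY customer
    HAVING total > 100 OR n > 1
""").fetchall()
```

Scheduled weekly, the output is short enough to eyeball - which is the whole point.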
It would be much better if mail clients defaulted to plain text, and switched automatically to HTML if formatting is added. Outlook can certainly be set up to default to plain text and to prompt whether HTML is desired when formatting is added. For 98% of my work e-mail plain text suffices; HTML handles the other 2% containing screenshots and tables.
HTML is a huge pain when encountered in plain text mailing list archives: it's not easily searchable, and frankly unnecessary most of the time.
Sarah's partner shouldn't be surprised by the OpenBSD plain text requirement, it's signposted clearly on the website, and really is not difficult to do.
Frankly I'm less than worried about Linux contributors - it's overflowing with those, and well funded by contributing commercial companies. I'd be more concerned with the BSDs or minority OS where the contributor list is small.
I'm getting into NetBSD kernel development and the largest barrier to entry so far is getting it working on hardware that was really good in 2012 (I suspect dual CPU and/or multiple PCI-e bus shenanigans), as I'm unwilling to buy multiple systems to handle dev, gaming, and Windows. Fortunately, after a lot of faffing, I've set up a dual boot to ESXi, and hope to use that to host a NetBSD debug target until I can fix the issues that prevent me running it on bare metal.
With respect Dio, you haven't thought this through. Not only does this affect the submission process, which may force some contributors to use a different client or in some cases hardware (if they're using something really old), but more importantly it affects the mailing list archives which are both distributed and easily searchable.
If someone wants to write a shell script/program that automates this, there's nothing stopping them, but the one thing it absolutely shouldn't do is pander to HTML mail clients, which just make parsing harder.
£300 will get you a decent 1440p-capable GPU; anything above that is dubious without limiting yourself to 30fps or compromising on visual quality.
I thought the same when the Next came out the first time, but I didn't look very closely.
It isn't a modern 8 bit computer with bells on, it's a well engineered FPGA system capable of running a number of cores as they would run on the original chips, centred on the original Spectrum chipset with a number of graphical and sound enhancements, plus e.g. a viable CP/M environment.
More importantly it's a modern supported platform with an associated community. You could emulate a speccy and write your own code to add emulated features, but the chance of building a contributing community off that is low.
The alternative is a DE10 with I/O board and case, which won't cost a horrendous amount less than a Next, and doesn't include the same expansion facilities as the Next.
I was tempted by the DE10, but I suspect I'm going to go with the Next instead.
No, I think that's unfair. I personally think Microsoft has swung too far in the direction of standardised configurations, but most users (well over 90%) have a very standard mass market PC with one processor, one graphics card, and one SSD.
For that 90%+, Windows 10 works just fine.
However, with their rolling release program they also use too many real life users as guinea pigs. I'm not in favour of this new development style.
Not so much if:
Your hardware is unusual
Your hardware is old
You want to do something odd, or work in a non approved Microsoft way
After initial issues with hibernation on my work Windows 10 system, updates have made laptop-based Windows 10 pretty solid. I haven't played much with WSL, but Microsoft has to be applauded for it.
For home use in a moderately complex configuration there are a load of issues:
Have older monitors? Windows 10 is not entirely happy with KVMs
Old hardware? Recently had to re-install Windows 1903 from scratch because an X-Fi Titanium absolutely Will Not Work on a 1909 system that's had various cards installed over time, even after driver re-installs, driver cleaners etc.
The mess of control panels in win32 and UWP format, both of which are required as mentioned by others.
The graphical boot manager, also used in Windows 8, sucks majorly. Revert to the previous one, please.
Automatic driver installs are a *huge* pain if they break your system configuration. I've had to install with the network connection removed to stop driver updates.
Windows Mixed Reality can break badly and is nowhere near as seamless as Oculus
It's impossible to identify disks during install without opening a command prompt and using diskpart, at which point a reboot is required to return to the install! So you have nine disks on a RAID controller, but no idea which one to select.
It'd be nice if Hyper-V were a little less limited, especially around assigning serial ports to a VM, although I realise this is a minority requirement.
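For reference, the diskpart detour mentioned above comes down to a handful of commands - `detail disk` shows model and bus information you can match against physical drives (disk numbers will vary):

```text
diskpart
DISKPART> list disk
DISKPART> select disk 3
DISKPART> detail disk
DISKPART> exit
```

It works, but having this in the installer's disk-selection screen would be the obvious fix.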
Having said all of the above, and despite the fact I'm trying to move everything except some gaming and VR towards Unix, Windows 10 really is, as mentioned, a solidly engineered product. It copes with practically no issues with disks in a RAID JBOD configuration, something that makes almost every Unix scream. My complaints mostly come down to poor driver quality (I don't know if I can blame Creative Labs for the X-Fi driver issues I've had, but I can definitely blame AMD for their deeply shitty Vega 56 drivers, which *STILL* won't load their control panel if the graphics card isn't set as primary in Windows).
You do know that multi threaded programming has been fully possible in C right from Windows NT 3.1, not to mention OS/2 in the 80s? (Although the file dialog is written in C++, and calls a lot of interfaces).
Microsoft are not stupid, and each new release of Windows has been optimised beyond the previous one (Yes, sometimes this is accompanied by a lack of flexibility or functionality, or increase in requirements in other areas).
If the file dialog doesn't populate bit by bit, there's a reason for it.
Set up your kernel debugger (preferably on a pair of VMs if you can), choose the area you want to change, read the code, start making modifications, repeat until bug free. If you've been coding for 20 years you'd definitely be up to the task.
There's plenty of people to ask about kernel hacking, and if you need to write architecture-specific functions, processor documentation is usually pretty decent. It's when you need to prod specific chipsets and add-on cards that specifications are less available.
I'm shortly about to start some kernel mods for an open source OS to add functionality to an older system, and once I get the hang of that, something more complex. I know which bit of code it's running (because the module is named, and relates directly to a source code file), so the first task is a breakpoint. Then step through till it gets to the 'your system is too old' part. At that point improve the error message/check the logic to ensure it's rejecting the system using the correct method.
If I understand that bit I'll then know what new functionality is required, which means reading the documentation for kernel support functions and processor datasheets, all of which is readily available, plus some coding in C and assembly. I'm not expecting it to be easy, but I at least know what I need to do.
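As a sketch, the two-VM setup described above might look like this with QEMU's gdb stub and a kernel built with debug symbols (the image name and breakpoint symbol are illustrative, not from any real driver):

```text
# Guest: start QEMU with a gdb stub (-s), halted at reset (-S)
qemu-system-x86_64 -s -S -m 512 -hda netbsd.img

# Host: attach gdb to the stub and break in the suspect module
gdb netbsd.gdb
(gdb) target remote :1234
(gdb) break suspect_module_attach     # illustrative symbol name
(gdb) continue
(gdb) next                            # step until the 'too old' rejection
```

From there it's exactly the loop described: read, modify, rebuild, repeat until bug free.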
Gosh, wonder why that could be?
I mean if you want an extreme example, there's the ncurses library maintainership hassle : https://invisible-island.net/ncurses/ncurses-license.html
Thankless job at the best of times, even worse when you can't easily fire someone.
The registry was actually introduced in Windows 3.1, then the usage was greatly increased in Windows NT/95, as thousands of configuration files aren't always easy to use.
This was prior to widespread use of DVD, or anything similar to HDCP, so I don't see why you're pushing the DRM angle.
I would imagine that they're storing file deletion locations 'somewhere' in NTFS without bumping the file system version number, because as you say the underlying disk format hasn't changed for ages (actually since XP).
To be picky, that wouldn't be OS/2 Warp, it'd be OS/2 2.1 at most
Windows '95 was out in.. guess
Windows 3.1 was released in April '92
OS/2 2.0 was also released April '92
2.1 end of March '93
3.11 November '93
Warp was October '94, Connect versions in 95
Then Warp 4 in '96, IBM having wasted a lot of time with the stillborn OS/2 PowerPC in the interim.
Duke Nukem was pretty good at the time though, it's still fun today using eDuke32 (although the first episode is the best, I lose interest part way through the second)
Yes, I know this as I alluded to in my reply.
However, for automated processes a long chain of utilities is only 'reliable' because a great deal of care has been taken with the output, and people have shouted when output-breaking changes occur (which they have). Relying on people shouting, rather than having an object pipeline, is not ideal.
I am also aware that there's effort underway to convert utilities to use libxo, which should make output less fragile.
I mostly like PowerShell, but it's definitely not perfect.
However, if you're comparing its strengths (defined interface, some documentation) against a series of single-purpose Unix utilities whose output can never be modified as it'll break things, it's clear PowerShell has the better architecture.
I'd rather use Python, C, etc than either given the choice though.
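The fragility argument can be sketched in Python rather than PowerShell: scraping another tool's text output breaks the moment a column changes, whereas a structured interface carries on working. A toy illustration (the 'tool output' is invented):

```python
# Fragile: scrape fields out of a text report by position, the way a shell
# pipeline scrapes `ls -l` or `ps`. Any change to the column layout
# breaks the consumer.
report_v1 = "alice  3  120\nbob    1   40\n"

def parse_text(report):
    out = []
    for line in report.splitlines():
        name, count, total = line.split()   # assumes exactly three columns
        out.append((name, int(count), int(total)))
    return out

# Robust: pass structured objects instead, the way PowerShell pipes objects
# between cmdlets. Extra fields are simply ignored by consumers.
records = [{"name": "alice", "count": 3, "total": 120},
           {"name": "bob", "count": 1, "total": 40}]

def parse_objects(recs):
    return [(r["name"], r["count"], r["total"]) for r in recs]

assert parse_text(report_v1) == parse_objects(records)

# Now the tool adds a currency column: the object consumer is unaffected,
# but parse_text(report_v2) raises ValueError on the four-way split.
report_v2 = "alice  3  120  GBP\nbob    1   40  GBP\n"
```

Which is exactly the libxo motivation mentioned above: make the output machine-readable so consumers stop depending on whitespace.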
There's definitely a degree of that, but we all know vulnerable people, our parents and grandparents at least. The health service is likely to take a hammering, but if most people can last without medical assistance, then there's more resource for the vulnerable and badly affected. It could be much worse.
Point of note, flu is imperfectly vaccinated against. Health professionals look at the strains circulating, find a few of the most probable ones to spread, and create a vaccine from them.
If they're correct about the strains most likely to spread then the vaccine works and there are minimal infections, but if they're wrong the vaccine offers little protection.
Better still is the return of distinct search and browsing entry fields, and not automatically sending application/file searches to the Internet (hello, Windows 10).
However, it's not uncommon to paste things that ideally shouldn't be pasted into a search engine, and a paste preview could help this.
I remember, back in the days of OS/2, a particularly decent utility that extended the number of clipboards and offered macro facilities - very useful. I'm sure similar programs exist today.
...which is out of support. Well done..
If you're 'never' changing to 10, you'd better start using Unix now because you won't have much choice otherwise.
10 isn't perfect, but it's alright and very stable provided you do things its way, and use relatively modern hardware. I'd recommend using the Pro version too..
When you've got 500 terminals that only support TLS 1.0, and the cost to replace them is something approaching £2000 each, I look forward to the complete lack of accountancy input in the decision to drop the best part of a million pounds on replacing your infrastructure.
On the one hand it's a pity the very specific niche of secure keyboard phones will be no more.
On the other hand I'm still annoyed that Blackberry dumped the Priv's security updates after two years. The hardware and software were good; the lack of updates unforgivable. That lack of customer service, and of the ability to root, meant I refused to move to the TCL Key series.
Now I have an Fxtec Pro1, and there's also the Unihertz Titan on the market. The jury is still out on the Fxtec I'd say: the software is not as polished as Blackberry's and I really miss the word selection and swipe ability of the Priv's keyboard (it may be possible to work around this in software), but the hardware quality is good. It's a landscape phone, which I love, but Android is the same now as it was about four years ago, with certain apps only working in portrait despite Android UI guidelines.
 Wait! Just tried split screen mode and it works! Rotates the apps to portrait, with two at once in landscape mode. Brilliant.
Still, the Fxtec is also rootable and there are builds of Sailfish and LineageOS available. I suspect I'll be happy once I've made a couple of Android customisations.
 Granted, the Priv's hardware was also failing. Despite the fact it is fractionally over three years old, the GPS was frequently dropping out, the battery was in definite need of renewal, and occasional random reboots were a thing.
Any changes to boundary zones would be easily detected, and for the most part people won't be wandering round their house in that manner (the only headset that can currently sensibly do that is the Oculus Quest).
It's not a fad; we're now on to generation 2.5 of modern headsets (Gen 1: Rift, Vive; Gen 2: Rift S, Vive Pro, Oculus Go; Gen 2.5: Oculus Quest; Gen 3, when it arrives: Oculus Quest S - a Quest with performance identical to the Rift S when tethered, plus a more powerful portable chipset). It may not quite be mainstream, but there are a number of things that work well in VR, and for basic gaming quite a lot of PlayStation VR units sold.
VR headsets are, however, one of the least hacker friendly technologies out there. They only sensibly work under Windows 10, open source support is practically non existent.
The principle may be alright. The reality and implementation aren't.
I still have old DVD players (second gen, mid nineties) and games consoles using SCART. I'd rather use *anything* but SCART - HDMI is brilliant, component is good, VGA is decent if the socket is well constructed, composite is shit but at least it's easy to insert.
That's not even getting into SCART switches which are even worse than plugging them into a telly.
Over time I'm going to move most of my old (pre Wii U, I might leave the Dreamcast on VGA) consoles to HDMI. It's a doddle to set up, and the switches aren't expensive. There's digital options for the Gamecube and Dreamcast, coming for the SNES, and there's at least cable options for the original XBox etc too..
That was well into the nineties; a quick Wikipedia check says it was actually 2000 when it was open sourced. At that point Windows had been out for fifteen years.
Sure, gcc was free for Linux, the issue was that compared to Watcom (which was cross platform and a reasonable price) it was pretty shit.
I was mad enough to develop a custom FTP program for Windows NT using Watcom under OS/2. It worked fine and wasn't that difficult. It would have been considerably harder fiddling around with gcc debuggers. Even now they're not particularly amazing, especially with respect to live watching of variables (I've tried a few, they're all a pain). Under Windows such features have been standard for decades, and they're available freely in windbg (and some of the Visual Studio offerings if you're not developing for an organisation with more than five people in it)
Hardly. The old OS will still work, it's just that after a while it stops being supported for security fixes and should not be connected to the Internet. Windows generally receives around ten years of support, and most hardware that runs XP will also run Windows 10. This is better than pretty much any other operating system (you may be able to find a very small number of Linux distributions with Super Long Term Support of ten years).
Microsoft continues to support a lot of generic hardware drivers. For more specific devices, it's down to the third party being unwilling to put in the effort to update the drivers.
Microsoft also expend a huge amount of effort in maintaining backwards compatibility. There are issues if you're using a fast track release of Windows 10, but the long term support releases are available if stability is required.
A program, still in production, created by a dev that doesn't understand exception handlers, and never checked the source into source control (yes, everything else is checked in, and backed up).
Not necessarily a problem until it fails, which it did. No error message. No logging of any use.
Fortunately a) it's written using the .NET runtime, so it could later be de-compiled, and b) windbg is both free and extremely good at debugging. Set windbg to start the program and break when an exception is thrown. Trawl through all the data structures in memory: that one looks like the bit of data it's looking at at the moment, and it definitely isn't in a state the program will be expecting. Let's fix the data and, gosh, the program runs through fine..
 If you do this, you have to load one of the several .NET debugging addons. Well worth it.
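A session of the kind described might look something like this, assuming the SOS managed-debugging extension (the program name and address are placeholders):

```text
windbg badapp.exe            (hypothetical program name)
0:000> sxe clr               break when a CLR exception is thrown
0:000> g                     run until the exception fires
0:000> .loadby sos clr       load the .NET debugging extension
0:000> !printexception       show the managed exception
0:000> !dumpheap -stat       survey managed heap objects by type
0:000> !dumpobj <address>    inspect the suspect data structure
```

All of which beats staring at a program that fails with no error message and no logging.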
Poor attempt at trolling, 3/10.
Micro USB is very convenient, but I'll agree the connectors eventually break and the sockets gather fluff. However, it's a small socket, so there's probably an engineering limit there.
SCART is awful. Multiple standards in one cable, too easy to pull out, too difficult to push in, horrible when constructed cheaply. HDMI is brilliant, coupling high bandwidth and sound in one sturdy connector.
There's nothing IDE did better, unless you count connecting two devices to one socket. Routing SATA or SAS cables is far easier.
Parallel - you sort of have a point, as it's extremely tough, but the standard parallel port tops out at 1Mb/s and the rarely configured ECP/EPP at 20Mb/s, as opposed to USB 2 at 480Mb/s, or wireless at an absolute minimum of 50+Mb/s these days, and probably more than 100Mb/s. No, I think we can say parallel does not win.
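To put those quoted link rates into perspective, a rough back-of-envelope for moving a 100MB (800Mb) file at each, ignoring protocol overhead:

```python
# Quoted link rates in Mb/s; 100 MB = 800 Mb of payload.
rates_mbps = {
    "parallel (SPP)": 1,
    "parallel (ECP/EPP)": 20,
    "USB 2.0": 480,
    "wireless (conservative)": 50,
}
# Idealised transfer time in seconds: payload divided by link rate.
seconds = {name: 800 / mbps for name, mbps in rates_mbps.items()}
```

Thirteen-odd minutes on a standard parallel port versus under two seconds on USB 2 - not a close contest.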
Ethernet has kept the same connector, but gone from cat 3 cables to cat 6..
Pedantically it's also changed for portable devices. When it originated in laptops it was a PC Card device with a pigtail and RJ45 port (with one notable exception). It then became standardised into a laptop itself, or at least in a docking bay, but on some of the more consumer level devices has now gone back to a USB to Ethernet pigtail arrangement again.
Sure, code rot due to evolving environments, and technical debt exist. However open source is most useful when the software is in fairly wide usage and studied and coded by a large number of eyeballs.
When the software is somewhat niche, the only people likely to use the code are your direct competitors. Alternatively, nobody might bother at all - take the case of OpenSSL which was (is) open source, but is considered part of base plumbing, and no-one wished to touch it because it's difficult.
Obviously the trick is to open source as much as possible (e.g. replacing the third party components we use with open source alternatives would save us hundreds of pounds each year) whilst keeping your niche intellectual property closed source.
In the case of work here, yes, we make money by running the software and services, not so much by writing it (there are some on-premises installations but they're limited in number, and there are some highly bespoke customisations which have been lucrative).
However, if the code was given away other firms could out compete and innovate using our internally developed code, so no thank you, we're not giving it away.
Technically I suppose we could have decided to have zero installations on customer sites, base the entire software stack on open source technologies, and license it as GPL knowing that customisations wouldn't have to be released. At the time the software was written originally, Windows development tools were far faster and more effective, and third party components more generally available - this saved time and gathered business.
There is a load of quality open source available under Windows, because a great deal of effort has been expended porting it from Unix, plus other Windows based open source/free but closed source software. The advantage Unix tends to bring is a more integrated packaging environment.
The times when I'm not using multiple mstsc clients to connect to remote systems are few, even after converting some of the tasks RDP is used for into PowerShell scripts..
Granted, less useful on a consumer desktop, but that sounds like a build breaking bug to me.
Biting the hand that feeds IT © 1998–2020