Re: 404 error?
Yep, "That's fixed it for now, we'll download a copy of the files to our web server and repoint things later... anyone fancy a celebratory pint?", and of course "later" never happened.
Or worse... when that poor sod is you and, other than recognising the bodge as your own handiwork, you've no idea why you did it or how it works! :)
That's the main reason I always document my work. Finding undocumented things done by others is annoying, but when it's something you did yourself and you now can't remember what was done or why... that's just downright embarrassing.
I remember doing similar in my Uni days on the library catalogue machines. They all ran Libertas, which from memory was a simple text-based Unix system for searching for books, so black and white screens, basic etc, but also rarely used by students. If you knew how, you could break out of it and get to the shell to access other systems via telnet. So while students queued up for access to a PC, those of us in the know would grab one of those machines out of sight of the librarians, and use it to access things like BBSes, MUDs, and Unix-based email.
Not sure what the issue is here, everyone knows that there's no legitimate need for end to end encryption online. That's what the UK government keeps telling us (along with governments around the world) to justify banning us from using it / back dooring its implementation, so it must be true right?
Yeah I remember OS/2 Warp being shown off at the Computer Shopper Show in 1994, with loads of PCs available for people to have a go on.
What really let it down was that they'd clearly made no effort to tune the machines to run at their peak, so they performed really badly. A friend actually bought a copy and installed it at home on a slightly lesser-spec machine than they had, and once he'd taken the time to ensure the correct drivers etc were installed, had it running way better than the ones on display.
As a meme I saw recently pointed out, for an American earning $50k their taxes contribute $36 to food stamps and $6 to other social safety net programs, but they also contribute $6,000 to corporate subsidies. Anyone complaining about the first two and not the last one isn't against socialism, they're against poor people.
The Recover Deleted Items option in Outlook (for recovering a message that's been "permanently deleted" but still within Exchange's protection period) only lets you see the Subject line, date, to/from details of a message, so you have to restore it in order to view the content. Wish they'd apply the same to the Deleted Items folder!
I believe with an Exchange setup you can already set it at server level to clear older items from all users' deleted items, so they can't override it... but of course you need to combine it with ensuring all staff know that 1) email shouldn't be "stored" there, and 2) anyone ignoring rule 1 WILL find those messages have disappeared in due course. Plus having management buy in that it's needed and not just IT being nasty, for instance showing how many gigs are taken up by Deleted Items, and the money that costs in terms of Exchange server storage and backup capacity.
I think the marketing departments of many of the "cloud" providers need to take some responsibility for this. They sell their services to small companies as easy and quick fixes, and leave out the limitations of what they offer. I've heard several small companies state that they have "offsite backups", only to find that they're simply using one of the cloud sync services. You then have to explain that no, that isn't a backup, and it provides no protection if, for instance, their data is encrypted or a file is overwritten, since that'll be synced as well, and many of the services only keep the most recent version of a file.
Presumably you'd still end up with the same problem as now, but you'd simply be moving it upstream. So rather than one 40-tonne fatberg in one location, you'd have, say, 40 one-tonne fatbergs, all of which need individually removing, but which are all in smaller, less accessible pipes, so harder to work in.
Mid-90s, while at college, I did work experience doing 1st line support for a large company that had a mix of PCs, minis and a mainframe. The terminals had seriously robust, but stupidly expensive, keyboards costing a few hundred quid each.
One day I had a call, user had spilt coffee on the keyboard. Went to their desk, collected the keyboard, and after consulting with the onsite engineers gave it a good wash, dried it out and returned it… all good, something new learned.
Couple of weeks later I get another call on a Monday morning, keyboard (same type) not working, coffee spillage. Great, I know what to do! Collect it, and repeat the cleaning process. Return it and discover it still doesn’t work. Find out from the user that they spilt the coffee on Friday, but it still worked so they didn’t bother calling us. Chat with the engineers and discover (I had no idea about this stuff at 17!) that coffee + sugar will eat through the membrane if left for long enough. The keyboard was replaced, and the user informed that their manager would be getting a hefty bill (internal billing) for the replacement!
I've seen many many customers do this over the years. Amazingly I've actually seen a customer with a complete folder structure in their deleted items. I assumed it was due to them deleting the folders but then discovered they'd manually created them and used Deleted Items in Outlook for storage, sorting their "archived" mail into specific folders.
First time I saw it was years ago, migrating a customer between mail systems. Due to the size of the folder + limited storage and upload speed we opted to exclude the Deleted Items content, thinking it was deleted so no longer required, and no point spending time transferring it. Customer was not pleased when they found their "archived email" missing, and just couldn't get their heads around why that wasn't an appropriate place to store messages.
"Actually. that was Windows NT4 first, and then Windows 2000 brought in the missing goodies from Win9x (DirectX and some laptop/USB functionalities). XP was the third iteration."
But technically XP was the first version that was NT based but intended for home use. Before then, Windows 2000 was intended to be the successor to NT4 for business use, with Windows ME as the companion successor to Windows 98 for home use. But as we know that sucked, and many people opted for 2000 at home as well, so ME was the end of the line for that kernel and from XP onwards they stuck to a single codebase.
"Except that's not true. You can *believe* or *think* that if you offer visa-free travel to EU27, well EEA+CH+..., they will reciprocate. However, that's not the same as knowing. That was my only point."
I don't believe or think anything, I know (as confirmed by that link) that the EU have already voted in favour of the matter. So assuming it was adopted by the Council of Ministers, it's already enshrined in EU law.
> They cannot know because it's not up to them, it's up to the EU27
I'd argue it's firmly in our court to decide. The EU have already made it clear that they're perfectly happy to allow visa-free tourist travel after we leave, IF and only if we agree to a reciprocal arrangement.
It's our government that either rejected that or failed to confirm they would allow it (can't remember which they said at the time), but the ball is firmly in their court to allow or prevent us from having visa-free travel in Europe.
Too right, the quality of their updates has become so bad that it's hardly newsworthy when they go wrong, and everyone just resigns themselves to dealing with the aftermath, or does everything possible to delay updates, hoping the glitches are fixed by the time their PC forces them to install. IMHO the main selling point of Pro vs Home edition on a home setup is the extra time you can delay updates! I'll pay the extra just to avoid my PC being used as a test lab for their updates.
I didn't even realise there was a worthwhile call/chat option available! Always found the best option was to head to the forums, see if anyone else has the same thing, and if not post. Then wait and hope for an MVP or similar to answer, cos if it's a MSFT response you just know it'll be useless... lots of "I understand", while clearly not reading the information you've provided, and suggesting reinstalling Windows or something equally useless.
"....to comply with GDPR they need to make the slurping "Opt-In" and not "Opt-Out" as opt-out is essentially having a box automatically ticketed, like on websites marketing section when you fill in your address, which now, under GDPR, is a breach of GDPR."
Only if the data they're slurping is personally identifiable information; if it isn't, then GDPR doesn't apply. Whether their belief that it's not PII is accurate is another matter, of course. :)
Yeah that was my thought as well. Presumably if the suit specifically talks about them “reselling” then that may be the simplest defence for Apple. They’re using those systems to provide their service, they’re not reselling them, so even if you think they shouldn’t be using 3rd party services that’s not what they’re being sued over.
Surely it’s also beneficial to their users since it provides even more redundancy… if the data is spread across those different services, even if MS/Amazon/Google had a complete meltdown, the data held in the other services would still be available.
"GDS's estimate of savings is heavily dependent on avoided costs in departments. Estimates of avoided costs are high, based on rejected applications in spending controls."
If the savings are based partly on rejected applications, does that include every rejection even if a single project has had more than one rejection? After all, surely if a department has a need for something a rejection will just result in them reviewing it and trying again, it may not result in the need being abandoned.
So I'm designing a system to fix issue x. I submit request 1 which costs £10m and it's rejected. Issue x still needs fixing, so I review things and submit request 2 which costs £10m. Again that's rejected. Request 3 goes through successfully with a reduced cost of £5m. Now by their calculations, have they saved £5m, £10m, £15m or £20m?
Hopefully might cause a few users to change their ways by making them focus on stuff they care about.
Bad guys could gain access to your email – “Meh, it’s mostly junk anyway”
Bad guys could access our corporate data – “Yeah, but it’s not my data!”
Bad guys could claim your free pizza – “What, this is serious! Better change my passwords!”
"On July 8 this year, more than three million unregistered .uk domains – including household brands from Mars.uk, Heinz.uk, and Maltesers.uk to Colgate.uk and Lipton.uk – will be released to the general public to purchase."
It'd be interesting to know whether those big companies like Mars, Kraft, Colgate and Unilever haven't bothered registering their .uk domains because they're unaware, or simply because they have faith in their legal departments and refuse to be robbed (again, think .biz, .info, .eu etc). Presumably anyone deciding to register heinz.uk better have a damn good reason to have it, and woe betide them if they do anything with it that could even be suggested to be passing off on Heinz's brand otherwise they'll end up in court.
"The entire print, fold (or 'mutilate') and envelope insertion process could be automated, running without human intervention once the print job was fired off."
Yep, and they may not have physically done the print run themselves. So either inhouse automated insertion, or send the data to a 3rd party for them to do it.
Saw GE's printing setup many years ago, which was used for their own stuff as well as third parties'. Very impressive to witness: fully automated printing, folding, envelope stuffing, and even sorting (at massive volumes, if you pre-sort the letters into postal regions you can reduce the postal cost per letter).
The cost to the police and judicial system is entirely relevant to this case, since that's one of the measures used to determine how long a sentence to give.
For instance, from the sentencing guidelines for failure to surrender to bail ( https://www.sentencingcouncil.org.uk/wp-content/uploads/web_Fail_to_Surrender_to_Bail.pdf ) :
"When a Bail Act offence has been committed, the sentence must be commensurate with the seriousness of the offence and must take into account both the reason why the offender failed to surrender and the degree of harm intended or caused. For these purposes, ‘harm’ is not only that caused to individual victims and witnesses but includes the consequential effect on police and court resources and the wider negative impact on public confidence in the criminal justice system."
Yeah, they're either talking about the site that you're ultimately connecting to, eg the bank, social media etc, in which case doing it over a VPN doesn't magically help you if the site is dodgy; or I guess they could be talking about the captive portal site for accessing the public wifi in the first place, but a VPN wouldn't help with that either, since you'd need to connect to the wifi before starting the VPN.
Would have loved to see the ASA respond simply by saying "That's nice, but what the hell has that to do with what we're discussing here right now?!?" :)
"but you still now have to trust that the pilots can fly the plane which has different handling characteristics without stalling it (given there's no longer any automatic trim). And as much as I have huge respect for airline pilots, MCAS was designed to allow the 737 MAX to avoid re-certification and hence additional pilot training."
I think that's a good point. Surely when training pilots for any system which may stop working and then pass direct control back to the pilots, those pilots should be trained in handling the plane in that failed state. So if the design change alters the flight characteristics enough that you need that system to help those pilots, you either need to ensure it can't fail, or ensure the pilots are trained for when it does. How can they get away with essentially saying that additional training isn't required so long as the system works correctly?
They generally don't crack passwords from hashes; that takes a long time. They hash the password they're testing* and compare that hash with the one for your password. If the hashes match, they know what your password is. So even if the hash for MyPassword123 is completely different to the one for MyPassword456, if the hacker already has the hashes for each iteration of MyPasswordnnn it'll take no time at all for them to find it. Or, if they know your old password was MyRe@lly10ngP@55w0rdRocks1, it doesn't take Einstein to try hashing MyRe@lly10ngP@55w0rdRocks2 and seeing if that matches the new one.
* or more likely have a pre-prepared collection of hashes to run comparisons from
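To illustrate the lookup-table approach described above, here's a minimal Python sketch. It assumes an unsalted, fast hash (SHA-256) purely for illustration; real systems should use salted, slow hashes (bcrypt, scrypt, Argon2), precisely because they defeat this kind of precomputed comparison:

```python
import hashlib

def sha256_hex(pw: str) -> str:
    """Hash a candidate password the same way the target system would."""
    return hashlib.sha256(pw.encode()).hexdigest()

# The hash the attacker has obtained (here, the hash of "MyPassword123").
leaked = sha256_hex("MyPassword123")

# Pre-computed table of hashes for a guessed pattern (MyPassword0..999).
# Building this takes moments; comparing against it takes no time at all.
table = {sha256_hex(f"MyPassword{n}"): f"MyPassword{n}" for n in range(1000)}

# A simple dictionary lookup recovers the password if it fits the pattern.
print(table.get(leaked))
```

Even though the hashes of MyPassword123 and MyPassword456 look nothing alike, both sit in the attacker's table, so "completely different hash" offers no protection once the pattern is guessed.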
Yeah, doesn't affect GPO policies etc you've created, just the default behaviour in a standalone copy of Windows.
Great news, especially since Windows 10 makes simply getting to the "Password never expires" option such a pain in the arse I've resorted to doing it via a command line using wmic rather than hunt for where it's been moved/hidden in each iteration of 10.
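For anyone else who'd rather skip the hunt, this is the sort of wmic one-liner I mean (run from an elevated Command Prompt; "Alice" is a placeholder account name):

```shell
REM Set "Password never expires" for a local account without
REM trawling the Settings UI ("Alice" is a placeholder username).
wmic useraccount where "name='Alice'" set PasswordExpires=FALSE
```

Note that Microsoft has since deprecated wmic in favour of PowerShell equivalents, but the old tool still does the job where it's present.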
If your computer has been compromised due to the password being cracked/discovered, do you really think the bad guys continue using it to access your machine? Like hackers currently lose access to a load of machines each day when they hit the password reset threshold, and they have to start all over again? No, they'll have used the access to mess with your setup, and changing the password won't impact their access one bit.
I'd be interested to know which devices are apparently exempt, as my BlackBerry Priv certainly isn't. Confused the hell out of me when my phone updated a week or two ago and I suddenly spotted things had changed. Really annoying, as the old Hub was easily the most useful app I had; I'd long decided that if my next phone wasn't a BB, the first app I'd be purchasing would be the Hub.
Wow, so rather than fix the actual problem, ie that their testing systems suck / don't really exist, by bringing proper testing back in house and, crucially, listening and acting on reports from external testers (for instance when an update breaks things despite people having already alerted them to the problem), they're working around it by just undoing the screw-ups! What could possibly go wrong with that? Place your bets on how much actual testing has been done on this new "feature"!
"Indeed, a shopping cart is the example par excellence of why you need something like cookies. You could pass session ids around as part of the URL if only they couldn't be subverted so easily…"
Except of course in the case of the shopping cart, those cookies may be required in order for the website to provide the service that the visitor is actively choosing to use, in which case the website operator may not even need consent (I'm no expert, but I think contract / legitimate interest would cover it).
My favourite on Amazon is the obviously fake copies of MS Office that are available. Aside from the unrealistically low prices, the fact that in many cases they're selling Professional Plus edition is a bit of a tip off. Pro Plus isn't available as a retail product, so cannot possibly be legitimate, yet somehow simply blocking any entries for it on Amazon is apparently beyond them.
Had many customers fall for it though, insisting they'll supply their own copy of Office only for us to inform them that 1) it isn't legit, and 2) we're not touching it.
"Like anyone else, get a job that pays enough money to cover these expenses.
Most people do work they don't like purely because the income from that work is what they need to live on."
What?!? Have you somehow missed the entire point of this? If an artist or other producer of content isn't good enough to earn a living from it, then they also won't need to worry about their stuff being copied, since it'll be rubbish! If their content is good enough for others to want to use / copy / reproduce, then it must be good, and therefore by extension good enough for them to make a living from. So why shouldn't they be able to?
"Various studies have shown that there is more genetic diversity in Africa than the rest of the world combined but it is better stirred in the rest of the world."
Yeah, I remember seeing a programme with Prof Alice Roberts a while back where she talked about the genetic family tree of humans. Within Africa it was a full tree, but everyone leaving Africa came from a single branch. I think it was based on some of the first genetic analysis done on people all round the world, so they could see that people outside of Africa share a certain common ancestry, but when doing the same comparison for people across Africa the common ancestor was much further back in time. Of course this was quite a few years ago, so with more recent discoveries it may no longer be accurate.
"Although the supposed equation that purports to tell you the odds of intelligent life in the universe is horseshit*¹"
You mean the Drake equation - https://www.space.com/25219-drake-equation.html - which in almost 60 years no one's come up with something better?
The equation isn't horseshit, but our ability to populate it with accurate numbers is. So the more that scientists learn about planets etc and are able to improve our knowledge of each part of the equation, the more accurate the result it returns will be. In the case of natural disasters, that's where fl, fi, fc and L come into play.
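For anyone wanting to play with it, here's a trivial Python sketch of the equation; the parameter values are purely illustrative placeholders, not established estimates:

```python
# The Drake equation: N = R* x fp x ne x fl x fi x fc x L
# R*: rate of star formation in the galaxy
# fp: fraction of stars with planets
# ne: habitable planets per planetary system
# fl, fi, fc: fractions developing life, intelligence, and
#             detectable technology respectively
# L: lifetime over which a civilisation is detectable
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Placeholder numbers only -- tweak each factor to see how sensitive
# the result N (detectable civilisations) is to our lack of knowledge.
print(drake(r_star=1.5, f_p=1.0, n_e=0.2, f_l=0.5, f_i=0.1, f_c=0.1,
            lifetime=1000))
```

The point stands either way: the structure is just a chain of multiplied factors, so any one badly-estimated term (especially fl, fi, fc and L) swings the result by orders of magnitude.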
There’s a difference between knowing enough about a subject to think you know all about it, and knowing enough to know how little you actually know. I suspect many of us in the industry would admit to thinking we knew it all earlier in our careers based on the limited knowledge we possessed at that time, then as knowledge of subjects increases so does awareness of how much we still have to learn. In several areas of IT where I know a reasonable amount, if you’d asked me to rate my knowledge out of 10 back in the first few years of my career I’d have easily rated myself far higher than I would today. That doesn’t mean I knew more about it back then, rather that while I know more about it I also have a much better idea of the scope of that area and therefore how much more there is for me to learn.
I'm completely onboard with the general shock regarding a minister not knowing something so basic, in a way less as a criticism of him and more against whoever chose to put him into that position. Surely after all this time it must be common knowledge that he doesn't do IT! It would be like appointing someone who didn't know the difference between a cow and a sheep.
"Today any company president uses a PC,"
On that I call BS. I can think of a couple of big company Presidents/CEOs/MDs who don't use computers at all. They have a secretary/PA who is quick at dictation and very quick at typing, and who has done this for the boss for perhaps 30+ years, allowing the boss to focus on his/her job... why would they suddenly now start doing their own typing etc just because computers are more common? Computers are supposed to make things faster and easier, and if they don't achieve that in a particular situation, what's the point of using them? They wouldn't reply personally to most of the letters written to them, so why would it be any different with email?
"I thought stars cooked metals themselves, at least as far down Mr Medeleev's bedsheet as iron."
The original stars only had hydrogen and helium to burn, and over time via fusion created some of the other elements. My understanding is that only the lighter elements form that way, and it's not until the star dies and explodes that you get the higher-numbered elements (including metals above iron). So now you have a gas cloud containing a much wider variety of elements, and stars that form from that new cloud will contain those, and have more metal within them from day one.
So for instance, since gold is only created in a supernova, if you detect it within a star then that star must have formed from a cloud created by a previous generation of stars, while if it has none (eg it's metal-poor) then it's much older and possibly from an earlier generation.
I struggle to see how much of a real difference this will actually make. Excluding money they ferret away to other countries, the main reason as I understand it that the big companies pay very little in the way of corporation tax is that it's based on profit, and companies like Amazon spend it / reinvest it rather than leave the profit on their books where it's taxable. So how is a new tax which is also based on profits (according to what Hammond said in his speech) going to help? Surely they'll just continue as they do now and still pay very little tax, since they still won't "make a profit"?!?
"No employee should be able to fully export their payroll data and take it out of the building."
Did you even read the article to the end? It was his job specifically to export that data!
"Skelton, the data thief, was an IT auditor for Morrisons."... "After external auditor KPMG asked for copies of various data including the entire company payroll, Skelton made a private copy of it from an encrypted USB stick."
So not only was he the one tasked with making the copy, the export had been made to an encrypted device, which to my mind suggests Morrisons' procedures had taken care to protect the data in transit; but he, knowing the details needed to access that secure drive, made the copy from there and not from their systems directly (so avoiding any audit logging they might have had in place for tracking mass exports).
Biting the hand that feeds IT © 1998–2020