I swear at Confluence every day
See title
909 publicly visible posts • joined 13 Aug 2007
“Boards need to be curious and discerning about the information provided to them. You can’t PowerPoint your way out of this risk. Don’t let management do that to you.”
I guess this is sugar coating the pill that it has been the boards refusing to invest, rather than being misled by their minion management. Whatever the cause, the quote seems to be the wrong way around.
I guess there is a risk of "reputational damage", though they don't really have any positive reputation to speak of.
If an employment tribunal is called, and they lose (quite likely if the info in the story is a fairly complete picture) - what do Rockstar really lose?
The C Suite don't really care if they kill the company, so long as they have enough time to sell shares / get out to the next patsy. Shareholders only care while they hold the shares. It's kinda odd that the folks most invested in most large companies are the grunts, who have the least say / influence.
It's very "told you so", but C suites have wilfully underinvested in IT for a decade or more, while the risk was very clear. The costs are becoming clear - but for every C-Suite person, it's a risk profile of "so long as it doesn't happen on my watch". And they are a bunch of risk-takers, as they can always golden-parachute out when things do go wrong. Given that system, how do companies learn? Perhaps the smaller ones can. Fingers crossed.
Sounds like a made-up number to sound good, but once anyone claims, then there's the need to "prove" loss of income, which I imagine to be near impossible. And loss of profit certainly wouldn't be much for the huge majority of books copied. So they can say any headline-grabbing number they want, and for those few prepared to lawyer up, pay out $100 or so should they successfully grind through the process.
It is tiresome in the extreme that whenever somebody/something wealthy has a ruling against it, an appeal seems almost immediate.
Don't there have to be formal grounds to appeal? And some likelihood of success established?
I assume it's always been the way: to protect the rich, and pay more lawyers...
Ignoring the snooping / tracking "feck your cookies" dialogs, the thing I'd really like is for pages not to reposition items in the view as the page loads.
I want the button I'm about to click to stay the button I think it is, not for the layout to shift so the mouse is now over a different button 100ms before I press it.
There is no value to "Google's own app catalog" other than the restriction that, without sideloading, you have to use it.
A worse search engine/discovery interface one would struggle to find - even worse than Amazon, and that's a very low bar.
This concept might be appealing for those in industries already facing staff shortages. However, the real concern is AI's potential to cut staff...
I assume most such industries are not short of staff because they can't find suitable staff: rather, they refuse to pay for them as a business strategy, the solution being to have current staff doing <insert number>% more work than is reasonable. These industries are already hamstringing themselves to reduce the number of staff. Any sniff of an opportunity to shrink the staff pool still further is very likely to be grabbed with both hands - again, almost regardless of how much damage it may do to the functioning of the company.
If the company alleges not to have read its own ToS, then they are on shaky ground assuming their users did.
If it claims it didn't understand its ToS, again, users now have an open door to claim the same.
Or perhaps these shoddy ToS sections need to be regulated to something meaningful.
At least nobody believes the company exec line, as indicated by "I am very concerned that Ukraine's counter offensive was monitored in real time and troop locations were exposed to facilitate drone strikes"
While we've all become accustomed to nonsense being spouted from execs about data breaches, I wonder if Ukraine will accept such glibness in their current wartime scenario?
https://www.bbc.co.uk/news/uk-66510136
But it's OK, according to them, 'cos "data was hidden from anyone opening the files"
Those well known sure-fire ways of keeping data private one assumes:
Excel's Hide Column / Word version history / black highlighter superimposed over text / white text on a white background
I wonder which one they used?
The tech paper behind this story is here:
https://news.airbnb.com/wp-content/uploads/sites/4/2020/06/Project-Lighthouse-Airbnb-2020-06-12.pdf
I couldn't see whether other variables (maybe the number of previous Airbnb uses, and reviews thereof) were also visible to the property owners. If any such data was there (I've never used Airbnb so I've no idea how the system works), I could see no attempt to establish whether there was bias in those other variables between the perceived racial groups. But the tech paper wasn't a nice read, so I'll admit I didn't spend too long trying to understand it.
The irony is strong in this one
https://www.mcgill.ca/oss/article/critical-thinking/dunning-kruger-effect-probably-not-real
I'm not actually going to argue the arguments - I do know that I don't know enough - but given the context of the Gödel / Turing chat, the concept of (potential) equivalence made me smile
Auscultation has been on the way out for quite a long time: many doctors aren't properly trained in it nowadays, making some of the measurements a tad difficult to assess without a more in-depth read (which I haven't done, obvs.)
I wonder if doctors will keep the stethoscope as a "symbol of medicine" once none of them know how to use it?
One can imagine a future where many a site will use NLP models pulled from <wherever> to power their websites, or models for reading barcodes, or models for correcting grammar, or models for <insert task here>.
They will be the same users who pull <whatever> set of JavaScript dependencies to sanitise an input string, and have neither the interest nor the skill to debug 'their' work.