Re: Elephant in room
>truly remote areas will have immense difficulty in maintaining any tech necessary to use whatever internet these satellites can muster.
Citation needed.
A solar panel and a smartphone?
If you're developing a country, it costs little to lay fibre down *before* you pave the first roads. Even less if you do it at the same time.
I'm not coming down on the side of fibre vs satellite, though. Too many other complex factors. Sociopolitical, for example - how many tyrannical governments like shutting down internet access, how many twats like to set fire to cell phone towers or microwave relays? How much of our essential infrastructure, such as food distribution and emergency coordination, relies on the internet, and how resilient is ground-based infrastructure to storms, fires, trawlers, civil unrest etc.?
I'm just amazed by the folk here, using the internet, effectively saying that other groups of people don't deserve the internet because they'll only use it for cat pics and pr0n - just because they live in the rural USA or a remote African village.
So, to summarise the above points:
Ground based: lots and lots of telescopes, can be made very big. Atmospheric interference not a deal breaker.
Space based: Just too expensive to put lots of telescopes up there, size is limited.
Okay, so let's look at twenty years in the future. Will there be enough telescopes in space to warn us of meteorites?
I'm assuming that for less urgent, purely scientific astronomy (the cosmos ain't going away any time soon) space or lunar based observations will eventually surpass any earth based telescopes, be this in fifty years or a hundred.
Also, spotting an extinction-level rock hurtling towards us isn't too useful unless we can divert or destroy it. That will require getting hardware into space. The development costs of getting stuff into space have to be funded somehow, be it by commercial satellites, military or state posturing.
Anyway, there's a professor at Oxford who set up a department for existential threats to humanity, and he places giant meteorites quite low down on the list - though devastating, they are rare, and there are plenty of other ways we have of screwing ourselves over.
Kapoor has negotiated an exclusive licence to use a particular material produced in a certain way... other materials with similar properties are available, and who knows, there might be reasons that Vantablack is not the ideal coating for space vehicles.
Maybe microgravity vacuum is a suitable environment for depositing nanotubes on a substrate - i.e., develop a system for coating the satellites in orbit.
Can anyone provide a quick comparison between ground-based telescopes and space-based telescopes? I'd imagine that the advantage of ground based telescopes is ease of constructing very large structures. Is it impractical / uneconomic to use the ever cheaper access to orbit to launch a few big / many small telescopes for detecting asteroids?
The issue is that storing results involves reading the results, which of course means collapsing the superposition - which rather defeats the object.
In addition, building quantum gates that operate faster than the state decay time is still a challenge.
So no, the researchers are not confusing computation with storage.
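To make the collapse point concrete, here's a minimal toy simulation (not the researchers' actual system - just a sketch of the standard measurement rule using NumPy):

```python
import numpy as np

rng = np.random.default_rng()

# Equal superposition (|0> + |1>)/sqrt(2)
state = np.array([1.0, 1.0]) / np.sqrt(2)

def measure(state):
    """Measuring collapses the state: afterwards it holds no trace
    of the original amplitudes, only the observed basis state."""
    probs = np.abs(state) ** 2          # Born rule probabilities
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0
    return outcome, collapsed

outcome, state = measure(state)
# 'state' is now exactly |0> or |1>; the superposition is gone,
# which is why "reading off" intermediate results destroys them.
```

Storing a result means performing exactly this measurement, so anything you "save" is the collapsed state, not the superposition you were computing with.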
> He didn't foresee the system where the company just takes it anyway.
What about that episode of the Simpsons when Bill Gates tells his henchmen to 'Buy Out' Homer's internet company? They then proceed to trash Homer's office, as Bill Gates tells a stunned Homer that he didn't get rich by writing cheques.
Actually, of all people who made predictions about the internet in the nineties, it's David Bowie's prediction that is closest to our current reality. He's telling a sceptical Paxman that the internet will change many things wildly in ways we can't predict, and there will be a lot of bad stuff as well as good.
https://www.bbc.co.uk/news/av/entertainment-arts-35286749
To expand my point, it's akin to the recent change in UK law that makes organ donation Opt Out, so that the default is that your organs can be used to help save someone else unless you explicitly opt out. The aim is to provide more organs for donation. This, along with many other real-world examples, strongly suggests that changing the default option can be more powerful than merely presenting a choice. There's even a UK Government unit nicknamed the Nudge Unit based on this sort of thinking.
If I want a TV, I research TVs, search for a good price and then buy a TV. Then I see lots of advertisements on websites for TVs - advertisements that are wasted on me because I already have a TV.
It would seem that personalised advertising would work better if it only shows me ads for things I have never searched for. "Ohh, an ad for a steel mcguffin, I had no idea they made those... it looks kind of neat, I might buy one!"
Helicopter = big thing requiring a trained pilot (in addition to the actual single human payload) and routine servicing for single-point-of-failure mechanical parts
Flying Car as illustrated = smaller thing that can transport a single untrained passenger. System is tolerant of failure of any one of the mechanical modules, modules which can be easily swapped out.
If there is some reason that the helicopter will always be cheaper, I'm not seeing it.
A large part of military operations isn't combat, but logistics. The shipping pallet was originally US 'military tech' (WW2, Pacific Theatre), the shipping container was civilian invention but popularised by the US military (Vietnam War).
I haven't got an answer to your question, but such an answer might be found in the area of getting a few key personnel from a main base to a forward operating base, perhaps. The military is sure to have a use for moving a few people around quickly and cheaply over rough (but non-combat) terrain.
Well, Douglas Adams was a Beatles fan ("it was hard... not only were parents and teachers against you, but you had to fight Rolling Stones fans, and they fought dirty and their knuckles were closer to the ground") and an Apple Computer fan. However, it's a logical fallacy to then claim all Rolling Stones fans are Windows fans, or that all Norwegian Death Metal fans are Linux users.
Does iPadOS allow multiple user accounts yet?
When Microsoft introduced a Children's Mode (limiting which apps and websites could be used) on their mobile OS, it seemed like a clearly useful feature - even if a phone doesn't have payment details stored in it, the fear of Little Johnny deleting phone contacts instead of playing Snake goes back twenty years.
EDIT: it appears iPads only offer Guided Access mode, restricting Johnny to a single app if he doesn't know your access code. More recent versions of Android, Windows and Chrome OS have support for multiple user accounts.
Yeah, the Mod system was tied to the size of phone - albeit a size of phone many people have settled upon for the last few upgrades. I always felt it was doomed because Motorola didn't licence it out. As you say, economies of scale, critical mass of adoption.
Still, Nokia used to have external data / power contacts across several generations of phone, and I believe Apple have external magnetic connectors on the newer iPads for keyboards.
If you remember the fear of terrorist attacks of all sorts in 2005, a department given money to develop a disguised Geiger counter sounds plausible, especially since using roving mobile sensors is DARPA's stated MO.
Audio recording could be done on a 2005 era phone, and supply data back to base in real time.
So, why an iPod rather than a then-ubiquitous flip-phone? The iPod has greater internal volume to fit sensors, especially if the HDD is swapped for solid state (one assumes that the high cost of solid state storage wouldn't have been an issue for the client, though the lack of disk noise might mark the iPod as suspect). In 2005, an iPod might be allowed into areas where a phone might not be. Or, it might be that someone waving an iPod around arouses less suspicion than someone waving a flip phone around.
An iPod wouldn't send data back to base in real time, but I guess the agents could plug it into a laptop whenever they returned to their vehicle.
> Don't see how you can possibly do that without too high a false positive rate.
That is what I was hinting at by including the Acetone / Nail Bar example that the DARPA programme manager gave. False positives are expected. Their plan is to use roving sensors (they talked of cars, but chemical sensors masquerading as smartphones fit this model too) to build a huge real-time data set and then use algorithms to highlight unusual cases. Fertilizer on a farm or agricultural wholesaler = normal. Fertilizer by a government building = worthy of investigation.
I daresay that a similar approach is applied to purchases made online. Gallons of acetone supplied to a nail salon raises fewer flags than if it were supplied to a residential address in a certain area.
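The context-dependent flagging idea can be sketched in a few lines. This is purely illustrative (the readings, thresholds and context lists are all made up, and DARPA's actual algorithms are obviously far more sophisticated), but it shows the basic shape - the same reading is mundane in one context and suspicious in another:

```python
# Toy sensor readings: same chemicals, different contexts.
readings = [
    {"chemical": "acetone", "level": 9.0, "context": "nail salon"},
    {"chemical": "acetone", "level": 9.0, "context": "residential"},
    {"chemical": "fertilizer", "level": 7.5, "context": "farm"},
    {"chemical": "fertilizer", "level": 7.5, "context": "government building"},
]

# Contexts in which a high reading of each chemical is expected (mundane).
MUNDANE = {
    "acetone": {"nail salon", "hardware store"},
    "fertilizer": {"farm", "agricultural wholesaler"},
}

THRESHOLD = 5.0  # arbitrary "high reading" cutoff for this sketch

def worth_investigating(reading):
    """Flag only high readings that occur outside an expected context."""
    if reading["level"] < THRESHOLD:
        return False
    return reading["context"] not in MUNDANE.get(reading["chemical"], set())

flagged = [r for r in readings if worth_investigating(r)]
# Flags the residential acetone and government-building fertilizer cases;
# the nail salon and farm readings pass as mundane.
```

The point is that the sensor data alone is useless - it's the cross-referencing with location and context that turns false positives into a manageable shortlist.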
For sure, but a little knowledge is a dangerous thing... wait til idiots accost some poor radiotherapy patient for being a terrorist.
DARPA have a programme looking into the problem of separating natural or mundane sources from those that suggest something more sinister is going on... roving sensors and big data, etc. Not just radionuclides, but precursors to conventional explosive too. High acetone reading, ah, but it's outside a nail salon, likely mundane source.
High refresh rates are aimed at making scrolling smoother for everyone, though yes, there is appeal to gamers - and yes, the first phones to feature high refresh rates were aimed at gamers.
I've not used a high refresh rate phone, but there seems to be a consensus amongst general tech sites that it's hard to go back to 60Hz after using a 90 or 120 Hz handset.
> I'm wondering how it tells a pixel is transparent... are these "normally black" pixels?
Other sources say the TV comes with some custom content, to give the impression of fish swimming about etc. It's likely that you can play PNGs with transparency defined by an alpha layer, too.
Otherwise, it just plays normal video in a normal fashion.
We're forever seeing see-thru phones in sci-fi, films like Looper, TV like The Expanse. I never saw the point. And back in our reality, there were some see-thru dumbphones on the market a few years back, with boring old monochrome see-thru LCD panels.
Sci Fi Side note: the origami folding tablet things from Westworld appear to be useful.
What shape would you say the 'promise of Apple' had then, and in what way did it fail to materialise, as you suggest? And are we poorer for it? Genuine questions, because I'm interested in your views and your answers.
Focusing on developers is important, but ultimately developers don't exist in a vacuum - their efforts can only be judged in the context of bringing benefit to users or aiding worthwhile tasks - and of course enthusing budding developers so that they play and learn and then can use their skills productively. A holistic view of these interactions and interrelations is beneficial.
Computers other than Apples were available in the 1980s, and home-brewed software still occurred and left a legacy. At the same time, people who weren't devs - artists, musicians, for example - got to harness computers in a different way. A while later, some fella at CERN found Jobs' non-Apple computer a useful development platform.
So, are we that poor today? Could we have been richer if events had played out differently? I find these questions, and different approaches to answering them, more interesting than the answers.
I still have the tension of not knowing if the damned game will actually run after waiting an hour for the download... Then I have to play a thrilling game of trying to reinstall Microsoft Runtime Environment 2015 or whatever the hell it is, with no guarantee of success (Civilization VI). First game I decide to play for years, and the shonkiness of the process took me right back to the nineties!
The Nord is an OLED screen phone for around £400, when 'flagships' are around £1,000. Yeah, some recent OnePlus phones have been around the £600 - £700 mark, but now they're playing the mid-range game again, why the criticism?
Other notable phones in the £400 area are the upcoming Pixel 4a (October for the UK, likely good camera) and the iPhone SE (known good SoC, long update period, competent camera).
CAMRA in their early days did very important work, but it's been 'Mission Successful' for several decades now - tied pubs were allowed guest ales, and real beer is widely available. (Though the law of unintended consequences is that many pubs sold off by the big breweries were bought by pub companies, whose chief business model is squeezing their publicans by always increasing the rent).
Many CAMRA members are now 'tickers' - those who prize variety over quality (if a pub only offers a constantly revolving selection of beers, there is no dynamic that rewards a really good beer since a drinker won't know which pub to go to to find it. Most of these 'craft' breweries use cheap drum-malted barley as opposed to the traditional and labour-intensive floor malted barley). For them it is a hobby like stamp collecting, not a campaign.
Real ales are alive and well, it is our real pubs that are under threat - and CAMRA would do well to focus on that. Pub companies are one issue, as is the beer price escalator that successive governments have adhered to. If they sincerely thought alcohol consumption was a public health issue, they would put more tax on supermarket booze instead of pricing people out of pubs and community engagement.
Of course in the current situation, in many sectors, many people's jobs and passions are under threat. I would note though that regular pub users are able to talk civilly with others of varying lifestyles and political views - unlike many a Twitter user.
You must have read a different comments thread. Most of the people here are either bashing MS, suggesting how Excel should behave, suggesting that scientists should use a database, defending the use of Excel in this context, talking about cats, or saying things about scientific research culture.
Reading between the lines, most of the comments here are based upon an unease that scientists should have to change their way of working to fit a tool. That most here aren't expressing their unease as outrage actually speaks well of them.
( As for removing loaded, potentially offensive terms, yeah, I might differ from the majority here, but in the appropriate threads I merely suggest to them that experiments and studies have been conducted and that it's worthwhile looking at them. It's not so much offense that is an issue, but continuously and subtly perpetuating views and assumptions that do not withstand objective scrutiny. )
You would lose a lot of talent, skills, various ways of seeing the world, and thus insights, if the only people who were allowed to be scientists were IT literate to a certain level. However, there are some issues to be sorted out.
In a similar vein, a good number of working scientists aren't statisticians, and can as a result fall into traps. Some universities therefore employ some full time statisticians for researchers to consult.
So perhaps the answer is to educate scientists in IT enough so that they know what they don't know, and thus seek assistance - and then provide said assistance. Of course, the learning would go both ways, with the IT specialists learning about the sort of IT issues a scientist in their field might come across in their workflow*.
*And that's another thing - if you're at the cutting edge of something, you don't always adhere to an established work flow. You might carefully record data in a logbook (or Excel file) out of habit, but only discover it to be significant down the line when something surprising has happened.
If this issue affects gene researchers, it might also affect others. Whilst the gene researchers have, as a group, decided on a workaround, it would be sensible for MS to allow users to configure Excel to not auto-format when importing files - this way benefitting other users who might otherwise run afoul. Alternatively, have Excel bring up a dialogue box telling the user how to convert the cell type - "Hey, it looks like you might be working with dates in this column. If you wish to convert it to a date format, select the column by double clicking the header and then right click, Format Cells".
Lots of people use Excel for some pretty exotic stuff, including control and automation. That horse has bolted. Some options, clear and well communicated, would seem to be the sensible way forward. Maybe just a choice at install: Are you an accountant or an engineer?
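For what it's worth, the workaround the gene researchers want amounts to "import as text, never guess". A quick sketch of that behaviour outside Excel (sample data made up; SEPT2 and MARCH1 are the classic gene symbols Excel historically turned into dates):

```python
import csv
import io

# Hypothetical sample data: gene symbols that Excel's auto-format
# used to convert to dates (SEPT2 -> 2-Sep, MARCH1 -> 1-Mar, etc.)
raw = "gene,expression\nSEPT2,4.1\nMARCH1,2.7\nDEC1,3.3\n"

# Reading every cell as plain text - which is what an "import
# without auto-formatting" option would do - keeps the symbols
# intact; any type conversion then happens only when the user
# explicitly asks for it.
rows = list(csv.DictReader(io.StringIO(raw)))
genes = [row["gene"] for row in rows]
```

The cost of defaulting to text is that the user has to opt in to date or number formatting, which is exactly the dialogue-box behaviour suggested above.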
From the 1980s through to the 2000s, each upgrade would bring a significant boost to productivity. More recently, the difference for many users is less noticeable - Photoshop taking 0.1 seconds to apply a filter as opposed to 0.2 seconds, for example, is not world changing.
So, one would expect the Osborne Effect to be rarer these days.
I use the third party app BXactions (on Play Store) to remap the Galaxy Bixby button to Flashlight (double tap when phone screen off to activate). I paid a couple of quid for the premium version, and for full functionality you need to connect the phone to a PC and execute an EXE file in the BXactions folder.
Actions (start app, record memo, media controls etc) can be assigned to combinations of phone state (locked or unlocked), and short tap, double tap, long press.
An extra button is a convenience. If Samsung allowed the button to be mapped to any function out of the box, it would be a reason to buy a Samsung phone.