Re: $2700 per kilo?
Wasn't it an Asimov story that had a side reference to a soft drinks company paying to avoid another spray painting the moon with its logo?
Very much so. Have used and discarded no end of 'ergonomic' keyboards. Some years ago my previous favourite died (an MS ergonomic one) and the replacement (same item just 5 years later) was much worse in both quality and tactile feel. I moved to a basic one with Cherry MX Brown switches. Best decision ever, although I would look for quieter switches next time - simply because typing on speakerphone is not possible. Responsive and tactile, typing is much easier on the hands.
I'm sort of trapped in a never-ending transfer to Plusnet (they took my phone number ahead of schedule, no broadband, and I can't cancel and keep the phone number), but an aside was that I filled in all the boxes that said I would use my own email. They then set up an email in my name and have ignored requests to delete it. So no risk of anyone using that to impersonate me then ....
The value is probably in predictive analysis - ie trend data to optimise pre-cooling and moving to virtual predictive 'simmerstat' rather than thermostat approach combined with any optimal combination of cooling techniques (free cooling, thermal mass, solar, ice making etc).
'decently tuned PID' - getting the decent tuning is the hard bit, and long an AI application in its own right. You also have to manage a network of the things, which adds noise and complex interactions. That takes significant investment from domain engineers and is rarely worth it in complex systems (like commercial buildings). You also have to manage context drift on those tuned PID controllers.
There are benefits to be had from explicitly doing the predictive analysis (in some senses a PID controller is a simplistic implicit predictor) and a separate constraint based optimiser. That can be done at the system level and with set point adjustment thus be reasonably robust with badly tuned local controllers.
The interesting bit is whether we have reached the point that this can be done with general tech and general IT skills, and thus save energy, rather than it being an engineering problem every time. It's long been possible to improve heating system efficiency and add simple predictive controllers - it's just rarely been a sustainable use of the necessary skills.
In data centres, of course, getting smart whilst being able to rule out set point overshoot is probably worthy of study in its own right, as staying within the thermal design parameters of the kit is rather important ...
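Since the 'decently tuned PID' point keeps coming up, a toy sketch may help. This is a minimal discrete PID loop with a crude anti-windup clamp driving a made-up first-order thermal plant; all gains and plant constants here are illustrative assumptions, not a tuned set:

```python
def simulate_pid(kp=2.0, ki=0.1, kd=0.5, setpoint=21.0, steps=600, dt=1.0):
    """Toy PID loop heating a single room; returns the final temperature."""
    temp = 15.0              # room temperature, deg C
    outside = 10.0           # exterior temperature the room loses heat to
    integral = 0.0
    prev_error = setpoint - temp
    for _ in range(steps):
        error = setpoint - temp
        # crude anti-windup: clamp the integral so actuator saturation
        # during warm-up doesn't cause a large overshoot
        integral = max(-5.0, min(5.0, integral + error * dt))
        derivative = (error - prev_error) / dt
        # actuator output clamped to 0..1 (off..full heating)
        power = max(0.0, min(1.0, kp * error + ki * integral + kd * derivative))
        prev_error = error
        # crude first-order plant: heat input minus losses to outside
        temp += dt * (0.5 * power - 0.01 * (temp - outside))
    return temp
```

Even in this toy the untuned gains only behave because the plant is benign; swap in a noisier plant, or several interacting zones, and the tuning effort (and context drift) described above becomes obvious.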
"...real time integration of both car, bike, pedestrian and street level systems..."
"What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them."
I'm saying I would have my doubts they can safely (safely enough?) integrate into a mixed environment without such systems and hence incremental roll out of open mixed systems is possibly harder than the public debate might suggest.
'I cannot see any benefit for "smart" devices in your home'
Personally I'm only temporarily abled and practicing now for when I can't reach the radio to turn over from BBC Radio 2. I'm not lazy I'm thinking ahead.
More seriously bringing down the cost by commoditizing this technology for assisted living is fantastic. As soon as my Mum could remember it was called Alexa she wanted one for the radio/music options alone. Try handling discs and buttons and screens with hand morphology issues.
Agreed, a wonderful device let down by the integration with personal media - worse, the upload crashes on bulk uploads and then has a stuck queue, and the support is basically worthless.
That said my personal equation was an echo dot attached to a generally unused hifi system as an alternative to an audio chromecast - been fabulous with some unexpected benefits like a hands free kitchen timer!
If you're already a Prime (music) user the Echo Dot is very compelling, although you need to know the names of albums or playlists to get the best from it (ie 'play 50 Classic 80s' works, 'play 80s music' does not). It also works as a platform for geeks, and transferring basic on/off messages to a hub (a Raspberry Pi faking WeMo via fauxmo) is pretty simple. Finer control looks okay via cloud services but I haven't had the time yet.
@commswonk But that's exactly the intent. There was little or no risk to the state-funded BBC in funding novel content. What is a concern is if it then pays a premium price for content and on-screen talent that could be funded by the private sector; if anything too much of the output is risk-free long-run programming, whereas it makes such a thing about needing a unique funding model to make novel content.
I started using them a while back based on exactly that kind of purchase (add some servos, resistors, caps and LEDs to your list). Demo'd a set of traffic lights and then got into robotics (asked my daughter what to make next - silly me!)
Arduinos are great for demos and robotics where no header is required, but very much hobby/maker and low-end bespoke for basic microelectronics. If you don't need solderless headers the various smaller boards are cheap enough to solder in and keep for permanent installations, especially robots and the like where wires may pull loose. Haven't played with a 101; I tend to use Nanos and Pro Minis now.
A very different experience but great fun to learn some new perspectives. Glue gun/super glue gel, lollipop sticks, a hand riveter, some motors and wheels, and it's *much* cheaper than Lego!
Some very fun educational material growing up around now using scratch style interfaces on top of the IDE too.
The sensible thing to do in that situation is move the portable aircon to fan only and blow air so that you exhaust the room air and bring in the cooler evening and night-time air.
Air changes and cooling the thermal mass are key to performance, depending on your building structure and insulation levels, but you should be able to change the room air to the external temperature pretty fast and slowly bring the wall temperature down (sometimes called the pseudo constant - but that would irritate too many pedants. And don't get me started on 'coolth' units!)
In buildings with decent insulated thermal mass this is used for pre-cooling too - bring the core temperature a few degrees below and use it as a heat sink in the day.
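The fast-air/slow-mass split above can be sketched as a toy two-node model; every coefficient below is a made-up illustrative value, not building physics data. Ventilation drags the room air quickly toward the outside temperature, while the wall mass cools slowly toward the room air:

```python
def night_purge(hours=8.0, dt=0.1, air=26.0, wall=26.0, outside=16.0,
                ach=5.0, k_wall=1.0, wall_tau=24.0):
    """Toy night-purge model.

    ach: air changes per hour (ventilation rate)
    k_wall: coupling of room air to the wall mass (per hour, assumed)
    wall_tau: wall-mass time constant in hours (assumed)
    Returns (air_temp, wall_temp) after the purge.
    """
    for _ in range(round(hours / dt)):
        # ventilation drives room air quickly toward outside temperature,
        # while the still-warm wall mass pulls it back up
        air += dt * (ach * (outside - air) + k_wall * (wall - air))
        # the wall (thermal mass) cools slowly toward the room air
        wall += dt * (air - wall) / wall_tau
    return air, wall
```

On these assumed numbers an overnight purge gets the room air within a degree or two of outside quite quickly, while the wall mass only sheds a few degrees - which is exactly why pre-cooling the core a few degrees below set point gives you a usable heat sink the next day.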
Free air cooling has become much more widely used these days (including data centres).
It is not correct that you should never bring outside air in, unless other air quality factors - dust, humidity etc - are at play. Even then outside air is brought in, just in a carefully controlled way.
Whilst I might buy that car sales could fall in a near-saturated market like the US, globally the market is probably nowhere near saturated. As such, the increase in capital efficiency allowed means the overall efficiency of cars could well increase, and therefore use and demand will increase.
You'll note that we all use more ICT rather than less despite the massive increase in efficiencies over the years.
Can we have a debate between Messrs Chirgwin and Worstall?
Funny, isn't it, how high-risk disruptive innovation seems to take high levels of disposable income? Who'd have thought. Then that other unsung hero of competition policy comes into play: incremental innovation and its ill-regarded sibling, cross-licensing. Fair to say it's not just the invisible hand of capitalism; downstream it's enabled by competition policy and good intellectual property rights.
Seconded - I suspect the major eco benefit of LED will be the very long life - anyone seen an LCA?
Hopefully that will be as true when/if we get a decent 100W equivalent. CFLs, especially of the expensive dimmable variety, and halogens have very poor life expectancy - switch cycle limits seem the culprit.
Just bought some 60W equivalents for clusters and been very impressed - if they even come halfway close to the expected lifetime I'll be happy - very high ceilings!
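A back-of-envelope running-cost comparison shows why the long life matters. Every figure below (prices, wattages, lifetimes, electricity tariff) is an assumption for illustration, not measured data:

```python
def lifetime_cost(watts, lamp_price, lamp_life_hrs, hours, kwh_price=0.15):
    """Total cost of lighting one fitting for `hours` of use."""
    lamps = -(-hours // lamp_life_hrs)          # ceiling division: lamps consumed
    energy_cost = watts / 1000.0 * hours * kwh_price
    return lamps * lamp_price + energy_cost

HOURS = 15000  # assumed LED rated life, used as the comparison horizon
led = lifetime_cost(9, 5.0, 15000, HOURS)      # ~9W 60W-equivalent LED, assumed price 5.00
halogen = lifetime_cost(42, 2.0, 2000, HOURS)  # 42W halogen, assumed price 2.00, 2000h life
```

On those assumptions the halogen costs several times as much over the same hours - mostly electricity, but also seven or so replacement lamps, which is the part that really stings with very high ceilings.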
If you are teaching physical computing and have some way of programming the microcontroller that's sort of true enough. However that then means multiple computers. As an all in one the Pi is impressive and this makes tinkering on a connected pi much easier and means you only need give access to one device.
Also more devices increase complexity and connections, and more time is lost handling that in a classroom.
They would certainly not be OSI compliant - see http://opensource.org/osd, clauses 5 & 6.
However restricting use of open source has been done
It cannot be done purely in open source software to my knowledge, but can be done in combination with other software, hardware or service contracts. It can also be perfectly reasonable to do so if deviating from the restriction would create a liability somewhere else in the supply chain, or cause issues with the procured services, or for example invalidate audit and sampling processes.
The chilli in the cricket club on the grounds used to be quite good too. Wonder if it's still there. My abiding memory of working in the Turing building was that the entire building resonated when Concorde flew over. At a frequency just below a telephone microphone's ability to pick up. Led to some peculiar phone conversations. Sadly I believe the building, along with Concorde, is no more.
All good points if the intent was an Apple like premium consumer product play. However if the intent is to take whichever elements of the Google Glass development work are compelling into new platform/services (ie advertising and data capture) it does suggest that it can be done at commodity like pricing rather than premium. In turn it proves that Glass derivations are viable as a platform play.
Yes, many European countries are based on this system - also known as Napoleonic codes. However this is often balanced by allowing the judiciary greater interpretative and 'common sense' powers in more junior courts. By comparison, our more permissive common law allows little flexibility below the High Court to do more than follow the letter of the law.
The policy, last I looked, was that the IPR goes where it could be 'best exploited' (dangerously not defined as either exploited for the economic good or for minimising public expenditure and TCO), and generally that will be the producer, simply as Gov has no IPR licensing practice nor tracking process to detect infringement.
However that was implemented by standard clauses that left the default as the IPR being owned by the procurer, often with little or no distinction as to the types of IPR (existing, foreground, residual etc). It's still knocking around as the default in many departments.
In part this happened as Gov made some simple mistakes, like procuring reports that it intended to publish whilst the provider thought it was creating a private consultation report which it would resell - and when they noticed their work being sent out for free they were rather miffed.
It is not really about open source, it's about a making a choice of standard. ODF can be implemented perfectly well in both open source and proprietary - a defining characteristic of an open standard. Indeed MS document their support for ODF, http://office.microsoft.com/en-gb/word-help/differences-between-the-opendocument-text-odt-format-and-the-word-docx-format-HA010355788.aspx .
Training and document conversion is a bit of a red herring. Whatever they standardise on they will have some transformation costs and support for any key exceptions - equally they would have to train the average user to understand the different file types, associations and extensions if they did not choose. Document conversion is probably overstated; most documents are ephemeral or new versions of old, and that can be handled relatively easily.
Rather more of an issue is the weaning off of management by spreadsheet and other uses of extended (standardised and non standardised) functionality - both to improve fidelity and interoperability but also to make mobile solutions easier.
Still missing the point really. Tax codes are complex - our simple one* is a mere 5000** pages.
A multinational will have to comply with every such tax code, and no guarantee of consistency. Blaming companies for contrived tax affairs is a bit like blaming the cup cake for its shape.
1) Simple and consistent codes.
2) Profits should be taxed where economic activities deriving the profits are performed and where value is created. Now apply that to a software R&D lab (that's at least tax deductible anywhere in the world, often super tax deductible in the UK - not least as countries would rather be a locus for highly qualified, highly paid people who you know pay taxes) . The software is created in one place and profits and economic benefits of use elsewhere on either sale or use.
3) Ireland is not in the G20 and has a different approach to taxation - attract employment, tax employees.
Solve 1) , 2) and 3) simultaneously.
Anything that is not legally minimising taxation is by definition voluntary - and companies do a lot of voluntary and charity work and support around the world. CSR, 1%'ers, whatever. Giving more money to taxation authorities is of course one of the worst ways to effectively achieve the desired objectives***. Nothing stops employees and shareholders doing the same, and again many do give to charities and more.
If you tax at a 'moral' rate then you have all sorts of non objective analysis on tax matters that will again by definition be discriminatory and imposing unpredictable judgements, further disincentivising investment.
Of course it can't be the G20 finance ministers fault, oh no must be those nasty companies that er provide most of the employment and tax revenues any way ...
*it must be, we have an Office of Tax Simplification and they say it is.
** after amending, redefining and dedupe, actually you may have to comply with up to 17,000 if you have any historical compliance.
*** according to Keynes
Or you could maximise profit and return to shareholders and they can make their own judgement on how much extra tax they pay? On top of you know their dividends and realised equity gains being taxed.
Annoying the tax authorities should not affect the behaviour and judgment of an objective civil servant should it?
An interesting issue here is why go to NESTA in the first place - were any 'real' VCs approached, and what did they say?
Whilst NESTA has a track record of talking up its endowment and not much about its actual investments, they were (at least then, before the conversion to charity; it may have changed but I still see them as too overtly following the political zeitgeist) always going to have the political and bureaucratic issues of a public sector body. The UK was absorbing about half of all the VC funds in the EC at the time, iirc.
A real danger of talking up state investment is the drowning out of other approaches. The state is almost by definition a funder of last resort (unless it is funding for its own consumption, in which case it's a supply-side intervention): if there is a business case, a business should be doing it. If it is too long term, too radical, too subject to regulatory risk, or the economic dividend is too dilute, then there are good reasons for state involvement - but this does not seem to be in that category.
Possibly worth pointing out at an early stage that the IP intervention is a *lot* tighter than any public good. Even in the weakest patent system it must at least be novel (so Newton is way out!). More generally, equations and mathematics are not patentable anywhere; only an application derived from them could be.
In the better systems* it's only a novel technical effect, leaving most of the pragmatic debate on how to set the threshold for patentability (in particular, of course, where the technical effect is delivered via an innovation in software), how to ensure a proper review, and proper disclosure.
*so not the ones who were once caught defining prior art as *only* anything already patented in their own system ....
Would be interesting to know if there is a decent analysis of the various patent systems around the world and the effect their differences have on both their domestic and overseas markets.