* Posts by JamesMcP

38 publicly visible posts • joined 14 Feb 2011

BOFH: Looks like you're writing an email. Fancy telling your colleague to #$%^ off?


Re: Life imitating art

Well...guess we can be sure that the AI read TheRegister's copyrighted materials....

The truth about Dropbox opening up your files to AI – and the loss of trust in tech


The Dropbox CEO is lying

I have Dropbox and my AI features were enabled.

I occasionally use the Android app as everything else is via automatic sync, and I guarantee you there are no "clearly labeled" AI services on either of those software packages.

And it's not just me. "multiple Ars Technica staff who had no knowledge of the Dropbox AI alpha found the setting enabled by default when they checked." https://arstechnica.com/information-technology/2023/12/dropbox-spooks-users-by-sending-data-to-openai-for-ai-search-features/

One door opens, another one closes, and this one kills a mainframe


Six feet deep and rising

I worked at a Fortune 100 healthcare company in the early 2000s that was on the very edge of the midwest in the USA near where they have a very large horse race and some incredibly pure tap water that comes from limestone wells. (That last part will be important) I was not IT, but I was highly dependent on many IT systems and had friends in various IT departments so I heard various bits of this story from six different directions. (Some bits may be modified slightly to protect the guilty)

For reasons no one could rationally explain, a data center wound up in a basement. And in highly predictable fashion, two months before the DC was to be relocated to a new facility, an 8" fire line on the 3rd floor blew out. It took hours to get the building water supply shut off, and by then the DC was literally neck deep in water. By chance (or more likely laziness) all the UPS & power transfer switches were located on the floor above (I think they cannibalized a loading dock) so there was not a horrific >BANG< or >ZZORZTCH<.

Fire marshals evicted everyone from the building but as there had not been a >BANG< or >ZZORZTCH< they let the power stay on. In Theory this was just to catch the last business day's worth of processing and push that over to the DR facility.

Reality & Murphy grabbed beers while they pointed and laughed at Theory.

There was a migration plan, multiple backups, offsite disaster recovery, and all the business continuity stuff you would expect. What the cynic in particular will expect is that none of it had been given a proper test, so it was all FUBAR. The backup system was pointed at the DEV servers, the offsite DR copy had been used for a data anonymization project that had accidentally removed all user identifiers, and a Shakespearean comedy of errors had rendered the rest a dumpster fire.

But for some reason, the servers were still online, their lights twinkling merrily under all that water. A frantic data exodus began. Various network hacks were added to increase bandwidth, cables were plugged in through above-ground windows to pull data out through the building LAN to laptops with external drives, you name a desperate stunt that the Fire marshals would allow and they did it.

The magic is that for four days the lights stayed on in that data center, during which every bit and byte were streamed out of that DC. The new facility was booted up and business functions resumed in less than a calendar week.

AFAIK, no heads rolled. All the senior IT staff had seen the potential for doom and had kept plenty of receipts. From the whispers I heard, various VPs had demanded that things like the anonymization project be done on crash timelines in violation of corporate processes, but had gotten sign-off from the C-suite.

You've just spent $400 on a baby monitor. Now you need a subscription


Re: "the sudden imposition of subscription fees"

Zwave and Zigbee are not IP based. Their attack surface is minimal as you have to be within 100ft (or less) of the home to attack it. And then you need a dev kit or software defined radio.

I have 80 Z devices and only the main controller has an IP address.

If you like to play along with the illusion of privacy, smart devices are a dumb idea


Re: please forgive my lack of knowledge...

No need to hack WPA. Once they are on the network, they could act as a wifi man-in-the-middle.

E.g. you come home, your cell phone looks for SSID "HOME" but your smart fridge is also broadcasting "HOME". It accepts whatever WPA password your phone supplies and then acts as a relay between your phone and the real router.

Given that many smart devices have multiple 2.4GHz radios to support Matter/HomeKit (Bluetooth, WiFi, maybe Thread), it is much more plausible now than before.

This is still black-hat territory, but the more IP devices you have the more likely one of them can be co-opted.

Snowflake's Instacart protestations hint at challenges for poster child of the data cloud


"Data Lakes" are generally piles of files, usually something like JSON or parquet, and are pretty raw. A "Warehouse" is a big database of sanitized and "blessed" data. Typically you use two different sets of tools/apps to access these.

A "Data Lake House" is a single portal to both kinds of data. I.e. queries can hit the files and the tables at the time.

Snowflake explains that Instacart's bills aren't melting – it's called 'optimization'


Growth uber alles

I bet you that what happened was they focused development on expansion to gather more business and only decided to optimize after either a) feature requests slowed down or b) cloud fees started looking really big.

Snowflake and other "serverless" technologies have one advantage here: you can essentially see how much each query costs, which is what you really care about. (On reserved-instance servers you have to do more analysis to figure out why you need a SuperDuperBig node instead of a MerelyBig node.) Then they slogged through which queries cost them the most money per year, meaning frequency x unit cost. Odds are the most expensive queries were ones that were pretty quick to run but highly compute intensive and used all the time.

The thing about optimization is it often feels like a waste because oftentimes one thing gets marginally worse in the process. E.g. "This code runs in 700ms and uses 14 cores, and we run it at least 1M times a day at $0.0001 per use, so this one query is $36,500/yr. We refactored it so it only uses 1 core and costs us $5,000/yr, but it runs twice as long at 1.5s."

Doing that for one query is a nice $ savings and not a nuisance to users, but if you do that to 5 queries that are generally used in concert with each other, you've saved ~$150,000/yr but also turned a 3.5s process into a 7s process, which may wind up driving away customers or lower productivity elsewhere.
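As a sketch of that arithmetic (my round numbers, not real Snowflake pricing):

```python
# Toy query-cost model using the round numbers from the example above.
# All figures are illustrative; real Snowflake billing is more involved.

def annual_cost(runs_per_day: float, cost_per_run: float) -> float:
    """Yearly spend for one query at a given frequency and unit cost."""
    return runs_per_day * cost_per_run * 365

before = annual_cost(1_000_000, 0.0001)  # the 700ms / 14-core version
after = 5_000.0                          # the refactored 1.5s / 1-core version
print(f"saves ${before - after:,.0f}/yr per query")
```

Chain five such queries and the savings multiply, but so does the added latency, which is the tradeoff above.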

IBM, NASA emit actual open source AI model – for grokking Earth satellite images


Good use for AI

This is actually where I think AI is best used: doing something that is too time consuming to ever consider doing by hand, where even humans would be making some "well, that looks like wheat" guesstimates, and where it provides a meaningful benefit that couldn't exist otherwise.

Can this be used for evil? Probably. Pretty sure there's a way to make Nerf do evil. This at least is chock full of positives and you've got to work to make it bad.

Linux has nearly half of the desktop OS Linux market


Re: If ChromeOS is Linux...

Ahhh, slackware.

My first foray into Linux was on a 3-day weekend, using a 1-floppy Slackware router-bootstrap config on a 486 PC with a Linux-blessed 3Com ethernet card connected to a 256k ISDN line. It was a steady string of downloading packages/libraries, compiling, reading man pages, and repeating. 30 hours later I had consumed 2 pizzas and 6L of soda, and had a working GUI with audio.

Very educational process and I think more people should do it just to understand the hardware stack.

Nobody does DR tests to survive lightning striking twice


Correct, but not the general reason why lightning hits the same place twice, since few strikes hit twice in the second or so that the ion channel exists. (Or more accurately, most lightning strikes actually consist of more than one discharge that happens in rapid succession while that channel exists)

The real reason lightning tends to hit the same place more than once is the same reason your finger and the doorknob get more static discharges than anywhere else in your house. Weather patterns, like your shuffle from the bed to the kitchen, tend to accumulate similar charges, and there are only so many physical locations that fall under a given pattern. Some of them are simply closer and of more appropriate materials/shapes. Ceramic coffee cup? Insulators by definition aren't good electrical conductors. Brass doorknob? Excellent.

Similarly, a flat grassy field has a much lower ability to concentrate a charge than a house with a chimney covered in sharp edges. (Sharp edges and points are better charge concentrators.)

So, until some new construction becomes a more attractive lightning rod, or the old lightning rod finally gives way to a couple of gigawatts of power, or climate change alters the weather patterns, lightning will keep striking the same place.

Hence that one house in every town covered in lightning rods.

Mars helicopter went silent for six sols, imperilled Perseverance rover


Was it really "imperative"??

"With the rover on the move, and the helicopter stopped, it became imperative to get Ingenuity moving."

It's a helicopter expected to make 5 flights. Why is its continued operation "imperative"?

I suspect the missing context is "before the rover moved beyond the helicopter's maximum radio range and it would have to be abandoned in place" but there's some implication the rover depends on the helicopter. While I am sure that the helicopter enables the rover to take more efficient routes, the rover was expected to operate sans chopper.

Scientists speak their brains: Please don’t call us boffins


As a "merican, I thought it was explicitly an insult for quite a while. It sounds diminutive and dismissive. Even when I found out it was "egghead" or "braniac", well, I don't really find those terms to be used respectfully either.

Twitter engineer calls out Elon Musk for technical BS in unusual career move


Re: Sooooo....

Musk explicitly stated it took >1000 batched RPC calls for a timeline to load. Frohnhoefer pointed out in a subsequent tweet that the app makes 2-3 requests for a timeline to load, ergo the performance problem (which was never disputed) is not due to the number of calls (RPC or otherwise) but to what the app is doing (e.g. "rarely used features") and/or waiting on (aka network response).

UK facing electricity supply woes after nuclear power stations shut, MPs told


Re: Funny

An interesting factoid from a 1970s study by Oak Ridge National Laboratory is that the low concentration of fissionable materials in coal (~1ppm) turns out to have about the same energy value as the coal itself. (If you didn't know coal had fissionables, surprise! Coal produces the carcinogen radon, because coal contains fissionable materials that were concentrated by plants umpteen million years ago and then further concentrated into coal.)

So if a very hypothetical 10MW-yr power plant uses 1M tons of coal, there would be ~1 ton of nuclear material (1 part per million), which is enough to produce another 10MW-yr. (All of these are round numbers as exemplars of the study's summary, not actual fuel:power ratios.)
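The ppm arithmetic works out like this (a sketch of the round numbers above, not measured values):

```python
# Round-number sketch: ~1 ppm of fissionable material in coal, which the
# ORNL summary (as paraphrased above) credits with roughly the same
# energy value as the coal itself. Illustrative figures only.

PPM_FISSIONABLE = 1  # parts per million by mass, the comment's round number

def fissionable_tons(coal_tons: float) -> float:
    """Tons of fissionable material carried in a given tonnage of coal."""
    return coal_tons * PPM_FISSIONABLE / 1_000_000

print(fissionable_tons(1_000_000))  # 1.0 ton, enough for another ~10MW-yr
```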

One thing that tells us is that you can grind up nuclear waste and mix it with some wood ash and then treat it like you do coal ash. If that horrifies you, maybe you should look into the millions of tons of coal ash hidden across your country. No, it probably doesn't matter what country you live in, most of them are full of coal ash.

Another thing it says is that we could in theory extract the fissionables from all those millions of tons of coal ash (which is actually more concentrated in fissionables than the original coal, once the carbon is burned off) and get as much power as all those coal power plants once produced. And then the nuclear waste would have less radiation than the original coal ash, as we've turned that energy into electricity.

Massive solar project in Tennessee is all about Google


Re: Nuclear power is *not* renewable power

Nuclear is as renewable as solar, as both harvest power from nuclear reactions (fission vs fusion).

As for the pollution of nuclear, it's a piffle compared to the single greatest source of radioactive waste: coal

All coal contains fissile materials, which is why coal country is also the land of radon gas, a breakdown product of uranium and thorium. Various studies have shown that if you could harvest all the uranium and thorium in coal, nuclear reactors would produce as much power as you would get from the coal plant. (i.e. the coal needed to provide 100MW-yr contains enough fissile materials to produce 100MW-yr)

The net result is that all coal power plants produce far more nuclear waste than any nuclear power plant. If the radiation monitors that are installed downwind of nuclear plants were installed downwind of coal plants, they would constantly register elevated radiation levels. All that nuclear waste is "contained" in the coal fly ash, a substance known to be toxic due to its heavy metal content (including fissiles).

If you are fine with coal ash waste, the solution to nuclear is simple: build a wood fire power plant next door to the nuclear plant and then blend the spent nuclear fuel with the wood ash until you have something with the same toxicity/radioactivity as coal ash.

Insteon's vanishing act explained: Smart home biz insolvent, sells off assets


Insteon devices are not IP based other than the hub

Only the hub is IP enabled and it can still be leveraged locally without the cloud.

The real issue here is that it is a proprietary technology. Replacements are a long term problem but the lack of bridge devices are a critical failure point. Insteon is a mix of powerline tech (x10 compatible) and 900Mhz radios. If you have 80 devices and lose your bridge....you have 80 useless devices.

Oh hello. Haven't heard much from you lately: Linux veteran Slackware rides again with a beta of version 15


Slackware on a floppy plus ISDN....

Back in the mid-90s, I set up my first Linux box. I'd been using Linux in the labs or at work since '91 so I wasn't a total neophyte, but I'd never done an install.

I started with a 1.44MB floppy disk with a "router" build of Slackware. I used it to boot my 75MHz AMD PC and was able to connect to the LAN and (via ISDN) the internet. It had lynx and an FTP client.

The 128k ISDN (bonded channels!) and using two sessions let me download in one session while reading man pages and making my next action plan in the other. About the time I figured out what I should do, a package had downloaded. Tar, gzip, gcc, sources, libraries, binaries: I compiled and recompiled so many times.

Eight hours, hundreds of man pages, a large pizza and 4L of carbonated sugar water later, I had a recompiled kernel running X Windows at my monitor's native resolution & refresh rate (score!), with the SoundBlaster (SB16 maybe...) able to play Torvalds' "I pronounce Linux..." file, and a web browser (Netscape 1.0b I think).

I still count that as one of my "geek" milestones.

Dear makers of smart home things. Yeah, you with that bright idea of an IoT Candle. Here's an SDK from Amazon


Re: Harmonisation of Circles of Hell

Zwave/Zigbee devices don't. They don't speak IP.


Zwave and Zigbee are not IOT

While they are often lumped into "IoT", the Z devices don't use IP or TCP, so it's really hard for those Ts to be on the I. There are implementations of "Zigbee over IP" (aka dotdot over IP) and of ZWave over IP; to date neither has seen any consumer adoption, which puts them right next to Thread, which AFAIK is only available on Nest/Google branded devices.

CHIP may change that for Zigbee, as dotdot is expected to be part of CHIP. Dotdot is also going to run on top of Thread, at which point Thread is just a networking stack and ceases to be a full automation standard. It will also be useless, as Thread devices without relays won't have any more battery life or range than Bluetooth.

The important parts of the Z protocols are the device profiles and the device enrollment. This is the "dotdot" component of Zigbee. It is what allows devices from different manufacturers to work without needing a custom driver/handler. Both of the Zs have fairly comprehensive arrays of profiles already. Both standards also allow manufacturers to add new commands, assuming they aren't just reinventing a wheel.

Part of the reason Zwave has a following is all Zwave devices are certified to follow the standard so for the most part, they are as interchangeable as USB mice.

Part of the reason Zwave hasn't "won" is that certification costs money so devices are more expensive.

Part of the reason Zigbee has a following is that certification is optional and anyone can make anything, which results in very cheap battery powered bits from China.

Part of the reason Zigbee hasn't "won" is that there isn't just one Zigbee, there are multiple flavors, so it's entirely possible to buy Hue bulbs (Zigbee LightLink) that won't play nicely with your Xiaomi sensors (Zigbee HA).

The IoT wars are over, maybe? Amazon, Apple, Google give up on smart-home domination dreams, agree to develop common standards


Re: Connect directly to the Internet?

That's more or less Zigbee, an open standard with enrollment, encryption, commands, device profiles, etc. that is extensible and customizable.

And to a lesser extent Zwave, as it supports vendors extending new commands and parameters, although it's more managed, like USB & Bluetooth.

So this exists, but the big names have avoided it because it a) requires a second radio, adding cost, b) requires following someone else's spec, and c) means they can't control the products.

Going to a wifi-based spec (aka Zigbee over IP) means these companies don't need that second radio. It means the bulk of validation becomes standardized, which can lower certification costs and hassles (which I'm sure is contributing to HomeKit's woes), and they can still add an extra chunk of code that operates at the TCP/IP layer to do non-automation functionality (i.e. voice assistant-y audio), as well as potentially act as a "gatekeeper" so that HomeKit won't bother controlling anything that doesn't also have an iHome authenticator key.

I don't want to go on the cart! Windows 10 Mobile hauls itself from the grave one last time


Sadly, it was a good phone UI, just crap everywhere else.

I picked up an incredibly cheap WinPhone a couple years back (US$25 maybe) just to play with it on wifi. It was surprisingly snappy, even more so for having around 1GB of RAM and a Snapdragon 200 or so. The UI was easy to see and interact with, especially on the smallish screen. I could have happily used a WinPhone with a better camera as my daily driver if I hadn't already seen the writing on the wall. I had already ridden the good ship WebOS all the way to the breakers, and wasn't going to do it twice.

Sadly, MS had burned all potential bridges and, by applying a phone UI to PCs, servers and laptops, had effectively salted the earth. No consumer wanted anything to do with Windows on the one form factor where it actually made sense. (Ok, tablets were good with Metro, but the tablets didn't have enough horsepower or battery life, so even if the UI made sense, the experience was blech.)

Multiple form factors require multiple interaction modes. Until someone figures out a highly flexible, scalable UI framework and can strong-arm the app devs into using it, we'll remain with balkanization between form factors.

Den Automation raised millions to 'reinvent' the light switch. Now it's lights out for startup


Go with Zwave, Zigbee or Insteon

Use a proper Home Automation (HA) platform and you'll get what you want. You will have your choice of control software/hardware. I use HomeSeer (they've been around 20yrs and counting) but Universal Devices and Hubitat are also good choices for pre-built, internet-aware but NOT dependent controllers. Or you can build your own hardware, buy control software like Homeseer, CQC or Indigo or you can go full open-source with HomeAssistant, OpenHab and NodeRed.

I prefer zwave as Zigbee is going to have mostly incompatible flavors for the next year or two (Zigbee LL, Zigbee HA) until Zigbee 3.0 is shipping in volume.

Insteon is a mix of radio+powerline (x10-esque). It's single source, so basically the Apple of HA, but generally positively regarded and the powerline signaling lets it work better in structures with stone or brick interior walls.

If you absolutely must use wifi devices, get ones that support MQTT. At least then you cut out the middlemen on your potentially insecure, always-on, IP-addressable micro-systems that are able to hear all wifi communications in your home.

Osram's Lightify smart bulbs blow a security fuse – isn't anything code audited anymore?


The Osram rep is lying when they say that flaws in zigbee protocols are "unfortunately not in Osram's area of influence."

Aside from the fact that Zigbee can be heavily modified by Osram, way back in 2007 the DoE published a paper describing how to secure a Zigbee network from replay attacks.

(links below)

They could have used the secure zigbee settings but just like their wifi management, they screwed it up.



Apple Watch craze over before it started: Wrist-puter drags market screaming off a cliff


Re: @Bob Dole ... Could Also Be...

My Pebble Steel & Time Steel watches last about 5 days per charge with pretty significant usage. My dad's classic Pebble (he mainly has it to keep from forgetting his phone or missing a phone call) gets more like a week of usage. If I run the new one down to the edge of power, it switches to a simple watch display that lasts another day or so.

I like the fact I can turn up and down my thermostat without having to get out of bed.

Oz uni in right royal 'indigenous' lingo rumpus


So the British refer to the 1940s Germans as "settlers" rather than "invaders"?

And the Irish considered the English to be settlers and not "invaders"?

And the Celts considered the Romans to be settlers and not "invaders"?

And the Cherokee considered the europeans to be settlers and not "invaders"?

Human history is chock full of invasions. Let's call a spade a spade. The invaders like to call themselves "settlers" and the invadees refer to the invaders with invectives.

White-washing* history is comparable to political correctness, except white-washing seeks to actively reframe the situation in a fashion that makes the historical victors morally and ethically pure. Political correctness, while irritating, generally seeks to avoid using emotionally charged language.

*It's a kind of paint, not a skin-tone-based pejorative.

Ex-Palm CEO Rubinstein wishes HP sale never happened


Re: That was a genuine waste

The sad thing is that it was less that there were performance issues with the software than that there were too many features for the hardware.

The Pre had the same internals as an iPhone 3GS and came out a week or so earlier. But the 3GS didn't have full multitasking. Heck, the iPhone still doesn't have full multitasking and won't until iOS 7 is released. The iPhone didn't have a customizable GUI framework. The iPhone didn't have the integrated address/contacts/messaging of Synergy.

And all those things the iPhone did NOT have made the iPhone that much faster for what it did do.

Android had the multitasking but not the synergy component and had the advantage of a half dozen manufacturers tweaking hardware and software, some of which would flow upstream to AOSP. Which meant Android was able to get better faster than WebOS.

The real performance drag of WebOS was that the GUI was web-based HTML & JavaScript, which wasn't that optimized 4 years ago. Of course the reason it was "Web"OS was that the original GUI was a total trainwreck that had to be rewritten in a do-or-die effort, and the devs went with HTML/JavaScript because it let them use off-the-shelf open source code for the graphics layer and toolkit, was fast to build, fairly pretty, and easily updated.

I loved my Pre, and software tweaks kept it livable. The Pre2 was a much more enjoyable experience, with no more screen lag or stutter than Android. Adding WebOS2 to a Pre- did give a noticeable lift in speed.

I left WebOS because it was abandonware. Without any real support or upgrade path I decided it was time to move on intentionally instead of waiting for my devices to break. With no good physical-keyboard Androids, I went to the Note 2 and, to unify my platform, am running CM9 on my HP Touchpad. Ironically, many many apps are better/prettier/easier to use on WebOS (Zite on WebOS is awesome, as is Weatherbug) and the usability of WebOS ALMOST makes up for all the unsupported/unoptimized apps (like Adobe DRM, soooo sloooowwww).

SpaceX satellite burns up on re-entry after Falcon FAIL


Reentry: the action of reentering the earth's atmosphere after travel in space

As defined by Webster's website & dictionary.com.

Which is logical. Even if the atmosphere was "assembled" around the solid matter on the Earth, that matter was IN the atmosphere. Then it left. Now it's coming back.

HP boss Whitman: 'We have to offer a smartphone'


Blame the board of directors

HP under Mark Hurd had, for the most part, turned around after the Fiorina era. Hurd was, apparently, a little sketchy personally but not enough to justify firing. The board, however, didn't like his direction and brought in (ugh) Apotheker, and it all went pear shaped.

Hurd was smart enough to say that Palm/WebOS was a long-haul proposition and was not expected to set the market on fire. So when Apotheker killed Palm/WebOS because it didn't set the market on fire....face-palm.

I think Hurd's ultimate goal was beating RIM and eating the corporate space, rather than trying to defeat Apple & Android. Tie WebOS into the HP management software suite and you could kit out a company from top to bottom with HP tech (cellphone, laptop, PC, server, printer, SAN, LAN, management, etc).

I'm a WebOS fanboi so if HP did decide to release a Pre4 (or just mass produce more Pre3s) I'd buy it.

However I expect that won't happen and that I'll migrate to Android 5 when my Pre2 finally dies. At least by then Mattias Duarte will have added some more WebOS-ification to Android. Maybe Android'll get card-type app management.....

HP hands in-house Android code to TouchPad tablet hackers


HP had Android printer/tablets prior to Palm purchase

They didn't sell well, but I'd been eyeing it as a possibly cost effective network printer & tablet, assuming the tablet was rootable.

It's no surprise that the team responsible grabbed a pile of touchpads and tested their software stack on it. They'd kind of be stupid not to. The surprise is that their test software package got flashed onto shipped hardware.

Why there's real hope for webOS - if HP is committed


Drivers, drivers, drivers

HP needs, desperately, to get a library of drivers built for the majority of components. Personally, I'd target the WinPhone hardware specs first, since most manufacturers have one or two WinPhone devices they could use to experiment with WebOS and that haven't been flying off of shelves.

If drivers for new platforms like nVidia's Tegra2 aren't forthcoming, we'll be stuck with devices based on the current Qualcomm and TI chipset families. That will exile WebOS to the Coby and Pantech devices.

HP should also get some Intel involvement. Intel doesn't have an OS for their MID devices. Meego went nowhere, and WebOS at least has thousands of apps, developers, and several hundred thousand users. If Intel's reference designs ran WebOS, it's possible a couple of the Cobys and Pantechs of the world would be willing to produce that reference model, which would be at least an upper-middle tier product.

HP readies fresh WebOS update

Thumb Up

They have to release updates if they want to sell....

If you're buying an OS to distribute, you also want to hire the programming team. If HP is going to keep those programmers on the books, they may as well put out software updates that at least keep the technological infrastructure maintained.

If the buyer winds up buying/licensing the Palm patents then HP can lay off the programmers.

Besides, all signs point to HP wanting to standardize on WebOS for appliance-type OS uses (aka printers). I'm guessing their printer group has already migrated all their unreleased, high-end products to WebOS. Lord knows the Android printer didn't fly off the shelves.


There's a patch for that

It's a little odd having the left & right arrows on the upper row but it makes the text editing much easier. All in all, I'd say that the WebOS keyboard layout is the best one I've encountered.


The thread it links also gives some information on how to tweak your own keyboard variants.

Bill Gates discusses nuclear development deal with China


It's less waste radioactivity per MW-yr than coal

Hard to believe but true. From memory, every 1 million tons of coal has an effective power output of ~35MW-yr and contains a couple of tons of fuel-worthy radioactives (this equates to a couple of parts per million). The effective power output of those ~3 tons of thorium/uranium is about 37MW-yr.

See that? 35 MW-yr of effective power from the coal, 37 MW-yr of effective nuclear power from the "junk" radioactives in the coal. There's more effective nuclear energy in coal than chemical energy. So coal produces MORE radioactivity in its waste per MW-yr generated than nuclear. Lots more, actually, since the coal radioactives haven't been "burned" for power: at least 37 MW-yr of energy (aka radioactivity) was extracted from the 3 tons of fuel and converted into electricity, while the materials left in the coal ash have only naturally decayed.
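Put as bare arithmetic (the from-memory figures above, not measurements):

```python
# The comparison above in two constants: chemical energy from burning
# 1M tons of coal vs fission energy latent in the ~3 tons of
# radioactives that coal carries. Both are rough recollections, not data.

CHEMICAL_MW_YR = 35.0  # from burning 1M tons of coal
FISSION_MW_YR = 37.0   # from the ~3 tons of fissionables in that coal

def fission_to_chemical_ratio() -> float:
    """Latent fission energy per unit of chemical energy in the coal."""
    return FISSION_MW_YR / CHEMICAL_MW_YR

assert fission_to_chemical_ratio() > 1.0  # more fission energy than chemical
```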

The difference is concentration. If you took the waste from the hypothetical 3 tons of fuel above and tilled it into 500 tons of wood ash, you'd wind up with something very much like the coal ash from the hypothetical 1 million tons of coal. And yes, coal ash is much more radioactive than coal, which is significantly more radioactive than dirt.

So we don't have a nuclear waste surplus, we have a wood ash deficit.

Nervous Samsung seeks Android Plan F. Or G, H ....


Time to license WebOS and some patents

I think Samsung needs to call HP and get a price on WebOS and the Palm Smartphone Patent.

Why bother developing Bada when there's a completely functional mobile OS up for sale that comes with its own patent-law nuclear weapon and at least some semblance of an ecosystem?

Coders breathe Android into dead HP fondleslab


Shouldn't be that hard

WebOS devices can easily have their ROMs reloaded via the WebOS Doctor tool, which is an official Palm app. (HP having rescinded their ownership) Plus WebOS is Linux. WebOS QuickInstall already has instructions for loading Ubuntu onto WebOS devices. As long as you can get Android drivers for the components, all is good.

WebOS 3.0 was your typical x.0 release: slow and desperately needed the 3.02 update. 3.02 is pretty snappy but it is boosted by adding the patches from WebOS QuickInstall that disable the excessive logging. I wasn't thrilled with WebOS 3.0 until I ran the update and applied the patches. Now it's buttery smooth. (sigh)

Not sure why people crap on the hardware. The plastic case isn't spectacular, but the screen is on par with an iPad, it has a GB of RAM, and it's equipped with the same dual-core 1.2GHz CPU used in an HTC Evo 3D or Sensation. All in all, it's CPU-comparable to an iPad2.

Rural white space wireless standard signed off


Cost effective

It isn't cost effective to run fiber in rural areas. I live in the "suburb" of a rural town and my street has ~10 houses per mile. Head another mile down the road and the population density drops.

As a result, calling this "indifferent" is just nuts when compared to areas that are currently served by dial-up or satellite. I had satellite internet, and while I got DSL-like bandwidth on large file downloads, the latency (1-3 seconds) rendered it barely faster than dial-up for general usage. I finally got DSL when lightning toasted the area's switch and the replacement had just enough signal to get me 768Kb.

Depending on collision avoidance, channel splitting, and the over-subscription factor, each tower could offer DSL-like performance to a couple hundred subscribers. Assuming each subscriber is a family/business, that serves a few thousand people. The tower is likely half the up front cost and less than 10% the long-term maintenance cost of fiber.
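A hedged sketch of that capacity guess; the tower throughput, per-subscriber rate, and oversubscription factor are all my assumptions for illustration:

```python
# Back-of-the-envelope subscriber count for one shared white-space tower.
# 20 Mbit/s shared, 1.5 Mbit/s "DSL-like" service, and 15:1
# oversubscription are assumed numbers, not figures from the standard.

def supported_subscribers(tower_mbps: float, per_sub_mbps: float,
                          oversubscription: float) -> int:
    """Subscribers a shared link can carry at a given oversell ratio."""
    return round(tower_mbps / per_sub_mbps * oversubscription)

print(supported_subscribers(20, 1.5, 15))  # 200, i.e. "a couple hundred"
```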

HP launches webOS products, but no ecosystem


What does the author think "a developer ecosystem" is?

AFAIK, it consists of development tools, a channel that can be monetized, and a userbase.

HP has the first two. WebOS development tools are reputed to be pretty nice for both the "light" HTML5 apps and the native apps, and the App Catalog is now international.

Devices & users in the market are somewhat thin; active WebOS devices probably number in the low tens of millions. However, if HP can push out enterprise sales of phone + tablet + laptop/desktop + servers, those volumes will increase. Plus HP has plans to make WebOS available on PCs. I don't expect that to be full-on Windows-free devices. Remember that most WebOS apps are HTML5-based, so they can be run in HTML5 browsers with a plug-in or two.

So imagine HP bundling a Chrome-based WebOS environment with every one of their PCs, along with a cloud-sync app to make sure the PC, phone and tablet have a single data set. Even 3D games could be supported with the emulator from the SDK. Make it a free download for users without HP PCs.

If a WebOS package like that was installed on 25% of HP PCs shipped worldwide, that would be an installed userbase larger than OS X and iPad combined. Any phone/tablet sales would be icing on the cake.


Navit for WebOS - free turn by turn

Navit is a 3rd party, open source, standalone GPS app with turn-by-turn navigation. It uses the OpenStreetMap (OSM) data sources, which include Europe. Matter of fact, it's irritatingly European for us 'Mericans because there's no non-metric option.

It's a homebrew app, available from www.webos-internals.org for free.

First-time homebrew users will need to download the desktop app "WebOS Quick Install" from the same site, and then install PreWare. PreWare is an over-the-air installer, aka the "homebrew app catalog" app for your WebOS device. Once you have PreWare on your phone you will not need to connect your device to a PC to install apps again (barring a major software update that requires a new version of PreWare).

All the apps, patches, services, and tools on PreWare are free. There are some ~1,000 apps and another ~400 patches that tweak the behavior of WebOS.

Oh and if you're miffed that it's homebrew, that's because the spoken directions require a software service that can hijack the audio feed. That's not in any of the existing WebOS 1.4.5 APIs so it can't go in the official app catalog. WebOS 2 or 3 may change that.