31cm?
Is that 31cm or 30.48cm....
A new satellite set to fly in 2014 will offer the chance to spot objects just 31 centimetres across … from SPAAAAAACE. WorldView-3 will be thrust 617 kilometres into the heavens atop an Atlas V rocket and will boast several different sensors, the better to provide images to customers including the US National Oceanic and …
I am just testing a new compute server for processing remote sensing images (64 cores, 512 GB of RAM). My first runs already use up to 480 GB of that, and it chugs through a detailed analysis of a 3.5-gigapixel image in just over two minutes (previously nearly an hour). The new images will be 2.2x larger for the same area covered.
I WANT A TERABYTE of RAM!!!
FFS - get yourself a high end graphics card! 256 cores + 1TB memory (or more) + proper parallel coding using CUDA. £300 will get you the dogs-danglies in the commercial world. If you want mil-aero specification - GE-IP have them for reasonable prices.
Image processing - visible, infra-red, microwave, radar, or all combined is exactly what CUDA on these high end graphics cards was designed for.
"FFS - get yourself a high end graphics card! 256 cores + 1TB memory (or more) + proper parallel coding using CUDA. £300 will get you the dogs-danglies in the commercial world. If you want mil-aero specification - GE-IP have them for reasonable prices.
Image processing - visible, infra-red, microwave, radar, or all combined is exactly what CUDA on these high end graphics cards was designed for."
I assume you mean 1GB not 1TB of memory. If you can supply me a 1TB memory graphics card for £300, I am happy to give you a £300 tip :-). My images start at about 1 GB, and end at 1.5 TB (for now). So they will not fit in my graphics card (with the additional data needed during processing).
Regarding image processing and CUDA: for quite a few image processing problems you are right, but in this case you are wrong. The connected filters we use have a strictly data-driven processing order, which does not map well onto CUDA. Indeed, because the outcome for every pixel potentially depends on every other pixel, parallelization itself is hard (the first such method can be found here (warning, PDF)). On 64 cores I am now getting about a 30x speed-up. I am trying to adapt this to CUDA (or OpenCL) in collaboration with CSIRO in Sydney, Australia, but we have not got it running yet.
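To illustrate the data-driven dependence (this is just a toy sketch, not the poster's actual filter): in a binary area opening, a pixel is kept or dropped depending on the size of its entire connected component, which can snake across the whole image, so you cannot carve the image into independent tiles up front.

```python
# Toy binary area opening: keep only 4-connected foreground components
# with at least min_area pixels. Each pixel's output depends on its whole
# component, whose extent is unknown until the flood fill finishes -- the
# data-driven order that makes naive GPU tiling awkward.
from collections import deque

def area_opening(img, min_area):
    """img: 2D list of 0/1. Returns a same-shaped 2D list of 0/1."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x] and not seen[y][x]:
                # Flood-fill one component; its size is only known at the end.
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_area:
                    for cy, cx in comp:
                        out[cy][cx] = 1
    return out
```

A single stray pixel vanishes while a three-pixel blob survives, even though locally the pixels look identical — which is exactly why the processing order is data-driven.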
" it is possible to spot objects just 31cm long, almost exactly the length of your correspondent's size 13 flip flops, from space."
Oooo - not one to pass up a good coincidence when it comes by; might it be a candidate for a Reg unit of satellite resolution, the Flip Flop (ff)? ... e.g.
"That's nothing, the 1960's spy satellites had a resolution of 0.1 ff"
Wasn't there a suggestion a while ago to try to use successive hi-res overhead pictures to spot areas of disturbed ground in Afghanistan, which might indicate a recently laid IED?
Since such devices are mainly roadside-based, with a bit of processing this sat could snap targeted areas (roads) today, do the same tomorrow, and look for tell-tale signs?
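The "snap today, snap tomorrow" idea is basically change detection. A toy sketch of the core step (assuming the two passes are already co-registered and radiometrically matched, which is the genuinely hard part in practice):

```python
# Toy change detection: difference two co-registered acquisitions of the
# same strip of road and flag pixels whose intensity changed by more than
# a threshold -- a crude proxy for "disturbed ground".
import numpy as np

def changed_pixels(before, after, threshold=30):
    """before, after: uint8 arrays of the same shape.
    Returns a boolean mask of pixels that changed by more than threshold."""
    # Widen to a signed type before subtracting to avoid uint8 wraparound.
    diff = np.abs(after.astype(np.int32) - before.astype(np.int32))
    return diff > threshold
```

Real systems would add registration, shadow/illumination compensation, and morphological clean-up on top, but the differencing step is the heart of it.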
Hang on, the first paragraph says that the satellite will be able "to spot objects just 31 centimetres across",
but the last paragraph indicates that the 'resolution' will be 31 cm.
Which is it? Those are not the same thing. If it has 31 cm resolution, you will be able to spot objects rather smaller, although they won't resolve into clearly separated objects. If you can (only) spot objects just 31 centimetres across, the resolution must be rather worse than 31 cm.
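For what it's worth, a back-of-envelope Rayleigh-criterion check lands in the same ballpark as the quoted figure. The 617 km altitude is from the article; the ~1.1 m aperture and 550 nm wavelength are my assumptions, not stated in the piece:

```python
# Rough diffraction-limited ground resolution via the Rayleigh criterion:
# theta = 1.22 * lambda / D, then multiply by altitude to get metres on
# the ground. Aperture and wavelength here are assumed values.
wavelength = 550e-9   # green light, metres (assumption)
aperture = 1.1        # telescope aperture, metres (assumption)
altitude = 617e3      # orbital altitude from the article, metres

theta = 1.22 * wavelength / aperture     # angular resolution, radians
ground_resolution = theta * altitude     # metres on the ground
print(round(ground_resolution, 2))       # ~0.38 m
```

So a ~31 cm figure is plausible as a resolution for optics of roughly that size, which still leaves the spot-vs-resolve distinction above unanswered.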
>A new satellite set to fly in 2014 will offer the chance to spot objects just 31 centimetres across … from SPAAAAAACE
It'd be a damn clever satellite if it could spot objects from elsewhere.
And just in case we are in doubt about where it's taking these images from:
>A little sharpening of the images will mean it is possible to spot objects just 31cm long, almost exactly the length of your correspondent's size 13 flip flops, from space.