* Posts by InITForTheMoney

42 publicly visible posts • joined 9 Jan 2010

UK Ministry of Defence tries again to procure £1.7bn tri-service recruitment system

InITForTheMoney

Crapita

Once upon a time, 'El Reg' wouldn't have missed the opportunity to write Crapita instead of Capita. Either standards are slipping, or the world is becoming nostalgic for when services were only as bad as they were when Capita was running them.

What do you call megabucks Microsoft? No really, it's not a joke. El Reg needs you

InITForTheMoney

OfficeZilla

One constant all this time... Office. Still lacklustre after all these years.

Alternatively: Patch Tuesday Corp.

Surprise! British phone wins Best Product at Mobile World Congress

InITForTheMoney

Re: Back on track...

EU advertised prices include Value Added (sales) Tax at around 20%; the rate varies between about 18% and 25% by country. However, if you buy online you pay tax at the rate of the seller's country, not your own, so the advertised price is the price you pay. It's simpler for us, because we don't have to think about adding taxes to work out what we'll pay. In the US, advertised prices don't include local taxes and you add the tax according to your state at the point of purchase.

If you add 20% to the dollar-converted price, it comes in at around €651 (based on today's exchange rate of 1.1013 USD to 1.00 EUR), which is the advertised price in euros. So no... Europeans aren't being 'overcharged' by our manufacturers.
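
For anyone who wants to check the arithmetic, here is a quick sketch (my own illustration; only the 1.1013 exchange rate and the 20% VAT figure come from the comment above, while the $599 ex-tax US price is hypothetical, since only the resulting euro price is quoted):

```python
# Convert an ex-tax USD price into a VAT-inclusive EUR price.
def eu_sticker_price(usd_price, usd_per_eur=1.1013, vat=0.20):
    eur_ex_tax = usd_price / usd_per_eur
    return round(eur_ex_tax * (1 + vat), 2)

print(eu_sticker_price(599.00))  # ~652.7, in the same region as the ~651 EUR quoted above
```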

How do we train the next generation of data centre wranglers?

InITForTheMoney

Demise and rise of the IT generalist

As a contractor in IT infrastructure who moves between clients fairly frequently, I am often pretty shocked by the comparatively low level of expertise of the data centre operations staff I meet on clients' sites. I'm not complaining (it keeps me in a job), but I wonder about the sustainability of it. The problem, I think, stems from the way IT departments are structured now. I am not convinced that there is a pathway that really enables training in IT infrastructure, because there aren't really any basic jobs left that are much use at educating staff for the more complex stuff.

I'm only 33, but when I took on my first IT support job in the year 2000 it was doing break/fix desktop and server support. We had a mix of 10BaseT and 10Base2 networking across the three floors of the office building, plus a Citrix MetaFrame farm of two servers supporting about 60 thin clients: old 386 and 486 PCs running a DOS-based Citrix ICA client. There were also about 40 high-end CAD workstations running a mix of WinNT and Win2000 that accessed a few published apps via Citrix.

To support this environment I needed to understand the basics of the various Windows operating systems, of DOS, of Citrix and a little of Linux. I also had to understand plenty about the network types and protocols we ran (primarily TCP/IP and IPX), and about bridges, routers and switches. There was a team of three of us and we did everything from supporting the servers and configuring the firewalls to building and imaging the PCs and 'thin clients', and running through the office moving terminators to locate and fix faults in our decrepit 10Base2 network. In that environment I learned the basics and the root of almost all of the IT knowledge I use today; everything I know now is an extension of one of those concepts. However, I think I am one of the last people to have had this type of generalist 'apprenticeship' in IT.

The fact is that people don't gain this type of experience now. If you have a fault with a PC in the office, you re-image it or replace it; you don't diagnose a fault with the hardware or software, you just reset things to a known-good configuration and move on to the next problem. The office IT department has been relieved of most of its responsibility, and major applications and services have moved to the data centre. First-line support calls, instead of being attended to by the local office IT person, are now dealt with over the phone from call centres, and those call centres are miles from the data centres that would provide a career progression path for their staff. That means there are few junior positions in IT support and a massive skills gap to cross in order to work on the complex parts of our environments.

In the data centre itself, the industry has in the past encouraged staff to specialise in specific areas. There are few IT generalists now; people have instead become compartmentalised into teams according to skills. What this means is that the young blood in IT today does not get as broad a spectrum of experience as their predecessors; what they learn instead is how to implement the specific products their employers use, often only in the employer's specific usage scenario.

This is a problem, not only because those products themselves become obsolete (so the skills associated with them have a finite life and value), but because things have already been changing for some years in a way that means this specialising is hurting both staff and businesses. Things are becoming more integrated, and people once again need knowledge of many areas of IT as technologies like virtualisation begin to require an understanding of many disciplines: networking, storage, compute hardware, operating systems, services, hypervisor technology and management suites. The problem is that most support staff only have knowledge of two or three of these, and if they have been in IT for less than 10 years it is harder to explain the concepts required to understand the pieces they are missing; having started their careers in a specialism, they often lack the experience that would have provided this base knowledge.

The question is... how do we fix this problem? Vendor training is not the right approach, because it centres on products. So how do we make sure the next generation of IT staff is able to understand and adopt new technology quickly?

Is Google building SKYNET? Ad kingpin buys AI firm DeepMind

InITForTheMoney

But before having a cup of tea and a pint (or 10), those British-brained robots just might invade 88% of the world and enslave / murder / rule a good portion of humanity.

http://mentalfloss.com/article/13019/there-are-only-22-countries-world-british-haven%E2%80%99t-invaded

I for one welcome our new robot overlords, their robot King / Queen and their British grammar and spelling.

Fanbois, prepare to lose your sh*t as BRUSSELS KILLS IPHONE dock

InITForTheMoney

Bet Apple have already gotten around this in advance...

I remember reading early versions of the EU standard that specified that the charge brick must present a USB A port. You should note that at present Apple sell the charge brick and the Lightning cable as separate items; they don't sell a charger with a cable.

They will continue to sell the Lightning cable as a data cable, separately from the charger. The charger complies with the regulation by presenting USB A. The cable will get around it by virtue of its primary function being a data cable, its secondary function being to allow the device to charge. You will note that Apple always call the Lightning cable the Lightning cable, never the power cable or charge cable. The fact that the cable has a chip in it reinforces that the cable is intelligent and sends data signals every time it is connected to the iPhone or iPad in order to negotiate the connection; it is therefore not a power or charge cable but a data cable that negotiates the provided services.

As a result, Apple will continue to use the Lightning connector and argue its case in court with the EU if necessary, and it will almost certainly win, because it can prove that the cable is not strictly used for charging.

There's ONE country that really likes the iPhone 5c as well as the 5s

InITForTheMoney

You forget to add Tax to the US price

UK advertised prices always include Value Added Tax of 20%.

US advertised prices do not include tax; the rate that must be added at the point of sale varies from state to state. Americans know to add state sales tax on top of the price they see advertised; Brits don't need to add anything on, as the advertised price already includes all taxes.

Quoting the two prices and pointing out the disparity gives an unfair comparison. Since you fail to add sales tax to the US advertised price, the price paid by a US consumer is higher than the advertised price; the price paid by a UK consumer is not.

If you add 20% to $658 it comes a lot closer to the $807 the UK customer would pay. This means Apple is making a similar level of profit in both markets (after accounting for daily variances in currency exchange). You can't really blame Apple for making the iPhone more expensive in the UK; the UK Government sets the rate of tax and mandates how prices are displayed and advertised.
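
A one-line sanity check of those figures (illustrative only, using the numbers quoted above):

```python
# 20% VAT on the $658 ex-tax US price, versus the $807 UK sticker price quoted above
print(round(658.00 * 1.20, 2))  # 789.6 -- most, though not all, of the gap is tax
```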

In case of LOHAN flight emergency, gobble THIS Iridium-Arduino sandwich

InITForTheMoney

TERROR

Turned Evil, Robot Requires Obliteration Rapidly

F5 Networks scarfs up LineRate Systems for SDN smarts

InITForTheMoney

More likely will be a separate software appliance on Viprion

It's much more likely that, initially, LROS would be provisioned as software virtual appliances on Viprion blades, with an eventual merger of code bases.

Driverless trucks roam Australian mines

InITForTheMoney

Rise of the Machines

<Insert obligatory SKYNET comment here>

Citrix lowers sword, will take more time on 'Project Avalon' virty PC broker

InITForTheMoney

Cisco ACE

Since Cisco have already announced that they are ending development of the ACE product line, with salesmen no longer being pushed to punt ACE into new business, it seems that Cisco will now be pushing NetScaler to its partners. I expect to see lots of Citrix, Cisco and NetApp solutions coming out of the woodwork soon.

NetApp perks up FAS2000, bigs up Vegas gig

InITForTheMoney

The reason they've sold fewer FAS2000 Series...

These models don't support Data ONTAP 8 or later, since the CPUs in the current line are not 64-bit.

This means they won't be upgradable to the current release of ONTAP without swapping the storage head for a new model. These devices are effectively obsolete at the point of sale.

Consequently, if you wanted to ensure your system had a good software upgrade path, and you were wise enough to ask about it, you were up-sold to a 3000 series that has 64-bit processors. Everyone else is looking at a head swap at some point.

The FAS2240 is the first 2000 series with a 64-bit CPU, able to take advantage of the larger 64-bit aggregate sizes and the new features in ONTAP 8. It seems very odd that NetApp have had ONTAP 8 in general availability for ages now, yet not all of their current models support it. Sort it out, NetApp.

Defence firm Ultra goes cyber with AEP buy

InITForTheMoney

Wasn't CESG who mandated this...

NPIA had the option of those or Barron McCann X-Kryptors and chose the AEPs because they have superior key management and the ability to update keys over the wire, without having to visit the encryptor or send the updated key mat out to the local force IT team. They actually pay C&W to manage the encryptors on their behalf.

Forces had the option of either having CONF enclaves (a CONF network in a room), with all xCJX workstations linked to a single AEP ED20 unit via a local switch, or having individual CONF machines dotted around their force networks, in which case they needed an AEP Net Remote encryptor for each individual workstation (Net Remote is a personal VPN encryption device designed for remote workers) to allow them to tunnel back to xCJX over the force network. It was the force's decision which they chose: some forces decided they would do all xCJX access from a bureau service, others that they would have scattered workstations, depending on the geography of the force.

Police force IT security is notoriously lax; a lot of the forces I am aware of don't even comply with the CESG guidelines for the RESTRICTED data they hold on their networks, mainly because they have little experience and too little budget to afford the security devices that are required.

T-Mobile JavaScript comment stripper breaks websites

InITForTheMoney

You can get around this with a Firefox add-on

You can get around T-Mobile's JavaScript issues using an add-on for Firefox called Modify Headers. Download and install the add-on and then add the following two options:

Action | Name | Value

Add | Pragma | no-cache

Add | Cache-Control | no-cache

Then select "Enable All"

This will also stop images from being compressed. I had to do this on my Mac, as the Mac version of the Web'n'Walk software that comes with my USB stick doesn't include a facility to disable the compression, unlike the Windows version, which does.
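
If you are scripting against the connection rather than browsing, the same two headers can be sent programmatically. A rough sketch, assuming the carrier proxy honours them the same way it does when the add-on sends them (example.com is just a placeholder URL):

```python
import urllib.request

# Send the same two headers the Modify Headers add-on adds, which tell the
# operator's proxy not to serve its transcoded/compressed copy of the page.
req = urllib.request.Request(
    "http://example.com/",
    headers={
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
    },
)
with urllib.request.urlopen(req) as resp:
    html = resp.read()  # should now arrive uncompressed, with JS comments intact
```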

All of the mobile networks in the UK implement this compression to varying degrees, but T-Mobile's does seem the most aggressive.

iPhone plunges 13,500 ft from skydiver's pocket - and lives

InITForTheMoney

Likely because it hit the sides of the shaft...

A falling object that experiences a single impact as it hits the ground is likely to fare better than a falling object which bounces off the sides of a solid concrete lift shaft several times. Also, there is no mention of the type of roofing on the factory. A tin or lead-lined flat roof, for example, would probably have some give and would deform more easily than the iPhone, likely lessening the force of the impact. A concrete roof, on the other hand, would deform less easily than the iPhone and would probably have decimated it.

A slanted or curved roof would also have been more forgiving, providing a sort of skimming effect for the phone (as opposed to a straight bounce as the heaviest corner of the phone hit a flat roof), allowing more of the energy to be absorbed over a greater surface.

ICT classes in school should be binned – IT biz body

InITForTheMoney

I couldn't agree more...

I'm a 28-year-old self-employed infrastructure architect. I'd like to show you where education gave me a leg up and where it let me down on the way to where I am now.

I started secondary school in 1993 at what was the 13th and last City Technology College ever built. It was brand new, and my year group was the first and (to start with) only year group; it gradually opened year by year as we progressed. It took in a cross-section of pupils from different backgrounds and eventually became one of the first Academies. It is now the biggest Academy in the Bristol area.

During my first year we were given touch-typing lessons (I've never met anyone else of my generation who was taught to type at school... why?) and basic lessons in desktop publishing (using ClarisWorks) on Macs, twice a week for 30 minutes, outside the normal curriculum. At the time we had the largest network of Macs anywhere in Europe. The use of IT was central to every lesson we had, up to the point where we began GCSE study, when the curriculum became stricter and we were only able to use it for certain things. Every student had what we would now consider basic skills in DTP, spreadsheets, databases (FileMaker Pro), email and using the internet (via our JANET connection) by the age of 13. At the time this was almost unheard of in the local area.

I took nine GCSEs and a Part One GNVQ in Information Technology (by this time we had a suite of Windows NT 4.0 PCs as well). The subject matter taught in the IT course was woefully inadequate... almost the entire time the class felt it was being taught to suck eggs. Most of us who took it had a strong interest in IT (as you might expect, given we were surrounded by it) and would help with maintenance of the IT systems and computer rooms around the school, so we were expecting to learn things that were a bit more advanced. The network admin at our school, while not incredibly strong in IT, used us extensively and encouraged us to learn; we would do builds, installs and patches on workstations (Mac and Windows) for her. Some of us also did things with Linux, SunOS and SGI Irix.

Of the group of 48 who took the Part One IT GNVQ, only two actually finished the coursework, not because it was hard, but because it was so useless and boring. We could have proved all of our skills in all of the areas the GNVQ required in the first week, by going through work we had produced over the previous three years; however, the format of the course insisted that we follow planned exercises in sequence that added up to a lot less than we were capable of. Worst of all... we had to submit all of our completed activities in PRINTED form, not on floppy or CD... printed. You could pass with distinction if you just made it look like it worked; the examiner never got a floppy copy and would never know.

What the course taught was how to use Excel, Word and PowerPoint, and how to create a basic database in MS Access 97.

I then went on to A-Levels. I'd been told the Advanced GNVQ in IT would be more my thing, since it would teach more advanced subject matter. I started it, realised it was going over old ground for at least the first year and that it was going to be equally useless. I quit the course and took an A-Level in Computing as an evening class at another college. This again focused on MS Office and Access 2000 (I at least learned some VB this time), but because it wasn't a GNVQ you could be more creative with your coursework, and rather than multiple-choice questions you could give proper answers in the exams. It was better than the GNVQ, but it still covered a lot of old ground and I question the value of most of it.

After sixth form I did a gap year with an organisation called the Year in Industry, working in the IT department of an engineering consultancy, where frankly they were amazed that I could type, knew what a CAD package was, could build web pages and knew something of Linux. I can't thank everyone at "Scott Wilson" enough for their nurturing attitude while I was with them; without doubt one of the best decisions I ever made was to do my gap year with them.

I then went to university... where what I was taught in the first year was effectively the Computing A-Level again (I'd used none of it in my gap year), but with some Java (taught very badly), some C (taught slightly less badly), some completely useless maths about matrices and something about logic gates, which I have yet to find any practical use for. However, the university was a Cisco Academy and I did learn some useful stuff about IP and routing, which I still use. I dropped out after the first year.

The system fails because:

* Each stage of the education process repeats the last rather than extending it; this means that if you start on the track early there is no incentive at the later stages, since you won't learn much that's new

* All IT courses assume no knowledge and don't allow you to start further along and prove your ability

* IT teachers and university lecturers seem to have very little knowledge of the needs of businesses or the IT industry

* Schools don't have the right equipment or software to teach, they don't adapt with industry and are often left behind the curve

* The system teaches students to use specific products (generally MS Office); this means that when confronted with other packages the students are lost

On the last note: in our case all students knew ClarisWorks, and the Windows machines were largely neglected until ClarisWorks was junked by the school in favour of MS Office - many found MS Office confusing. In the modern world, students learn MS Office and then have trouble with newer versions (e.g. Office 2007's ribbon interface) or competing packages like OpenOffice.

Air Video 2.4.6

InITForTheMoney

Zumocast is similar and works over 3G networks

I've been using Zumocast for the same thing; it also happens to work fine over 3G. It's actually quite good, and I've streamed video to my iPhone at good quality and without any stuttering.

Zumocast will also allow you to view documents that you have stored on your computer and to download movies, music or documents to the phone from your library or shared folder. Quite a nifty little app for a free service.

Stop sexing up IT and give Civil Servants Macs, says gov tech boss

InITForTheMoney

@AC re: Wrong, wrong, wrong

Actually, you're wrong... and I can tell you this because I regularly ask within the industry for solutions that support Macs and when they will be available. Believe it or not, I am not a Windows zealot; I am a Mac user (I'm writing this on my own MBP, in fact). I've been using and supporting Macs since '93 and I think I have a pretty good grasp of their capabilities by now, and having also worked in government for the last seven years I think I know in which environments I can use them and in which I can't. However, unlike some, I am more concerned with selecting an appropriate tool to meet requirements than with trying to shoehorn something I like into a function it's not suitable for.

As has been pointed out by another poster, the target of evaluation for PGP full disk encryption does not cover the Mac version. If your accreditor is allowing you to use it, that's up to him if he wants to accept that risk; it probably means your system has a pretty low risk profile and your data is mostly IL0-IL2 with only small amounts of IL3 data (if any). Either that, or the accreditor simply doesn't know the TOE for the product and is mistaken in thinking that it's approved... it wouldn't surprise me.

A good architect will read the TOE for any security-enforcing products he or she selects and will document how that product's configuration meets or contravenes the TOE, so that the accreditor is informed. If you haven't read the TOE for products you utilise and you're an architect in government, then you haven't done your job properly.

Even when used in an evaluated configuration, PGP full disk encryption is only good enough for _Baseline_ grade encryption, not Enhanced Grade or High Grade, which means you can only use it on RESTRICTED (IL3) systems... You don't even need a government disk encryption product for a RESTRICTED system; you can use Vista or Windows 7's BitLocker feature and be compliant with a tiny piece of configuration... It's hardly much of a bar to reach these days, yet the Mac has nothing that fits the bill, and Apple aren't concerned about addressing that issue. Why should they be? It's not a core market for them.

InITForTheMoney

It's not just about lock down...

One of the key issues with mobile government data assets is that they have to be encrypted with a government encryption algorithm while the data is at rest. This means you need a full disk encryption product installed on all laptops, and on any desktops that might be kept in or transported through locations of low physical security.

At the present time there is no software-based full disk encryption solution (supporting these algorithms) that works with Macs, and there is no hardware-based encrypted disk that supports the EFI bootstrap process that Macs use to launch OS X.

This is the first hurdle that needs to be cleared. Without it, if you put protectively marked data above a certain confidentiality level onto a Mac, then that machine has to be kept in a physically secure location, or shipped around with an armed guard! Even with data at the lowest level, it would need to be stored in a locked box bolted into a vehicle if you shipped it around anywhere.

The reason Macs aren't used much in government is that there is no way to get around this one simple fact, and until there is a company that can put a government algorithm into an encryption product that supports Macs, this isn't going to change.

Punters 'pooh-pooh video on demand'

InITForTheMoney

Lack of good interface

The problem with VOD services is that people don't discover new programmes, and they don't find out when new episodes or series of programmes they like become available; the reason for this is that they fast-forward through the adverts. The majority of VOD interfaces simply show a list of shows, and people browse past shows of interest simply because they don't have a catchy title. What is needed is an advert for each show and for shows to be organised into categories; you should be able to watch a string of short adverts for shows in succession in order to find a new show to watch.

You also need a system of bookmarking shows and a notification that new episodes of selected shows are available to watch; the lack of this is what hampers current VOD services that aren't based on recording shows from an Electronic Programme Guide schedule.

SGI plunks Windows on big Altix UV supers

InITForTheMoney

VMware / IBM Bricks rival?

If this thing were certified for VMware and pitched at the right price, it could be a serious rival to IBM's "Bricks" architecture of interconnecting four x86 nodes to form a single system image and then carving that image up into virtual machines, especially as it would allow you to consolidate even large scale-up workloads. Imagine being able to virtualise your entire data centre onto one of these systems occupying just four racks, with all networking provided virtually inside the single system and data transferred effectively as fast as the NUMA interconnect: you could greatly simplify an IT infrastructure, reduce networking assets and cut the number of individual management points. Provided you can get enough bandwidth to some suitable storage, this system could be an absolute killer.

Microsoft embraces ARM with Windows 8

InITForTheMoney

I think you miss the point....

Any software which runs on ARM will need to be specifically compiled for it. This means that all software for these devices will need to be compiled with a new development tool or compiler, and Microsoft can choose the methods by which software is delivered to these devices... it would be a dream for Microsoft to start with a clean slate where they can push all apps for these devices through an app store, as Apple has done. It would mean that Microsoft could:

a) Control the user experience a lot more

b) Take a slice of the revenue of any software purchase

c) Potentially reduce piracy of software

The key issue will be that until there are sufficient devices in the market, it's unlikely that developers will flock to the platform, meaning the majority of software on these devices will initially be brewed by Microsoft. So these devices will be targeted first at corporates before they are targeted at home users, although you are bound to get basic features consumers can use, like a web browser, media player and e-book reader app.

InITForTheMoney

Interesting... does this mean Microsoft will do something like Rosetta?

Back when Apple migrated from the PowerPC processor line to x86, Apple included a technology called Rosetta to allow PowerPC binaries to run on x86 chippery; it was based on a technology called QuickTransit, developed by Transitive Corporation. SGI used the same tech for Irix apps when they migrated from MIPS to Itanium. Transitive's technology was purchased by IBM in 2009 and rolled into AIX, allowing AIX to run x86 Linux workloads without them needing to be recompiled.

Having already been through one painful CPU architecture migration, Apple's method of migrating (and not alienating its customers... this time!) was to distribute new software in "Universal" binaries, which effectively contained binary executable code compiled for both processor architectures within a single installed package. Older apps (compiled for PowerPC and not x86) could be emulated by Rosetta on the new machines; new apps would run natively on both new and old processor architectures.

Interestingly, Apple uses the "Universal Binary" method for iPhone and iPod apps today (some iPad apps too).

I'm interested to see whether Microsoft will license the QuickTransit technology from IBM or use something similar to allow ARM-based Windows 8 devices to run applications developed for earlier versions of Windows, or whether Microsoft might do away with legacy support entirely. It's possible we could see a "Universal Binary" build mode in Visual Studio; if that doesn't happen, Microsoft would need to make apps available only from an online store and serve up the right release depending on your CPU architecture. This will be confusing for users if the branding is not handled well: if the user experience of the ARM and x86 variants is not identical, users will need to understand why software runs on one Windows 8 device but not on another.

I for one will watch with interest as the inevitable cock-up ensues in 2012 (ha... more like 2014), when Windows 8 is finally released.

Apple patents 'net-booted' OS contraption

InITForTheMoney

More like Sun Ray or boot from SAN technology

Since the OS and apps run locally, this is not like XenDesktop, Windows Terminal Services or Citrix. I've seen thin clients that boot from the network via PXE and then launch a client for these three types of service, but this particular implementation sounds like an early Sun Ray thin client or a boot-from-SAN style deployment. Early Sun Rays booted from an NFS file system hosted on a Sun Ray server. A boot-from-SAN deployment attaches a network block storage device (iSCSI or FC-AL) and boots locally; typically this is only done with servers, although there's nothing stopping you from putting a Host Bus Adapter into a workstation and using it to boot the workstation.

Some servers these days also contain an embedded hypervisor such as VMware ESXi or XenServer that can load an operating system from a SAN or NFS volume.

Standard smartphone charger to dominate in two years

InITForTheMoney

I don't understand...

Why they don't just make it so that the charger unit presents a standard USB port and the device vendor just ships a cable that can be used for both charging and sync. The cable then plugs into a USB port on the charger; isn't this what every vendor does anyway? Certainly when I've bought Belkin or other third-party chargers they have been a power block with a USB socket and two cables, one with a mini USB connector, the other with a dock connector.

This would be better, as it's backwards compatible with most existing devices: the same wall block for everything, just a different cable depending on your device. I don't see Apple using the mini USB connector, but maybe they will provide an adapter or a separate cable.

Ubuntu Wayland: Shuttleworth's post-Mac makeover

InITForTheMoney

Why the single menu is both user friendly and efficient....

Have you ever considered that mouse pointers accelerate based on how quickly you move the mouse or trackpad? The amount of movement required to move the pointer to a menu attached to a window is actually roughly equal to the amount required to move the cursor to a menu at the top edge of the screen, because the cursor is bound by the edge of the screen and will stop over the menu no matter how fast it is moving.

If you have to land the cursor on a control in the middle of the screen you have to do this more slowly, as you need to slow the cursor down again or even reverse your movement in order to settle over the control. In the middle of the screen you have to do this in two dimensions, but at the edge of the screen you only have to do it in one, horizontally (since the screen edge has already stopped the cursor vertically and left it resting handily over the menu bar). Given that words are generally wider than they are tall, it's actually very easy and quick to land on a text-based menu item at the top of the screen.

Even if the application you want to use hasn't got focus, you only have to click somewhere (anywhere) in that large control we call a window in order to give the application focus and the menu the correct context; this requires almost zero precision and takes no time at all. If you take these facts into account, it's generally quicker to grab a menu item on a Mac than on Windows or Linux.
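
To put a rough number on that argument (my framing, not part of the original comment): Fitts's law models target acquisition time as growing with log2(distance/size + 1), and a screen-edge menu behaves like a much "deeper" target because the cursor cannot overshoot it. The pixel values below are made up purely for illustration:

```python
import math

def index_of_difficulty(distance_px, effective_size_px):
    # Fitts's law index of difficulty: higher means slower to acquire
    return math.log2(distance_px / effective_size_px + 1)

floating_menu = index_of_difficulty(600, 20)   # 20 px tall menu inside a window
edge_menu     = index_of_difficulty(600, 200)  # edge stops the cursor, so the effective target is far deeper

print(round(floating_menu, 2), round(edge_menu, 2))  # ~4.95 vs 2.0
```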

Other benefits:

* You save a lot of screen real estate by having a single menu that changes based on what application you are using at the time

* The fact that every application has consistently named menus ("<AppName>, File, Edit, Window & Help") means that users can consistently find what they need for any application in an expected location on the screen

* It's better for macros and accessibility software, because if you need to pre-program menu movements or screen hot spots for a blind user's screen reader software, you know exactly where the menu will be, since it's always in the same place

* Given that a Mac is designed to be used with one mouse button by a "consumer" not a "techie", users expect to use the mouse for everything to do with driving the computer, and yes, that means going to the menu for EVERY option, even copy and paste, for which any Windows user has probably learned the key sequence. The Mac is designed so that a user can pick up how to use it very quickly through repetition. Clear user interface guidelines for Mac apps and consistent placement and terminology in menus help that learning process.

There are three reasons the placement of the menu on the Mac hasn't changed since 1989: it's efficient, it's simple and nobody has yet come up with a better way.

Your comment about multitasking is absolute rubbish. Macs have been multitasking since I first used one in '93 and even before that, and it wiped the floor with the Windows alternative. Linux was only really in use in academia at that time, had very few apps and wasn't a contender. What the Mac didn't have before OS X was decent resource management and scheduling, but this was no worse than any other desktop operating system of the time, and using multiple apps at the same time on the Mac was quite user friendly as long as you had enough RAM. As evidence: it would be pretty stupid to have an OS that let you copy and paste a chart into a word processor if you had to quit the spreadsheet application before you could open the word processor and paste the chart, now wouldn't it? You have obviously had extremely limited experience of using Macs if you think they couldn't multitask before they started to run their own Unix variant.

Death of ID card scheme left £6.5m of kit going begging

InITForTheMoney

ID Card scheme was let in a number of lots...

The ID card scheme was let in a number of lots (five, if I recall), each lot to a different supplier, and the different lots are largely self-contained... only two of the lots were cancelled, as they were specific to the issuing of ID cards to UK citizens (the majority being what are called airside workers, i.e. those staff at airports who work on the runway side of the security barriers); the others have some link to biometric passports or the ability to read other countries' identity cards, so they were still required or seen to be of value.

BT tests 1Gbit/s broadband

InITForTheMoney

C&W also have pipes and their own exchanges

"Be" use the C&W (predominantly former Energis) infrastructure to provide their network, however still use the local exchange premises for termination and BT infrastructure where there is no C&W service. The former Energis network is wrapped around power lines and was spun out of National Grid.

Another player is Global Crossing, whose network runs alongside railways (the former Racal network), although this network needs significant investment. It was spun out of the former British Rail.

BT isn't the only company that inherited networks from the national infrastructure during privatisation in the '80s and '90s; it is, however, the only company that has points of presence in all residential areas and (crucially) a wire to every home.

Even Virgin resells BT-based ADSL where its cable network doesn't reach.

Emulex encrypts data before it gets into the array

InITForTheMoney

This would be attractive to government / Military

Boot from SAN with encrypted LUNs would be useful for a variety of in-vehicle applications. I have heard of similar things being done with Decru, but using an encrypting HBA within the server or blade makes for a smaller footprint and lower power consumption.

Multi-network iPhone SIM rumours at Apple

InITForTheMoney

May help combat theft...

This could be quite attractive to the operators, as it means the SIM would presumably be non-replaceable and a single entity (Gemalto) would be responsible for blocking the integrated SIMs. This would make it difficult to resell a handset that had been stolen, as it would be blocked from attaching to any network, given that Gemalto knows the physical SIM within the handset relates directly to a stolen handset, no matter where it is, anywhere in the world.

The inability to connect to a network also prevents activation and therefore makes the device about as much use as a paperweight until it is returned to the operator or Apple, who are the only people with the facility to ask Gemalto to release the lock-out.

It will be interesting to see how iTunes would manage transfer of a Network SIM identity from one handset to another in the event that a handset is swapped out due to fault or theft.

I might also suggest that the integrated Gemalto SIM could have the concept of virtual SIMs set up for a roaming user, so that when the user enters another country and the handset identifies its location using GPS, the handset could automatically switch to a network SIM identity the user has set up for that country, and the user gets billed at a local rate by the local network.

VMware's vSphere cleared for military spook servers

InITForTheMoney

Security Target Doesn't include key features

This is all very well and good, but the Target of Evaluation does not include VMware VMotion, which means that HA features are unavailable. The target only includes the running of separate VMs on a single host; you can have multiple single hosts, each with their own VMs, but migrating live VMs between hosts and using HA features is not supported by the evaluation. Sure, you can power off a virtual machine, assign it to a different host and power it back up again, which gives you a limited amount of flexibility, but it's still a crippled solution.

Government departments have been using VMware for ages on this basis; the only difference between this situation and what was previously understood is that the evaluation now takes into account that the VM instances are sufficiently isolated from each other, and the host sufficiently secure, to allow an EAL4+ evaluation. In the grand scheme of things, this doesn't amount to much more than a small pile of beans, because what government departments want is the assurance that the HA and VMotion features are secure enough to be deployed in a secure environment; they also want the Security Target and configuration guide to tell them how that can be achieved.

HP buys security tools firm ArcSight for $1.5bn

InITForTheMoney

This makes sense for HP, so who will buy netForensics?

HP is huge in government following its acquisition of EDS, and ArcSight are pretty much the go-to shop for audit and security event management in government; there are very few products on the market which even come close to touching it. It makes sense for HP to buy products it can easily sell through its services arm, as products like this tend to involve a lot of services to set them up and get them working with the vast array of devices involved in government networks.

The second-best product in this market is netForensics; if I were a betting man I'd put money on Oracle snapping them up shortly.

Why Oracle? IBM has the Tivoli suite, which has an auditing component and an event management component; IBM could build SIEM functionality fairly easily from its existing products (if it hasn't already, I'm a bit out of touch with Tivoli) and it wouldn't need any additional talent to accomplish this. Cisco has MARS; it's not fantastic, but it's a mature product. Dell doesn't have enough software assets or interest in this market to be shopping. CA has log management products but doesn't appear to have the vision.

So Oracle are the most likely buyer I can see for nFX, as they aren't likely to be bidding against much competition and they don't have a product in their arsenal that fits this space. I would expect nFX to go fairly cheap and Oracle to put in quite a big investment to make it a good contender to ArcSight. The key problem both products have at the moment is scalability, though ArcSight is currently a bit ahead in that area. As both products run atop an instance of Oracle DB, you'd think it would be something Oracle could easily sort out with nFX... no?

Ethernet storage protocol choices

InITForTheMoney

iSCSI HBAs also available

If you're after better performance with iSCSI then you can also get iSCSI Host Bus Adapters, which will stop you burning CPU cycles where the client workload is heavy; they are also useful for boot-from-SAN environments if you really aim to centralise all storage in the SAN.

For services with a more general-purpose storage requirement a software initiator is fine, provided your NIC has a TCP/IP offload engine; otherwise the CPU will be doing a lot of work packaging IP packets on a Gigabit Ethernet connection.

UK.gov awards managed services deal to lucky dozen

InITForTheMoney

Absolutely right!

This is absolutely right, and the standard spec for most government departments specifically has no wireless, Bluetooth or infrared, mainly because these services are as leaky as a cracked sieve.

Perhaps if the idiot had ordered the laptops via his IT department and actually stated what they would be used for, he might have got what he wanted.

In any case, this article is about SERVICES not hardware and software procurement.

Can replication replace backup?

InITForTheMoney

Data usage and structure not discussed, but very important to choosing a backup solution.

A key thing that appears to have been missed is the type of data being stored and how that data is used. For example, if all you ever do is append data, i.e. nothing is ever deleted, then replication is a perfect form of backup, because you never have to roll the system back; provided you can ensure consistency between the two copies, all you ever need to do is add a new, correct entry to supersede the old information. If you need to see what the data looked like at a particular point in time, you just ignore all data after that time.

Where using replication is more difficult to justify as a form of backup is when data is changed or deleted, because the changed data is no longer kept in the data structure and the blocks on which it was stored could be re-used; this makes it impossible to look at the data as it was at a previous point in time.

You can get around this in some cases by partitioning data. An example is in compliance environments where you must retain data for X and destroy it by Y. Between X and Y is an amount of time; halve it and that is your partition period. A partition (once created) is active for the duration of one partition period, and data may only be appended to it while it is active; at the end of the period the partition becomes read-only and a new active partition is created for new data. Once the newest piece of data in a partition reaches the end of the retention period, the partition and all data within it are destroyed.
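
As a concrete sketch of that scheme (my own illustration; the class name and the one-year/two-year periods are made up, assuming "retain for at least X, destroy by Y"):

```python
from datetime import datetime, timedelta

RETAIN_FOR = timedelta(days=365)                    # X: must be kept at least this long
DESTROY_BY = timedelta(days=730)                    # Y: must be gone by this point
PARTITION_PERIOD = (DESTROY_BY - RETAIN_FOR) / 2    # half the window between X and Y

class Partition:
    def __init__(self, opened: datetime):
        self.opened = opened
        self.newest = opened
        self.records = []

    def is_active(self, now: datetime) -> bool:
        # append-only while active, read-only once the partition period has elapsed
        return now < self.opened + PARTITION_PERIOD

    def append(self, now: datetime, record) -> None:
        assert self.is_active(now), "partition is read-only"
        self.records.append(record)
        self.newest = now

    def is_expired(self, now: datetime) -> bool:
        # destroy the whole partition once its newest record has met the minimum
        # retention; the oldest record is then at most X + (Y - X)/2 old, which
        # is still inside the destroy-by limit
        return now >= self.newest + RETAIN_FOR
```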

This is fine when you work with systems where the data structure is designed to have old entries retired; a classic example is a financial system, which records a transaction and a running total with each entry. We can see this with a bank statement, which is designed to be completely standalone, for example:

Balance in: £4135.83

Credit - Pay £1982.32

Debit - Electricity £250.12

Debit - Gas £35.00

Debit - Apple Store £599.99

Balance Out: £5233.04

Your bank statement contains the incoming balance, the transactions that took place within the statement period and the outgoing balance. Next month you'll get a new statement with the new information. If all you care about is how much is in your account, you can shred the old one and it has no repercussions on the new statement; however, if you care about how much you spent on gas last January, you will keep the statement for as long as that information is relevant.
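
A tiny illustration of why each statement is self-contained (the figures come from the example statement above; the code is my own sketch):

```python
def closing_balance(opening, transactions):
    # credits are positive, debits negative; no earlier statements are needed
    return round(opening + sum(transactions), 2)

print(closing_balance(4135.83, [1982.32, -250.12, -35.00, -599.99]))  # 5233.04
```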

Financial systems are ideally suited to this kind of process, because bank accounts are not generally related to each other and retention periods are always longer than a year. Every account that has not been closed will be updated at least once a year when interest is calculated, which means there is always a current record in a partition that is within its retention period, and the closure of one account does not affect the status of another.

This is fine... but what happens if you have a relational database with different elements having different retention periods for compliance reasons? A typical example might be a case management system for use in law enforcement, where there will be people (or organisations), objects (e.g. vehicles, weapons), locations and events (phone call, robbery, stabbing) stored within the system and related in any number of ways. The answer is that you can't easily partition this data, so the application has to manage the data's life cycle, retention and disposal, and the application's repositories have to have snapshot copies taken at regular points in time to ensure it's possible to go back to a specific piece of information as it looked at that time. Another method is for the application to store a snapshot of elements related to a case within the case record; this helps to insulate against differing retention policies at the expense of storage, but it can also make predicting the rate of change difficult, especially if cases become inter-related.

For data like this that is not linear, you can back the data up to tape as a snapshot, or use a disk-based technology and replicate the disk and the snapshots to a second system. But every snapshot technology on the market works on the principle of space reservation, i.e. a certain amount of space is reserved for changed data that pertains to a specific snapshot. If the live volume runs out of space or the rate of change is too great, snapshots will be retired in order to preserve the live data or the most recent snapshot(s). If your business needs to keep a copy from a year ago for any reason, then unless you manage the storage system, ensure sufficient space is always maintained and keep data change fairly constant or at a predictable rate, it's likely that you will run low on storage and the oldest snapshots will begin to vanish before their retention period ends.

What this means is that if the rate of change of your data is massive or unpredictable, then you need incredibly large amounts of storage available for the space reservation on both your primary storage and your replica; otherwise you're likely to fall out of compliance with your backup retention policy. There are other things to think about if your rate of change spikes. What about the links between the replicas? Will the changes be written to the replica fast enough? If the system is synchronous, will performance of the application be impacted while the system waits for the replica to confirm the data has been committed?

Replication can do the job if the volumes and rate of change of data are well understood, or if the data stored has certain characteristics; however, in most cases tape is easier to manage, easier to dispose of and could be more economical to store.

Linux to eclipse Microsoft's 'all-in' tablet enthusiasm

InITForTheMoney

Not necessarily...

Actually, I think you'll find that Apple has quite a long history of changing the architecture and not maintaining backwards compatibility. They only do this as a last resort, as it tends to alienate their customers, but they will do it if it's the more elegant solution.

Examples:

Migration from Motorola 68000 series to PowerPC processors

Firewire in the iPod Dock Connector (this is why some old accessories don't work with new iPods and iPhones)

Migration from PowerPC to Intel (although they did build Rosetta to allow new machines to run old software, and brought in the concept of a universal binary, so that a software package had two different compiled versions of the same app within it, one for PPC and the other for Intel)

App Store - new versions of iPhone OS / iOS sometimes require apps to be compiled with the latest SDK; this has to be done by the developer and is why a lot of apps crashed on start-up when iOS 4 came out. Of course, most apps got updated (where necessary) very quickly by most developers; it's why the SDK comes out before the OS does.

The reason Apple has the advantage with iOS is that Apple controls the development tools, compilers, standards and distribution mechanism, as well as the devices themselves. If they want to change CPU, they make the dev tools compile a universal binary that contains binaries for the old architecture and the new one; it's as easy as that. It's not as easy with open devices, because while there is a lot of consistency, there's also a lot of inconsistency.

Open devices and software may in the end become more prolific, but they will be less profitable, as their perceived value will be less: they are less iconic and have less clear differentiation from other products. The fact that they are open means they are easy to copy; if a feature on one device is much lauded it will end up on all the competitors' devices very quickly, watering down the value of the first device to implement it.

GCHQ imposes Whitehall iPhone ban

InITForTheMoney

Blackberry in Government

Key points regarding Blackberry and other devices used in Government...

There are two types of certification that CESG provides for security products: the first is Common Criteria and the second is CAPS (CESG Assisted Products Scheme). Common Criteria evaluation can be undertaken by labs in several countries, including Germany, the UK and the USA. In general, all countries which subscribe to Common Criteria accept each other's certifications, as they examine the products in very similar ways. Typically, Common Criteria is applied to commercial-grade products and commercial-grade encryption systems.

CAPS, on the other hand, is CESG-specific and is what CESG uses to assure products which use UK Gov encryption algorithms, secure products designed for use in high-security government environments, or products with a large potential for use on secure government networks. In order to pass CAPS, most devices have to have some sort of tamper-proofing within their enclosure, and code and hardware (incl. silicon) need to follow secure design methodologies. You can find a list of CAPS-approved products on CESG's website ( http://www.cesg.gov.uk/find_a/caps/index.cfm ). BlackBerry is not on that list.

Whether a product is evaluated under CAPS or Common Criteria (or both, in some cases), the product will not always be examined in its entirety; sometimes specific features are not evaluated because they enable functionality that would compromise the security of the device. In this case, the target states what features may be utilised and what features should be disabled for a secure configuration. A common example is that on firewalls, clustered failover is specifically not evaluated, as it breaks the integrity of every type of firewall (that I've come across, anyway). You will, however, find that most organisations that require high availability will implement it regardless, despite the feature being documented as insecure, the reasoning being that denial of service due to firewall failure might be more likely, and pose a higher impact, than an attacker gaining access to a firewall.

BlackBerrys are approved under Common Criteria, and their use in government is based on current rules which allow RESTRICTED information to be encrypted with some commercial-grade products both in transit and at rest. As such, they are capable of processing RESTRICTED data assets for email ONLY; this includes the method of receiving email from a BES located on a government network and the storage of the email in its encrypted form on the device. They are not approved for access to RESTRICTED government intranets or for RESTRICTED voice communications; these features are not within the target of evaluation and do not use commercial-grade algorithms strong enough to provide adequate protection for the data either at rest (browser history, etc.) or in transit (voice over GSM / 3G).

Some agencies have built their own applications on the devices, using their own methods of encrypting data on the handset and either the built-in VPN technology to create tunnels to RESTRICTED government networks or the email facility as the method of transferring messages. These are done by the agencies at their own risk and are not encouraged by CESG; generally these types of applications are thought to be insecure, but where there is an operational need this is sometimes managed by the particular organisation as a necessary risk.

CESG doesn't specifically approve the use of BlackBerrys within government; rather, they give advice... the advice is generally: "We would rather you didn't, but it's your data and you are responsible for it, so if you must do it, we advise that you make sure to do X and under no circumstances do Y; try to use the following handset models if possible, as they have the fewest exploitable features; set the devices to auto-wipe on tamper; set up remote wipe services; etc."

In terms of networks, no mobile network is approved by UK Government, and no IP-VPN-based cloud is approved either. Some older MPLS fixed-wire networks were approved for carriage of data up to and including RESTRICTED, as they could be guaranteed to route only within the UK; however, these days communications companies use almost exclusively IP-VPN-based networks, as they are more cost-effective. Because these networks route by lowest cost, the traffic can go internationally, so all UK government data must be protected by appropriate encryption and the key material has to be regularly updated.

HP agrees to buy Palm for $1.2bn

InITForTheMoney

WebOS on slate?

I think we'll see WebOS on future versions of the Slate as well as on phones. It wouldn't surprise me if the slate line, MP3 players, personal media players and the Palm products were all put in the same division, with WebOS as the OS for all of those devices; it could even go as far as PVR-style appliances or other forms of home entertainment appliance.

HP actually has the cash, the buying power and the manufacturing relationships necessary to be a credible player in these markets; it's actually quite shocking how much HP owns. If it bought in a music store and player app like RealNetworks or Napster it could get a slice of the iTunes and App Store success: not the whole cake, but a nice enough slice.

Power monitoring across the desktop estate

InITForTheMoney

Network Components too...

Another aspect often overlooked is the power consumed by network components, which normally stay powered all of the time. Extreme Networks actually have a pretty cool setup by which they power off ports that aren't in use. They also have great power control if you happen to use their devices for Power over Ethernet: if you have a PoE-compliant device, the switch can give it exactly the right amount of power, so thin client devices and phones aren't consuming too much.

The switches are also able to act on scripted events, so, for example, devices can be powered on for a software update or powered off at a certain time of day, a phone and a PC can be linked so that when the PC is switched on the phone powers up, or a user swiping in at the office door can activate their Ethernet ports and power up their phone.

Sony seeks 'universal console controller' patent

InITForTheMoney

Emulation play?

This could be an emulation play. The Wii allows a lot of GameCube games to be played and you can plug in the old controllers, but that involves buying the old controllers. Having a controller that was re-configurable would allow a lot of game companies to bring masses of old games to the PS3/4 very easily, maintaining the original control characteristics and reducing the time it would take to re-engineer the control mechanisms for each ported game.

iPad forces operators to shave their SIMs

InITForTheMoney

Apple has a history of adopting tech...

People said the same thing when Apple introduced USB (developed by Intel) on the original iMacs, which were PPC machines. Apple adopt things because they are a good idea: before the iMac used USB and demanded that all peripherals use USB connections, there were no USB peripherals to speak of. As soon as the iMac was launched, with USB as the only method of giving it additional functionality, there was suddenly a flood of peripherals. About six months later the PC world decided it might be good to actually use these things and build some kind of driver into Windows to support these plug-and-play devices. The same thing happened with FireWire (IEEE 1394), which Sony also adopted under its i.Link branding: Macs originally used FireWire for storage and DV camera connections, while on PCs you had to buy a card to get the connectivity.

Apple as a technology company likes things to be neat and elegant; if they can make something smaller they will. Expect the next iPhone to also use the micro-SIM, as it will provide more space for the battery or other components in that form factor.

Slovakian police chief quits over Dublin explosives run

InITForTheMoney

Not that surprising at Poprad Airport

I've flown in and out of Poprad a few times, as I have a mate who lives about 45 minutes from there. It is the tiniest airport I've ever been to; the terminal building is about the length of three double-decker buses and about one double-decker bus wide. The terminal opens for about four hours a day and the majority of flights are small aircraft, but twice a week (Wednesdays and Saturdays) they have two A320-sized planes flying in within 30 minutes of each other. Both are budget airlines with fast turnarounds; when the planes get to their destinations they get refuelled and fly straight on to some other small airport, and they don't stay at Poprad more than an hour after landing.

Between the check-in desk and the departure gate you won't walk more than 100 yards. When you land at Poprad and get off the flight, it takes precisely two minutes to walk to baggage claim via passport control. The baggage conveyor belt is about two metres long; it goes through the back wall and you can see the guy behind loading it up. The conveyor is so short it might as well not be there; the first time I flew in there I laughed at how ridiculous it was: he sticks the bag on the conveyor and two seconds later the guy on the other side takes it off and hands it to you. I've never checked in and out of an airport as fast as I have there.

It doesn't surprise me that they missed something, because the airport literally has fewer than 20 staff, and those staff move roles as the passengers go through the different stages of the airport, with all the passengers going through each stage at the same time. The check-in closes before both flights land, they close some doors and open others, and when you land you walk through the same corridors of the airport as you do when you check in. I would guess they also have between two and four sniffer dogs; that is how small this airport is. As the two big flights come in at the same time, the staff are rushed off their feet; you would only need two big problems and the staff would be overstretched.

You can say what you like about the staff missing a piece of explosive at the airport, but you need to put this in context: this isn't on the scale of any airport in the UK, it's much, much smaller. If security missed an explosive at Heathrow it would be embarrassing; if security missed an explosive at Poprad, well, it's just playing the odds really.