GPIB?!?! AHHHHHHHH!!!!
GPIB? KILL IT WITH FIRE!
Seriously: Nuke the entire site from orbit, it's the only way to be sure!
Let the 1980s have their technology, for the sake of all sanity! If you must do this, use LXI.
Mac applications developer Panic has found something interesting inside an Apple video adapter: a computer. While trying to figure out why video output from some iDevices was so poor, the company cracked open a Lightning AV Adapter, a $US49 accessory that is sold as allowing Apple devices to send video to HDMI devices in …
It's still in quite widespread use if you look in the right places. While mass-produced stuff has generally abandoned it even in its traditional test equipment heartland, it retains the advantage of being very easy to implement - it's the only bus that can run at useful speeds (IEEE-488 tops out around 1MByte/s) while being built on generic stripboard, for example. If the device in question is a unique one-off that's a very strong attraction, since the costs of PCB design and fabrication can't be amortised over a large number of units. The relatively simple protocol (especially by multidrop standards) at both ends of the connection helps massively with those short-run costs too.
I work in just that kind of short-run embedded engineering shop. Several times I have sat down at 9am with a list of requirements and gone home at 5pm having designed a circuit, written the firmware and host software, built the thing and tested it. That's doable with GPIB. The next preferred option would be 10BASE-T, but that means careful impedance matching, PCB layout, a TCP/IP stack on the device, more sophisticated host software, etc. etc. The unit costs will end up lower, but you have to offset those against the design costs: that's generally at least a day's work, and if the production run is three or four units, easier is almost always cheaper.
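The host software side really is that trivial these days. A minimal sketch in Python using pyvisa - the GPIB address and the SCPI commands here are made-up examples, swap in whatever your own kit answers to:

    import pyvisa

    # Talk to an instrument on the GPIB bus - address 18 is just an example.
    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("GPIB0::18::INSTR")

    print(inst.query("*IDN?"))          # any IEEE-488.2 instrument answers this
    inst.write("FREQ:CENT 100MHZ")      # hypothetical SCPI command for a spec-an
    print(inst.query("FREQ:CENT?"))     # read the setting back

    inst.close()

Half the morning's host-software budget goes on lines like those; the rest is whatever the requirements list actually asks for.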
Careful, it sounds suspiciously like you know what you're talking about..... I used to nurse an old 486DX 33MHz with an ISA GPIB card along until about 2004, when it finally gave up the ghost. It was the only thing that would talk to a spectrum analyser for testing EMC emissions. I looked for a USB-to-GPIB interface widget after the PC failed, but I don't remember finding one.
Happy daze......
In a previous job in the late '90s, whilst 'decommissioning' an old mainframe, I came across several boxes of GPIB leads (yes, the mainframe had a GPIB interface card). I was told to bin them with the rest of the garbage, but passed them on to a research group who I knew had a fair amount of test gear with GPIB ports. I was later told by one of the guys there that a rough estimate of the cost of purchasing the leads I'd given them was in the order of £4,500.
My next job I had the delights of administering a Linux box connected to a £750,000 'device' via our old friend GPIB, and, AFAIK, that beastie is still in use.
@OP - the point is that there is a single port on the iDevice, which is used for multiple functions, not just video out. One port, multiple adapters - not multiple ports with a separate adapter each.
you're obviously content to have a device with multiple ports, which you may or may not need. iDevices have a single port, which you at least need for charging and possibly for syncing. if you want to do anything else with it, then you buy an adapter, but not everyone needs video out. not everyone needs a GPIB interface. So the few that want a video out via cable have to pay a bit extra. the rest of us stream to our AppleTVs to get video on the telly, without having to pay even the $5 for the hdmi adapter you mention.
I've got iNews for you, jai: Androids have one port that they use for multiple functions as well. Charging, USB connection and HDMI output... all on the same connector, without needing an expensive cable that has a computer inside it. Now what was the benefit of all that extra iMoney you spent?
Having created a smaller port, they find they can't send 1080p video over it. So:
a) outputting from an iPad to a TV means horrible video quality
b) You can't use an old dock-to-hdmi and a new lightning-to-old dock adapter together, even though some Apple stores have claimed you can.
c) expensive adapters
I actually like Lightning on my iPad 4, and this isn't relevant to me since I don't connect it to a TV, but I bet this means Lightning has a very short lifespan - there's no way it'll be able to do 4K when that arrives.
Yeah, but it quite clearly isn't 10Gbps if it can't even stream 1080p HDMI, which is under 5Gbps.
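For reference, the arithmetic behind that 'under 5Gbps' figure - a quick back-of-the-envelope check using the standard HDMI 1080p60 timing numbers:

    # Rough 1080p60 HDMI wire-rate check
    pixel_clock_hz = 148.5e6   # standard 1080p60 pixel clock (includes blanking)
    tmds_channels = 3          # one TMDS lane per colour component
    bits_per_symbol = 10       # TMDS uses 8b/10b encoding
    wire_rate_gbps = pixel_clock_hz * tmds_channels * bits_per_symbol / 1e9
    print(f"{wire_rate_gbps:.2f} Gbps")   # ~4.46 Gbps - comfortably under 5

So if the link really ran at 10Gbps there would be headroom to spare for uncompressed 1080p60.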
Indeed it looks like it can't even stream 1080p H.264 (although this might be an issue with the iDevice's video encoder rather than Lightning itself).
Clearly Apple's use case is iDevice -> TV via an AppleTV over AirPlay. How dare you deviate from that.
".. which is why you'd simply push it wireless to an Apple TV. Works. Using AirParrot we do the same with Windows laptops.."
Hold on, so instead of carrying around my Galaxy note and HDMI adaptor (£5 on Amazon), I have to carry around an Apple TV as well as an ipad?
:)
No, you're correct that that would be useless for portable applications (my oversight - I tend to use a "normal" laptop for that, which has a boring VGA adaptor). It's more for the reverse - we have a fairly large screen in our office which has become the major means of presentation since we stuck an old v2 Apple TV on it that someone had lying around gathering dust. Might as well give it something useful to do..
I can explain the theory. By reducing what the connector does to 'a serial bidirectional stream of data', you turn all possible external connectors into mere software extensions. Whatever you want to output must already be data within the device, so you export that data over the connector and let the cable worry about reformatting it.
In this case that appears to have put some sort of video codec into the loop between device and cable, leaving the cable having to decompress video and run a framebuffer (toy sketch of the general idea at the end of this post).
So what Apple has done, in contrast to the single-port Android phones, is made no concessions whatsoever to the two or three cables it's pretty obvious most people are going to use in real life right now. I guess the calculation was that the chips that have to go into the cables are going to be very cheap very soon, and they need a connector that they can stick with for ten or more years in order to ensure accessory lock-in. There's probably also an argument that they've overreacted to the old connector having long-disused pins for FireWire, still having what will very soon be obsolete pins for analogue video, etc., etc.
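And the promised toy sketch. To be clear, this is not Apple's actual protocol - the packet format and type tags are entirely invented - it just shows what 'let the cable worry about reformatting' means in practice:

    import struct

    # Hypothetical type tags for a multiplexed serial link.
    TYPE_CHARGE_NEGOTIATE = 0x01
    TYPE_USB_DATA = 0x02
    TYPE_VIDEO_H264 = 0x03   # compressed video; the cable has to decode it

    def make_packet(ptype: int, payload: bytes) -> bytes:
        # Frame a payload for the link: 1-byte type, 4-byte length, then data.
        return struct.pack(">BI", ptype, len(payload)) + payload

    def cable_dispatch(packet: bytes) -> None:
        # What the SoC in the adapter would do: route by type tag.
        ptype, length = struct.unpack(">BI", packet[:5])
        payload = packet[5:5 + length]
        if ptype == TYPE_VIDEO_H264:
            pass   # decode H.264, render to a local framebuffer, scan out HDMI
        elif ptype == TYPE_USB_DATA:
            pass   # hand off to a USB bridge

    # The device just serialises whatever data it has; the cable gives it meaning.
    cable_dispatch(make_packet(TYPE_VIDEO_H264, b"...NAL units..."))

The point being: the phone-side hardware never changes, and a new adapter is 'just' new software at the far end of the wire.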
"Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. "
No, probably not with the adapter. You screwed the consumer when you made the Lightning connector (or whatever hipster name you came up with for it). It's a crappy connector (too hard to remove because it's way too tight, and the part you hold is too small). It's not compatible with accessories (I know, there's an adapter for that, too, but how do all these adapters jibe with the magical design philosophy?).
But the real screwing is that there's a perfectly awesome, widely used connector out there: micro USB. I have dozens of those cables and chargers around, because all my devices use it. Some firm who never innovates, just copies (sam sung what?), even shoots HDMI over it.
Paris, because she loves what you did to her.
Lightning is WAY better than micro-USB. A good connector OUGHT to be tight; because of the nature of the uses it's put to, it's often the main thing holding devices into their cradles. And it works in both orientations. Micro USB is extremely fiddly, hard to slot in, and has no holding power. As for speed, Lightning is basically USB, so no pros or cons there. Probably Lightning will become USB3 at some point.
Yeah, micro-USB is a standard, it has that advantage. But even ignoring Apple's desire to keep it proprietary, it's a way better connector for the situation.
This post has been deleted by its author
More likely the other way round - the text is clearly written in response to something else, and it fits quite well as a follow-up to the Panic post. It is posted in isolation on Slashdot without any context, making the whole message seem a little out of left field. It would also be far from the first time someone has bulk copy-and-pasted something to Slashdot with nothing in the way of attribution.
I've no idea as to the respective time zones for those timestamps, but it seems clear which was first in reality.
Wrong. The firmware is held on the iPhone or iPad; when you plug in the adaptor, the iDevice uploads the firmware into the adaptor's RAM and it starts working.
It's actually a very neat solution to a problem that probably shouldn't have existed, but I guess some people are still stuck in the cable era.
There are many other devices that used to do this, USB ADSL adaptors for instance.
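For anyone who hasn't met the pattern: the host keeps the firmware and pours it into the peripheral's RAM at plug-in. A rough host-side sketch of the classic USB version using pyusb - the vendor request, register address and IDs here are the Cypress FX2 convention, nothing Apple-specific, though the Lightning adaptor presumably does something analogous over its own link:

    import usb.core

    # FX2-style RAM load: vendor request 0xA0 writes bytes into the device's
    # address space; 0xE600 is the FX2's CPU reset-control register.
    FIRMWARE_LOAD = 0xA0
    CPU_RESET_REG = 0xE600

    dev = usb.core.find(idVendor=0x04B4, idProduct=0x8613)  # a blank FX2 board
    if dev is None:
        raise RuntimeError("device not found")

    def write_ram(addr: int, data: bytes) -> None:
        # bmRequestType 0x40 = host-to-device vendor request
        dev.ctrl_transfer(0x40, FIRMWARE_LOAD, addr, 0, data)

    fw = open("fw.bin", "rb").read()
    write_ram(CPU_RESET_REG, b"\x01")        # hold the CPU in reset
    for off in range(0, len(fw), 1024):      # control transfers go in chunks
        write_ram(off, fw[off:off + 1024])
    write_ram(CPU_RESET_REG, b"\x00")        # release reset; firmware runs

Until that dance happens, the hardware is an inert lump.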
And the great thing for Apple is that when they put out the iOS N.(M+1) "update", everyone rushes to download it onto their iDevices, which immediately update the firmware on the Lightning adaptors; these will now respond to the iDevice with the correct response and everything is fine and dandy in the reassuringly expensive iWorld .... of course, if anyone has broken ranks and bought a cheap third-party cable from eBay, then that will probably have the initial handshake with the iDevice hardwired into it and won't be able to be updated to the new version, so those users will find they can't connect any more.
I'm half asleep but if I'm reading this right, you're saying there's going to be an update released for a cable?
Do you take it down the Apple shop and plonk it in a special update socket or something to do that?
Please don't tell PC World about this - they'll start claiming SCART sockets have computers inside them in order to justify flogging them for three times as much as a crappy old Freeview box to pensioners with old tellies!
Though on the other side of the argument, it allows Apple to say that they can control what is connected to their iDevices .... as a user this seems a bad idea, but content providers may see it as a positive, and it may partly explain why the iPhone/iPad seems to get media-streaming apps first.
The part you are missing is that, pay attention now, IT IS NOT A VIDEO PORT. Did you get it? It is a generic, multi-purpose, serial link providing raw information.
This significantly simplifies the internal design of the device, lowering its cost and failure potential. One single port, as opposed to myriad dedicated sockets.
The thing to keep in mind is that not everyone will require wired video output (indeed, most within the Apple ecosystem will just stream wirelessly). They will be spared the expense and complexity of having a dedicated port for it.
Those that need such a thing can buy an adaptor that implements the necessary transcoding of the signal. Being controlled by software means that it can easily be upgraded and improved.
It is a rather clever and elegant solution to future-proofing the device.
dZ.
Can you please help me find the dedicated and complex video, charging and data ports on my Galaxy S2? I've asked around at work, but since none of us are Apple shills we're having trouble finding anything except the microUSB/MHL connector.
Future proofing: to add complexity to ensure a device cannot be used in the future?
I get your point and agree with you, the idea of a generic bus with specific adapters is a reasonable idea, but future proof? It's barely current proof.
I really don't see the tragedy in having two ports rather than one, but if it is THAT distressing and adds so much extra cost to a device with a >30% profit margin, then by all means use a single port - but ffs get it right. If you are going to reinvent the wheel, make an effort to at least make sure it rolls.
> The part you are missing is that, pay attention now, IT IS NOT A VIDEO PORT. Did you get it? It is a generic, multi-purpose, serial link providing raw information.
Like USB? Except not standard.
They could have just put on a standard port and not re-invented the wheel, defied the EU, or screwed over their customers. They could even have left the old connector on the new phones.
You know it's bad when Archos manages to do something better than Apple.
A cable virus! Load it into the cable and it back-hacks the iDevice.
I believe that there was some concern about FireWire some time back, and some speculation that Lightning may be vulnerable in the same way through DMA. Anybody remember whether those fears were proved groundless?
Mind you, as the software had to be loaded from the iDevice in the first place, you would need to get it past Apple's app police.
Well, as it seems perfectly possible to get some mug to "refund" a payment made with a rubber cheque, that should be the simple bit.
If I^Ha hacker were really clever I^Hthey would set up an enterprise flogging Pi's at 150 quid a pop first and win on both the swings and the roundabouts.
They spend all their time working on their "image" and here's the best irony I've ever seen for Apple!!
It seems to me that their whole "owning an Apple product makes you cool" and "it just works" philosophies are now gone.
I've been telling the idiots round the office this for years but do they listen!!
Even Microsoft wouldn't get something this fundamentally wrong, and everyone still holds the '80s against them for being THAT wrong!
Ditch Apple guys ... and do it fast, we owe Steve Jobs that much at least!
...it would be seen as the holy-grail to all cabling problems. Because Apple have their gritty hands on the technology, they're seen as evil and this solution shouldn't exist. If one cable could have different personalities (like this one), bring it on. You could in theory add an extra port to an iDevice and have a backup if one port decides to die on you.
Just to round up, anything pro-Apple on the reg is an automatic down vote.
quote: "If Samsung did this...
...it would be seen as the holy-grail to all cabling problems."
Samsung already do do a limited version of this. There is 1 (one) headphone socket and 1 (one) micro USB / MHL socket on the Galaxy S3 I am currently looking at. Since iDevices usually have a headphone socket on them, there is in total exactly the same number of ports.
The main difference is that microUSB and MHL are standards (although the S3 uses a not-quite-standard MHL pinout), but they only deal with three things: charging, data transfer (USB bus) and video transfer (MHL). Note that these only require passive cables, which makes it cheap for the end user. Lightning is a generic data bus, so you will need active adapters (i.e. a cable with a computer in it), but it gives as many options for output as, err... there are options for output.
The fact that they are selling it as a single cable, rather than a box with a multitude of output cables, is slightly more telling. If it were me I'd be marketing it as the one-size-fits-all solution and have HDMI, Ethernet, USB, eSATA, RS232 and any other serial output already on it (or as plug-in cables to a proprietary connector on the box, if that is more your thing), with the caveat that apps will need to have the firmware available to be able to use it. Having an ARM-based active adapter for a single use case seems like it's being artificially limited; if you are already adding a computer with infinitely updateable "firmware" (it's more like a software download every time you connect it), then why only have one type of output hardwired in? To make people have to buy a second cable for a second output type?
If Apple had used a standard like MHL but created a proprietary connector for it, all the Apple haters would be out in force. But here we see the Apple haters praising Samsung. Double standard much?
They reduce the cost of manufacturing iDevices and simultaneously prevent unlicensed adapter cables from working. They can presumably fuck up unlicensed add-ons that do work by releasing firmware/OS updates that mess with the Lightning port's protocol.
That said, it is an elegant solution to the problem of interfacing devices, and I do like the idea of an any-way-round connector (micro-USB is a bit small for a keyed connector, and too easy to break); but it would have been better implemented by creating an industry standard with minimal cross-licensing to encourage adoption, I think.
How is this really any different to plugging in a USB module to deliver some functionality not included on the device itself?
One can argue whether this is the best approach or not, over how well a particular module functions, but the principle is entirely sound.
I'm going to have to add an external adapter to get my VGA monitor connected to a Raspberry Pi, so I find it a little hard to criticise Apple in this respect - and before anyone accuses me of fanboism, I don't own a single Apple product.
> How is this really any different to plugging in a USB module to deliver some functionality not included on the device itself?
It's only compatible with Apple products.
I can use an OTG cable on an Android device and use it to swipe the USB NIC that's hooked to the back of my Mac. Or I could plug in a USB hub and really go to town.
An Apple only version of a common standard interface sounds like nonsense out of the 80s.
Didn't HP get spanked for anti-competitive practices doing the same thing with ink cartridges for inkjet printers?
Seems pretty low down and nasty to prevent aftermarket competition on simple cables.
Now I could see a use for an intelligent "Universal Serial/Parallel Adapter" cable that just knew what it was connected to and made the translation.