Dare I ask
How well supported is Thunderbolt in Linux? I would expect it to be better after Intel have made the spec license-free but how good is the support at the moment?
Intel is pushing its Thunderbolt 3.0 USB port as the de facto input mode for future PCs. Chipzilla announced on Wednesday that it will be throwing its weight behind an effort to make the next-gen port standard the biggest thing since USB. Possible applications include solid-state hard drives and 4K video cameras. "Intel's …
Who cares? Nobody except techies uses Linux on the desktop anyway. And Linux will catch up - it usually does (sometimes eventually - recurrent nightmares of trying to make in-game voice comms work because the audio framework was a huge pile of doodoo just kicked in - this was about 10 years after Windows could handle more than one audio channel).
Actually, at the moment, Linux has less desktop market share than it did 5 years ago. I wish that wasn't true, but it is.
To answer the OP's question: Thunderbolt 3 works pretty well for me under Ubuntu. Other distros may be different and compatibility may be mixed, but the devices I've tried have worked.
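If you want to see what the kernel actually makes of a Thunderbolt device, everything it enumerates shows up in sysfs, so no vendor tool is needed. A rough sketch along those lines - assuming a 4.13+ kernel with the thunderbolt driver loaded and the attribute names that driver documents (vendor_name, device_name, authorized):

```python
#!/usr/bin/env python3
"""Rough sketch: list the Thunderbolt devices the Linux kernel has enumerated.

Assumes a kernel with the thunderbolt driver (4.13 or later), which exposes
attached devices under /sys/bus/thunderbolt/devices/. Attribute names are as
documented for that driver; adjust if your kernel differs.
"""
from pathlib import Path

SYSFS = Path("/sys/bus/thunderbolt/devices")

def read_attr(dev: Path, name: str) -> str:
    """Return a sysfs attribute's value, or '?' if the file isn't there."""
    attr = dev / name
    return attr.read_text().strip() if attr.exists() else "?"

if not SYSFS.exists():
    print("No thunderbolt bus in sysfs - driver not loaded or no controller present")
else:
    for dev in sorted(SYSFS.iterdir()):
        vendor = read_attr(dev, "vendor_name")
        device = read_attr(dev, "device_name")
        # 'authorized' stays 0 until user space (e.g. the bolt daemon) approves the device
        authorized = read_attr(dev, "authorized")
        print(f"{dev.name}: {vendor} {device} (authorized={authorized})")
```

On Ubuntu the bolt daemon (and GNOME's settings panel) normally handles the authorisation step, so this kind of poking is mostly useful when a dock or drive refuses to appear.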
I know more non-techies than techies that use Linux (not claiming that's a general thing, mind, just true in my case). Non-techies just want something that works with the least hassle, and Linux fits the bill beautifully except for hardcore gamers (and even there things have improved greatly these last few years).
@Don Dumb
I care as well. I should really stop trolling Linux threads with my tongue firmly in my cheek.
There are certain Linux obsessives and just anti-MS bods on here (not you, as I remember) who should stop trolling Windows threads so aggressively.
I would love Linux to work on the desktop. It's the best solution. But whenever I have tried it, it's failed. Whenever I have tried Microsoft, it's worked. And yes, I know that's because of market power and OEM support and application developer support and all that shit. MS works with my hardware because hardware makers make Windows drivers. Peripheral providers provide drivers for Windows. My TV maker supplies a Windows app but not a Linux one.
It's a classic example of monopoly economics.
==
And the Linux community just faffs about (yes, aggressive). Apparently you can have a thread on here about which roll of Linux you should go for. Get rid of the choice.
There's not a which Windows roll? Or a which Mac OS roll? Or which Android roll? Or which DB2 roll? Or which Cisco IOS roll? Or which AIX roll? Or which Oracle roll? Or which . . . you see my point. These successful platforms succeed because they are reliable, uniform, consistent, supported.
==
Let's imagine a world where:
- All Linux devs work on the same OS
- That OS comes in different editions depending on your needs. The Media edition comes with better tools than Adobe provides. The gamer edition is barebones and highly performant. The Office edition comes with . . . . MS Office. The server edition is a barebones install with prompts for what you need to be running.
All these devs supporting multiple distros. Why not one distro to rule them all? That's what MS does, what Apple does, what Google does, what Oracle does, what IBM does. And these guys succeed. It's why Red Hat succeeds on the server. For Linux to succeed on the desktop it needs to adopt the same policy.
Why doesn't Linux adopt the Microsoft strategy: Embrace, Extend . . . and then give the Extinguish a rest. I am totally talking about desktop here. Linux has won for critical applications in the server space. Unfortunately for Linux, very many server apps aren't critical. Linux devs need to embrace and extend that as well.
"I use it everyday on my iMac..."
And what a royal pain in the arse that implementation has been, using the Mini DisplayPort cable 'as the carrier', with slight differences between DisplayPort / Thunderbolt 2 / Thunderbolt 3 implementations across the Mac range over time.
You can't just buy one type of Mini DisplayPort / Thunderbolt cable to connect different Mac/iMac devices.
Target Display Mode (the best feature of the 27" iMacs, allowing you to use one as a monitor) only works on the 2009/2010 iMacs, for DisplayPort-to-DisplayPort outputs (not Thunderbolt). You can't connect DisplayPort to Thunderbolt (on newer iMacs), or Thunderbolt to DisplayPort. There are problems with Thunderbolt 2/3 to Thunderbolt 3/2 in this regard too. You have to buy the specific cable to do the job, DP to DP or TB to TB, which looks and fits exactly the same as the other but has different markings/specs.
It's a confusing mess for the end user.
I use FreeBSD and Thunderbolt 2 works on it. I'd imagine that Intel won't have an issue with getting Thunderbolt 3 supported in Windows, Linux, macOS and at least FreeBSD either. Now the question remains whether anyone will use it, as far as the manufacturers go.
Intel making it royalty free will probably spur adoption of Thunderbolt for some devices. Blazing fast external SSDs, maybe. I really don't think it's going to be the resounding success Intel are hoping for, but I'm sure it will find a place.
Right at the moment it's really kind of a pain finding anyone using devices that come with it unless they're using brand-new Apple iJunk, so honestly Intel's got their work cut out for themselves. Intel does have the money to waste on it though, so maybe they'll force it into products if it isn't successful on its own. Hard to say with them.
Again, who cares? Not because nobody uses Linux, but because I assume Intel making it royalty free would make it a hell of a lot simpler for either chipset makers or a 3rd party to make a decent driver package without having to worry about license implications, which seems to have driven a lot of the closed-source drivers of the past.
Making Thunderbolt royalty free will help with adoption. I've seen very little high-end equipment for Thunderbolt, and nothing was worth paying extra for. With USB 3.1, there are even fewer reasons to pay extra for Thunderbolt. And since Thunderbolt 3 uses the same cable as USB Type-C, how many people would know if it was Thunderbolt or regular USB?
I think Intel saw the writing on the wall. AMD EPYC has a very real and legitimate shot at stealing server market share. It is very unlikely AMD motherboard makers would pay a royalty fee for Thunderbolt. Without this move, I believe Thunderbolt would go down in history as a technology that was nice but not that nice. Now it has a chance.
> "Making Thunderbolt royalty free will help with adoption"
Various motherboards have had Thunderbolt on them since Sandy Bridge, either natively (e.g. ASUS P8Z77-V PRO THUNDERBOLT), or via add-in cards (e.g. ASUS ThunderboltEX). What was irksome was the requirement to carry a DisplayPort signal - if you were using an add-in card, this meant looping from the DP output to the DP in on the exterior of the card, which feels and looks something of a kludge (fine if you want it to carry a DP signal, but annoying otherwise).
USB 2 is good enough for most peripherals but USB 3 allows for more power which means it can also drive machines. Of course, this just means that Apple can save money by reducing the number of ports they provide…
To really break through, the new port has to be adopted by the phone makers. If I were Intel I'd have pushed to drop the draft recommendation by the European Commission allowing chargers to be either micro-USB or USB 3…
> USB 2 is good enough for most peripherals but USB 3 allows for more power which means it can also drive machines. Of course, this just means that Apple can save money by reducing the number of ports they provide…
That's a perfect example of the confusion all this produces.
USB 2, USB 3.0 and USB 3.1 are essentially line protocols, at 480Mbps / 5Gbps / 10Gbps respectively. (The latter two are also known as USB 3.1 gen1 and USB 3.1 gen2, and also as "SuperSpeed" and "SuperSpeed+", just for additional confusion.)
USB A, B and C are connector types; there are many variants of A and B. You can have USB 3.0 or 3.1 on a USB A connector, but you can also have USB 3 on a USB C connector (not sure if it's only 3.1 allowed)
At the other end, the familiar B connectors are USB 2 only. There are special connectors for USB 3 (e.g. "USB 3.0 Micro-B SuperSpeed", often seen in USB 3 external drives)
USB A can only be a power output, and USB B only a power input.
It's the USB C connector which lets Apple reduce the number of ports, rather than USB 3. (A MacBook Pro 2015 has USB 3, but uses USB A connectors.)
It's also because the USB C connector can do multiple duty as a Thunderbolt 3 port. Or as a Displayport port. Or HDMI. Or as a power input. Or as a power output.
If you plug your USB C phone into a USB C laptop, it's anybody's guess whether the phone will charge the laptop, or vice versa.
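One way to cut through the protocol-vs-connector muddle on a Linux box: the USB core reports the speed each device actually negotiated, regardless of what shape the plug is. A small sketch assuming the standard sysfs layout (the 'speed' attribute, in Mbps) - an illustration, not a definitive tool:

```python
#!/usr/bin/env python3
"""Sketch: print the negotiated USB speed per device, independent of connector shape.

Relies on the 'speed' attribute (in Mbps) that the Linux USB core exposes under
/sys/bus/usb/devices/; the tier labels are the usual marketing names.
"""
from pathlib import Path

TIERS = {
    "1.5": "Low Speed",
    "12": "Full Speed",
    "480": "High Speed (USB 2)",
    "5000": "SuperSpeed (USB 3.0 / 3.1 gen1)",
    "10000": "SuperSpeed+ (USB 3.1 gen2)",
}

for dev in sorted(Path("/sys/bus/usb/devices").iterdir()):
    speed_file = dev / "speed"
    if not speed_file.exists():
        continue  # interface nodes don't carry a speed attribute
    speed = speed_file.read_text().strip()
    product_file = dev / "product"
    name = product_file.read_text().strip() if product_file.exists() else dev.name
    print(f"{name}: {speed} Mbps -> {TIERS.get(speed, 'unknown tier')}")
```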
Additionally, there's no way to tell if a port is USB-C or Thunderbolt 3. They both look the same. The current crop of MacBooks has a single USB-C port. It isn't a Thunderbolt port. I mistakenly tried to connect one to a MBA Thunderbolt port yesterday for Target Disk Mode.
Am I the only one to find this rush to replace every single connector with yet another usb-c alternative mode somewhat troubling and confusing?
So far, a USB-C connector may be (possibly a combination of):
1. a variable-voltage power-input point
2. a variable-voltage power-output point
3. good 'ole USB1/2 data port
4. USB3/3.1 data port
5. analog audio port
6. Display Port video connector
7. MHL video connector
8. HDMI video port
9. Thunderbolt general bus port
10. Possibly PCI or Ethernet connector
To make it nicer, it may also NOT be any of the things above - only a subset of 1 and 2, plus 3 and 4, is mandatory; the rest is up to the actual implementation.
To make it more exciting still, a USB-C cable is no longer a passive component and can actively affect which capabilities of the connected devices can be accessed.
If I have a computing device with an HDMI connector, an HDMI telly, and an HDMI cable, then I am reasonably sure plugging the cable and fiddling with the controls for a while will let me display the picture on the telly, possibly with crappy resolution due to the HDCP copy-protection.
Now if I have these same devices with USB-C connectors and a USB-C cable, I really have no clue anymore. Does the computing device support any form of video output on USB-C? Perhaps. Does the telly support the same video-input format? That's a definite maybe. Is it supported by the cable? Possibly, although it may also decide to fry either or both devices by fiddling with the power-delivery voltages.
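For what it's worth, on Linux you can at least ask the kernel what it thinks a given Type-C port is doing. A sketch against the USB Type-C class (kernel 4.12+); the attribute names used here (data_role, power_role, svid) are the documented ones, but whether anything shows up at all depends on the hardware exposing a port controller:

```python
#!/usr/bin/env python3
"""Sketch: dump what the kernel's USB Type-C class knows about each port -
its current power/data role and any alternate modes that are advertised.

Assumes the typec class (kernel 4.12 or later) and its documented attribute
names; machines without a visible Type-C port controller show nothing here.
"""
from pathlib import Path

TYPEC = Path("/sys/class/typec")

def attr(entry: Path, name: str) -> str:
    f = entry / name
    return f.read_text().strip() if f.exists() else "n/a"

if not TYPEC.exists():
    print("No Type-C port controller visible to the kernel")
else:
    for entry in sorted(TYPEC.iterdir()):
        if "." in entry.name:
            # e.g. port0-partner.0 - an alternate mode, identified by its SVID
            print(f"  alt mode {entry.name}: SVID {attr(entry, 'svid')}")
        else:
            # e.g. port0, port0-partner, port0-cable
            print(f"{entry.name}: data_role={attr(entry, 'data_role')} "
                  f"power_role={attr(entry, 'power_role')}")
```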
Call me old-fashioned, but I much prefer designs which make it physically impossible to assemble a non-working configuration. Sort of like the connectors we had before ...
Like SCSI-1?
That would depend on the flavour of the connector. The "Centronics" 2-row 50-pin SCSI-1 connector was a wonderfully robust beast - you could literally lift an entire 50-pound system by a cable attached to it and have neither come to any harm. Sun's 3-row DB-50 was far less robust, and I won't touch Apple's square version - far too easy to damage.
However, I definitely do not miss fiddling with terminators.
SCSI-1 was a perfect example of how badly people could fuck things up, and that was just between SCSI and printer ports.
USB-C/TB3 is a bloody nightmare, because the best you can do is trust that the entire chain of vendors all collectively know what they're doing.
Anyone who was born more than 5 minutes ago would know what a good idea (not) that would be.
It's a great idea in concept, but I'm going to wait for reality to come crashing down first before I start depending on USB-C gear.
Like SCSI-1?
(Whistles innocently as he walks off behind the racks of old Sparc-1 pizzaboxes..)
..at least nobody is going to wander off with my box of various SCSI/Ultra-SCSI cables without me noticing. ....or needing a forklift.
Most SCSI problems are due to termination anyway.
1) Never trust device's internal termination - use external terminator if you can
2) Use active terminator instead of passive if you possibly can.
Which is why people still sell monitors with only a D-Sub (aka VGA) connector and (some) things still come with PS/2 ports. Both are cheap, effective options that just work, and courtesy of being around forever you always have plenty of stuff in the back of a drawer that can work with them.
Linux doesn't instantly support bleeding edge hardware, mainly because being maintained the way it is, it's not in bed with the hardware vendors the way MS and Intel are, for example. And when support is added, it usually works fairly well. Before the flames begin, raise your hand if you've been peeved by plugging a mouse or flash drive into a Windows box and waiting up to 5 minutes for Windows to figure out what driver to use, and occasionally still failing, causing you to play "port roulette" and try another port just to make it work. Plug something into Linux, and if it's supported, it comes up nearly instantly with no fuss.
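If anyone wants to watch that "comes up nearly instantly" behaviour as it happens, udev broadcasts the hot-plug events as they arrive. A quick sketch using the third-party pyudev package (an assumption - any udev binding would do):

```python
#!/usr/bin/env python3
"""Sketch: print USB hot-plug events as udev reports them.

Assumes the third-party pyudev package (pip install pyudev). The property
names available per event vary by device, so the output is best-effort.
"""
import pyudev

context = pyudev.Context()
monitor = pyudev.Monitor.from_netlink(context)
monitor.filter_by(subsystem="usb")

print("Waiting for USB plug/unplug events (Ctrl-C to stop)...")
for device in iter(monitor.poll, None):
    if device.device_type != "usb_device":
        continue  # skip per-interface events, keep one line per gadget
    model = device.get("ID_MODEL", device.sys_name)
    print(f"{device.action}: {model}")
```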
Cable "standards": I don't much care what is agreed on for the next gen ports, but please agree on something and stick with it. And the trend for everything to be ever smaller while fitting more functionality into a cable is disturbing. How many of these wonder cables will you need to replace often because everything is crammed into such a tiny connector that is anything but robust? USB-C hasn't been mainstream that long and we already have devices with flaky ports, and the cables fall out so damn easily. And charging or powering a device with ever smaller wires and connectors when many devices have 2+ amps of draw is getting a bit scary too. Yes it works, and there are safeguards, but not a lot of margin for error.
As a final complaint, what's with the disappearing of all but 1 or 2 ports on machines these days? I understand cost savings and you can only put so many ports on a dinky little tablet, but for those of us that actually USE a machine for more than updating Facebook or occasionally transferring photos, it's anathema. Would you move into a house that only had one electrical outlet in the kitchen? /rant
"Would you move into a house that only had one electrical outlet in the kitchen?"
If you REALLY need additional ports, they would respond, "Get a powered hub!" And to use your kitchen analogy, you would not believe how many places I've seen using multi-plug orange extension cords strewn about the place...yet they STILL pass inspection.
@GingerOne
It's an easy mistake to make, because in the past, Intel would give Apple exclusive rights to the tech for the first few months before letting anyone else use it. Thunderbolt has always been an Intel standard, and actually an excellent standard, but the amount of money that Intel tried to extort from people basically guaranteed that nobody other THAN Apple would use it.
USB-C is the next-gen USB connector. Thunderbolt is essentially an external PCI-E bus, so it can carry just about any sub-protocol you want, whether it's DisplayPort, HDMI, USB, Ethernet, X-modem, or whatever.
There are a couple of nice overviews from 2014-2015 here:
http://www.tomshardware.co.uk/usb-31-usb-type-c-refresher,news-51106.html
https://www.cnet.com/how-to/usb-type-c-thunderbolt-3-one-cable-to-connect-them-all/
And some of the pain this causes, especially around different but identical-looking cables:
http://blog.fosketts.net/2016/10/29/total-nightmare-usb-c-thunderbolt-3/
There is so much confusion here, and I think Intel are specifically trying to do this! "Thunderbolt USB" my arse. USB is a standard - they should not be allowed to misappropriate it in that way (even if Thunderbolt can carry USB signals over it).
I for one haven't used Intel chippery in years, so no way I could have Thunderbolt, but I have plenty of USB ports that are all fine.
Also - the letters in USB are for the end connectors and the numbers are for the generation - so we are actually talking about USB 3 compatibility (i.e. the newest and fastest) with USB C cables (i.e. the new multi-directional connectors).
Just my 2 pennies' worth