It opened the wrong way
Necessitating that all apps be customized for the platform so they don't get split in half or only work on half the screen.
Why would you do that, I asked at the time.
It seems no one did...
Microsoft has confirmed that Windows 10X, its somewhat streamlined operating system initially designed for a new wave of dual-screen mobile PCs, is being killed off. It appears the Redmond giant doesn't want the OS's features to languish on whatever dual-screen slabtops are or become available, and so it will put some of its …
No, they probably don't. There are so many aural brevity codes to choose from that different organisations could not communicate with each other with full clarity when Hurricane Katrina blew in. This prompted FEMA to nerf the 10 codes and replace the critical ones with one or two words; that was back in 2005, I believe.
Once you get away from the 10 codes, there are also many others to pick from: https://en.wikipedia.org/wiki/Brevity_code
heh, you motivated me to look those up.
("that's a big 10-4, good buddy" - that song by C.W. McCall)
I love codes like this, although when I had my ham licence the Q codes were my favourites*. Anyone who complains about text speak should try learning Morse. The endless stream of shorthand there is absolutely mind boggling.
* Not that I can remember them now, of course. It was many years ago, and I've since moved country, so I'd have to redo all the exams if I wanted a licence again, and that feels too much like hard work to me.
No, I think Danny Boyd has the right of it in terms of engineering concept. Windows treats multiple displays as one extra-wide display. Here, rather than having the OS treat the twin screens as an extra-wide screen, have the OS treat the two screens as one above the other for the UI. It's not that hard to have two or more touch-screens, although I have no idea how Windows would react to them together (probably by barfing). You'd have the on-screen keyboard always defaulting to the bottom screen. I'm just surprised that the lot at Microsoft are having to strain so hard here.
I'm beginning to think that there are some serious hard-coded choices in the UI software architecture inside Windows that are blocking what should be something rather easy to do, if Windows' internals weren't so brittle. I've never gone that deep into the UI as it's not something I need here. Brute-force display defaults have always been good enough.
It doesn't really treat multiple displays as one extra-wide display, unless you are referring to the windowing coordinate system? Other than that, each display is pretty much separate, and the "clever" happens when the OS has to map a window across multiple displays - that's when things get quite interesting, as it's effectively multiple windows which have to share the same identifier despite being separately rendered. It's all down to the compositor, of course, and it's likely why Microsoft decided to ditch the hardware acceleration: while good, it was going to cause problems if the hardware for the displays was different - which was a very easy use case to hit.
Windows doesn't treat multiple displays as a single wide monitor at all. It never has. It treats each additional monitor as a separate, independent display that is positioned relative to the primary monitor. (Which is why it's relatively easy to split a physically ultra-wide monitor into several virtual displays that Windows sees as separate entities.)
The *desktop* is treated as a unified space within those monitors (except when the desktop is mirrored to all displays, of course), with the option to have the taskbar visible on all or just one monitor.
Applications generally work within the context of a single desktop that is a single connected space, but (usually) maximise to the monitor they occupy the most. In fact the only thing that an application can't do at the moment is maximise a single window across multiple displays (as far as I am aware), and not many applications have the ability to occupy multiple monitors with different data displayed in each - but that's an application defect, nothing to do with the OS.
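To make that concrete, here's a minimal, untested C sketch against the Win32 API. The output formatting is my own, but the calls (EnumDisplayMonitors / GetMonitorInfo) are the standard ones, and they show each monitor coming back as a separate device positioned within one virtual desktop whose (0,0) is the primary monitor's top-left corner:

#include <windows.h>
#include <stdio.h>

/* Called once for each monitor attached to the desktop. */
static BOOL CALLBACK MonitorEnumProc(HMONITOR hMon, HDC hdc, LPRECT rc, LPARAM lp)
{
    MONITORINFOEXA mi;
    mi.cbSize = sizeof(mi);
    if (GetMonitorInfoA(hMon, (MONITORINFO *)&mi)) {
        /* rcMonitor is in virtual-desktop coordinates: the primary's
           top-left is (0,0), and a monitor placed to the left of or
           above it gets negative coordinates. */
        printf("%s: (%ld,%ld)-(%ld,%ld)%s\n",
               mi.szDevice,
               mi.rcMonitor.left, mi.rcMonitor.top,
               mi.rcMonitor.right, mi.rcMonitor.bottom,
               (mi.dwFlags & MONITORINFOF_PRIMARY) ? " [primary]" : "");
    }
    return TRUE; /* TRUE = keep enumerating */
}

int main(void)
{
    EnumDisplayMonitors(NULL, NULL, MonitorEnumProc, 0);
    return 0;
}

Two 1080p monitors side by side come back as two entries, e.g. (0,0)-(1920,1080) and (1920,0)-(3840,1080): one coordinate space, but two independent displays.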
If you run at least some implementations of the X server on Windows, it will treat multiple displays as one extra-wide display, and that is far from a great user experience.
Windows does not do that, which is the right choice most of the time when you have actual separate monitors. But where you have, for example, an older 8K display which presents itself as 4 x 4K displays, each fed with its own HDMI cable, or where you want to create a video wall with maybe 4 or 9 thin-bezel display panels, then it does become a problem.
It absolutely doesn't treat them as the same screen, because you can do weird things such as mix and match colour depths.
Of course, far too many programs only ever look at the primary display device's details, and don't offer the option to specify (for full-screen apps) which display is used, but the option is there.
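And the missing option really is small. A hedged sketch, assuming the app already has its window handle and an HMONITOR for the user's chosen display (both placeholders here, supplied by the caller):

#include <windows.h>

/* Cover the chosen monitor with a borderless window - roughly all a
   full-screen app needs to do to honour a user-selected display
   instead of always defaulting to the primary one. */
static void FullscreenOnMonitor(HWND hwnd, HMONITOR hMon)
{
    MONITORINFO mi = { sizeof(mi) };
    if (!GetMonitorInfo(hMon, &mi))
        return;
    /* Drop the frame, then stretch the window over the monitor's
       rectangle (in virtual-desktop coordinates). */
    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);
    SetWindowPos(hwnd, HWND_TOP,
                 mi.rcMonitor.left, mi.rcMonitor.top,
                 mi.rcMonitor.right - mi.rcMonitor.left,
                 mi.rcMonitor.bottom - mi.rcMonitor.top,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}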
Decades ago, when Windows 2000 was the newest, I experimented with multiple desktops. I wanted to see if I could have applications running on one that were separate from the other. I discovered that SEAT COUNTING was behind this - you could not get the Start menu to run properly on the second desktop without having multi-seat [like terminal server, basically].
Otherwise, you could run applications there if they were "aware" enough to open up on the other desktop. But it wasn't very useful because of what I just described...
I don't know how Windows 10 manages multiple desktops now [probably some 'soft' way that hides some windows and makes others visible]. In theory, though, you should be able to have one set of desktops for one monitor, and another set for another monitor. That capability has been in the NT kernel for a LONG time. NT 4 had it.
BUT... with the way they handle seat licensing, it's effectively "brittle".
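For what it's worth, that NT-era capability is still exposed through the desktop APIs. A rough, untested C sketch (the desktop name and child process are arbitrary choices of mine; the Start menu and licensing caveats described above still apply):

#include <windows.h>

int main(void)
{
    /* Create a second desktop on the interactive window station. */
    HDESK desk = CreateDesktopA("SecondDesk", NULL, NULL, 0,
                                GENERIC_ALL, NULL);
    if (!desk)
        return 1;

    STARTUPINFOA si = { 0 };
    PROCESS_INFORMATION pi;
    char deskpath[] = "WinSta0\\SecondDesk";
    si.cb = sizeof(si);
    si.lpDesktop = deskpath; /* run the child on the new desktop */

    if (CreateProcessA(NULL, "notepad.exe", NULL, NULL, FALSE,
                       0, NULL, NULL, &si, &pi)) {
        SwitchDesktop(desk); /* make the new desktop the visible one */
        WaitForSingleObject(pi.hProcess, INFINITE);

        /* Switch back to the default desktop before cleaning up. */
        HDESK def = OpenDesktopA("Default", 0, FALSE, GENERIC_ALL);
        if (def) {
            SwitchDesktop(def);
            CloseDesktop(def);
        }
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    CloseDesktop(desk);
    return 0;
}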
>Nah mate, when we say dual-screen laptop or slabtop, we mean a laptop that has 2 screens and folds up, like two touchscreen tablets hinged together --- not a multiple monitor PC.
A laptop with two screens is just a single instance of a multiple monitor PC with a predetermined functionality mapping - something the third-party multiple monitor utilities have been able to do for years on Windows/Linux.
Not sure what the relevance of being able to fold up is, unless you are referring to the screen switch that occurs on some devices with both internal and external screens (the internal screen is hidden when the device is folded). Given how clunkily Windows has handled screen rotation (landscape->portrait) and the switching of external screens/projectors from the outset, I suspect any capability to handle the more useful multiple-screen implementations on modern mobile devices is probably best left to third parties and/or other OSes.
There is a huge range of functionality that a genuine multi-display box could offer, rather than simply providing more screen real estate which you have to arrange manually all the time. There have been some valiant attempts to provide software offering such features, but generally these fail as they cannot keep up with changes in the underlying OS.
>There have been some valiant attempts to provide software offering such features, but generally these fail as they cannot keep up with changes in the underlying OS.
That has been MS's intent since the early days: they don't want third parties to enhance the Windows experience. Remember the third-party alternatives to the Win3 Program Manager? They'd much rather you suffered the more restricted and clunky MS-bundled functionality.
When I first started using CodeView for Windows I had to have a 2nd (monochrome) monitor. I always thought it would be cool if I could somehow use that with regular windows applications.
So it made the case for dual monitors with separate function, for debugging at least. Until it didn't (MSVC).
Even when I have use of the system console, I have been plugging at least one "dumb terminal" into a serial port and throwing it a login since the early BSD days at Berkeley. I still do this with modern BSD and Linux systems. Handy for all kinds of things. For example, it's nice to have a friendly login prompt if/when the GUI goes TITSUP[0]. I do most of my serious writing on it as there are fewer distractions with a CLI. It's also a handy place to send stderr when debugging. Etc. Recommended.
[0] Total Inability To Show the Usual Pr0nictures
"Microsoft had promised fast system updates, improved security, and other features primarily for dual-screen tablet-ish devices."se
Why should the 99% of Windows 10 users who were not using a dual-screen tablet be running a slower-to-update and less secure version of Windows just because they only had one screen?
So it does make sense to kill 10X and to implement these features into the main Win 10 OS. Unless of course they never really existed outside of marketing BS and in reality they were too hard to implement without breaking compatibility with older Windows programs?
Maybe peak Microsoft, but secure it wasn't. At the time a 56K modem was the best internet connection most people could hope for, and at that speed downloading the critical security patches post-installation took longer than the average time to being pwned while connected to the net.
-A.
I heard that a lot back in the day, but empirical evidence suggests it was hyperbole at best, and deliberate FUD at worst. I personally installed thousands (possibly tens of thousands) of copies on home and small office computers in and around Silly Con Valley with no problems.
Besides, most folks who had the sense to install Win2K at home knew to patch the system before plugging it into the net. Multi-use (no license required) CD images of the current Microsoft updates were readily available ... For example, Fry's Electronics had them available for free on CD, no network connection required ... And of course, installations at work were always patched through the local patch server prior to being allowed out on TehIntraWebTubes at large.
Note that I hate Redmond just as much as the next geek/nerd, but how about we stop promulgating the myths? The truth is bad enough to damn them without re-inventing history.
Faux, failed exec marketing speak is generally an alarm bell. People who write like this have only duplicitous motives in mind!
In my experience, directors want to know who, what, why, when and how to get results. Folks resorting to fluff - meh.
That'll be about 9/10th of Microsoft, then.
> you try to move a window between them and then one of the parts on two screens is hidden
Your Apple bashing seems to be based on gibberish. What does "one of the parts on two screens" mean? Even Windows doesn't allow overlapping screens - which is the only way something could appear on both (mirroring aside).
MacOS, capital M, refers to the classic Mac OS of the 1980s and 90s. macOS, lower case m, is what Apple is calling OS X and OS 11 now. The OP stated ‘MacOS’. He was incorrect.
Even if he actually meant macOS, he’s still wrong; the multiple monitor Mac Pro across the way from me right now has no problems displaying windows which reach over two monitors.
Capitalisation notwithstanding, "Mac OS" (with a space) was the predecessor to the current macOS which was originally named Mac OS X. ;) Which is, of course, the version I meant. :)
And I didn't say it didn't work and I never said it had problems splitting windows across two monitors (that was someone else), I just said Windows is better. And it is. Buy an ultra widescreen monitor and try and get macOS to treat it as two (or more) virtual displays. It simply can't be done.
I used to really like using Macs. That was back in the time of System 6 and 7. They were great - providing you hit a version number that worked. Windows 95 came out six months after I stopped using them and it was definitely not as good. However... since then, Apple have severely dropped the ball when it comes to usability and functionality. Windows has far surpassed anything Apple ever had, in my opinion. I would argue that is, in part, due to Apple's decision to switch to a Unix-derivative OS. It's an antiquated architecture and API that is being kept alive with hopes and prayers a lot of the time as much as anything else. But it's also due to Apple losing a lot of their original designers and engineers who really understood the basic principles of interaction design and how to apply them. MacOS these days is clunky and awkward compared to what it used to be - doubly so when its Unix roots are showing and you have to dive into a shell to perform what should be simple operations.
The dual-screen devices were never a sane option. They were a hack job created by some techie or engineer who thought it would be "cool" but had no concept of market acceptance or viability. People who use cell phones and laptops are not interested in "bulky", and you can't double the screen space without making a device bigger. It may be a clamshell device to save space while moving the device, but that won't help when using it.
I have a dual-screen desktop, but I rarely even turn on the second monitor; it is an old 1080p that I use specifically for ensuring tested apps are running on a normal user's "native" resolution. My "real" monitor is much bigger, newer, and nicer.
To be honest, I've never seen anyone BUT a developer or an engineer use a dual-screen setup. Even gamers that get into multi-monitor displays are usually techies for a career...
>To be honest, I've never seen anyone BUT a developer or an engineer use a dual-screen setup. Even gamers that get into multi-monitor displays are usually techies for a career...
Then you need to look in a vaguely modern office where it's becoming pretty much standard to have at least two monitors on each desk. Combine these with a laptop and a user can either have three displays (one not so optimal for use) or close the lid and have two displays. All very simple and once a user has used multiple displays it's very hard to switch back to a single display.
>All very simple and once a user has used multiple displays it's very hard to switch back to a single display.
Actually, it was very easy to switch back...
As a long-term mobile user (the Compaq Portable was my first mobile computer), I did experiment with external displays but soon gave up due to lack of consistency etc.:
Use laptop on train in the morning - no external monitor
Use laptop at hotdesk in base office - some had external monitors but no consistency in resolution.
Use laptop at station/airport/in airplane/on train ie. whilst on the move - no external monitor
Use laptop at client - typically no external monitor
Use laptop at home - got external monitor but that means only using the laptop at a specific desk...
Easier just to get a laptop with a decent screen and use that most of the time. Now I use an external monitor for specific tasks where I have some control over both where I perform said task and the availability of hardware.
However, having just put together a system with 4 x 4K (43") displays, I do appreciate that having lots of screen space on large screens is quite nice, and wish my laptop could at times provide similar in a more portable form...
>However, having just put together a system with 4 x 4K (43") displays, I do appreciate that having lots of screen space on large screens is quite nice, and wish my laptop could at times provide similar in a more portable form...
I think you've just proved the point!
Yes, when mobile with a laptop we can't expect multiple monitors and therefore we have to make do with just one. It's annoying though...
Yep.
There was an edict of 'one monitor per computer' made by the powers that be at [RedactedCo], because a LOT of people who didn't need dual monitors for their job functions wanted one because it looked cool. This led to a LOT of anguish for the graphic designers, who used the hell out of their multiple screens for things like video editing and content testing, and in certain parts of the IT department, because it effectively cut our productivity in half.
Marketing got around it by ordering gargantuan monitors; parts of the IT department got around it by switching to laptops and using the laptop screen open plus an external monitor. A few of us also have a second computer for things like group policy testing and as a jump box to remote into for certain operations, so there's a second monitor on those desks as well...
I'll largely agree; being limited to just the laptop screen is rather annoying, but usually I don't care because if I'm not docked, I'm usually somewhere troubleshooting something with the laptop...
Can confirm: we have well over 100 dual-screen setups in the company I work for (and quite a few triples), and we don't have any engineers or developers, just office people and medical techs! For most any situation which requires multiple windows to be open, having multiple displays is a huge productivity booster. I absolutely hate using my laptop, not because it's bad, but because it only has one display!!
Errm… around here, people who use multiple monitors include:
* The Art Dept. They do things like sticking toolbars and such on one smaller monitor while doing actual work on the other.
* Accounting. They have monster spreadsheets which won’t fit onto one monitor unless it’s huge, and sometimes not even then.
* IT. Multiple monitors for remote ops and server ops; the servers themselves may be headless, but the server admins will have multiple monitors, the better to keep straight which server they're working with.
* Admin. Copying and pasting in multiple documents becomes much easier with two or more monitors.
Yes, they could use just one monitor, but that would hurt productivity. Monitors, even good monitors, are not as expensive as they used to be. Having space for them is now the critical problem.
We saw one of our competitors doing something that looked cool so we thought we'd throw lots of cash at what was essentially a vanity project and, being Microsoft, tried to shoehorn Windows into a use-case that it was fundamentally unsuitable for.
Until someone with slightly more sense canned the whole thing.
Oh well.
Consider yourself lucky to have not encountered the massive WinHype around the cloud of fast-evaporating vapor that was WinFS. IIRC it lived around the time of Vista, and trying to put it into Vista and then yanking it helped make Vista late and awful. It was supposed to be an all singing, all dancing, super file system. And then it suddenly wasn’t.
... That Windows 8, at least in its .1 incarnation, is still in support (of a kind), and so a viable alternative to the to-be-avoided-at-all-costs Windows 10 abomination. For a short while, at least.
Put Classic Shell on it, and don't move your mouse too near the right edge of your (primary?) monitor, and it's almost usable.
-A.