On a side note....
.... application icons are becoming less flat and more colorful again.... hope the rest of Windows 10 follows soon.
8313 posts • joined 28 Feb 2010
The problem with part-time jobs is that they tend to be either menial tasks or very high-level ones - which may be your case.
Or maybe something you can do on the side, like documentation, translation, or graphic elements.
Otherwise it may be difficult to build a team where part of it may not be available when you need it, and there's always the need to ensure someone else is fully aware of what that part is doing and can fill in whenever needed. You may also wonder how committed someone is to a given project, and evaluate the risk that they give up in the middle of one.
So yes, you may find some offers, but they are not as common as you might wish.
In some ways open source did undercut jobs for full-time workers - the more code you can borrow from somewhere, the fewer skilled people you need to employ.
Not every company has positions for someone who wants to fill just 30% of one. Some tasks can sometimes be assigned this way, but often they simply can't.
Moreover, if companies start to fill roles in an Uberesque way, people will find themselves continuously begging for jobs at reduced wages - with those who have other revenue streams and work as a pastime at an advantage.
Then we will stop looking at slavery as something of the past, and start looking at the slavery of the future.
Here the underlying problem is the continuous switching of development tools and frameworks, which evidently somebody with enough power at MS is always able to force through, despite the serial failures.
It didn't impact Windows only because most developers understood that following MS marketing is just suicide, and kept on developing Win32 (or .NET) applications, ignoring the latest shiny-shiny idea from MS.
Let's see what happens now that the same people are trying to kill Win32 and force developers towards new frameworks that I'm sure will be changed every 12-18 months.
"10S" is now only a mode you can switch out from (if you have the permissions). It just became a mode of organization that don't want to allow user to install software but from an approved source.
No different from, e.g., ChromeBooks used in a school.
Yes, it depends on the pro, and what kind of work he's doing. Those working on location and shooting non-repeatable events (even down to weddings) can't risk having a faulty camera. Others have far fewer issues and often work with far older cameras.
More and more cameras have GPS built in, which allows the compass to record even the direction the camera was pointing - something you can't do with an external GPS. And more and more cameras can connect to phones via Bluetooth and WiFi to share images on social media and the like. But that's rarely a killer feature for many photographers, especially those who shoot RAW and need time to process their photos, although Adobe is porting its tools to tablets and phones too. And while you can edit well enough on an iPad, it's not as good on a phone.
Pros who need to publish an image quickly usually have a "back office" ready to ingest, process and publish the images sent to them via FTP or HTTP - high-end cameras have dedicated transmitters to allow that over WiFi or even Ethernet.
One issue is that many "apps" made by camera manufacturers are far from good - they see them as a cost that doesn't bring in much revenue. It's probably better if they just implement the image-transfer part and let other, better tools kick in for editing.
If they are standard B/W films, or color ones using the C-41 (negative) or E-6 (positive) processes, it's still relatively easy to find labs to process them. If you live in a large enough city you can probably still find a shop that can help you; otherwise you can send them to a lab by courier or mail.
If they are Kodachrome, nobody processes them anymore (though some labs just develop the layers as B/W images, AFAIK - the colors were added while developing the film). Some older processes are still handled by a few labs worldwide.
After so many years, color films will quite probably show color shifts, as the three layers may have aged differently - especially if the film was not kept refrigerated.
35mm film required good lenses both for shooting and printing. It was a time when large-format prints were usually contact prints only, while 70mm required only small enlargements. Many camera lenses were very simple designs, and thus cheaper, but often slow - and not good at all for producing images that needed to be enlarged. I'm not saying medium and large format films have no inherent advantages, but 35mm was still large enough (going below it ended in several failures).
Film back then was probably already ahead of lenses in terms of resolution. But cinema already required good lenses, fast enough and sharp enough to shoot well on small film.
Leica could create a usable 35mm camera because it had the expertise to create the required lenses. It also quickly created a versatile camera with interchangeable lenses and a rangefinder for precise-enough focusing, which would otherwise have required a fixed camera and a ground glass.
I'm sure many camera makers of the time were terrified by the need to compete with Leica and Zeiss in lens design. Not surprisingly, neither of them was born to make camera lenses, but scientific instruments.
Today phone makers believe they can use software to bend not just light rays, but the laws of physics. Let's see....
I'm not sure a phone today is quick enough to frame and shoot the HCB way. Surely Capa would need something more versatile.
The SLR became mature at the end of the '70s, when it added full-aperture metering, replaced unreliable and delicate meters with silicon ones, and shutters became metallic and vertically operated.
Sorry, the 6D is a cheaper build, but the 5D line is very well built and has a magnesium-alloy body, although covered by paint and rubber because the camera is "weather-sealed". It doesn't make me regret my old T90 (nicknamed "the tank"), and doesn't feel like a plastic toy (unless you used this: https://shop.usa.canon.com/shop/en/catalog/miniature-5d-mark-iv-model-camera).
And not the 5D alone - comparable models from other brands are built more or less the same.
Sure, they are no longer made of stainless steel, lead, wood and leather...
The problem for Olympus is that "mirrorless" cameras and their lenses can be made quite small while still using bigger sensors. Look at Fuji, for example, or some Sony and Canon (the M line) cameras.
Many lenses for "35mm" are huge because they need retrofocus designs to allow for the mirror-box space. Remove it and you can use far smaller lenses with the same aperture. After all, rangefinder cameras were compact while still using 35mm film.
The "tiny film" was the same already used for cinema, where no one complained about it being too tiny (and cinema used it vertically, with smaller frames....) - and their lenses too.
Anyway, it's true the 35mm rangefinder gave less control over the image - Cartier-Bresson and Capa could take photos previously not possible (a viewfinder makes the camera far faster to use), but Weston or Adams could not have worked with 35mm rangefinders. Some issues were solved only when SLRs became available.
As for ergonomics, the large-format cameras of the time, and even the medium-format ones, were not examples of ergonomic design - whatever needed to be operated was wherever it was simplest to put it (it's still that way).
Some current DSLRs are excellent examples of ergonomics - it's just a pity that the cost savings of touch displays will mean dials and buttons get replaced with on-screen widgets far less practical and quick to use.
No, the problem is just that the market is shrinking and there is space for fewer and fewer players. Canon, Sony and Nikon now have most of the market share. They have all been riding the "digital market" for many years now, since well before people started to use phones.
Nobody really cares about "open standards and open applications" except perhaps some of the 4% of Linux users.
Adobe still has the lion's share of digital photo processing, and it's far from "open". Many photographers are Apple customers, and Apple is far from "open" too.
Like it or not, photographers usually don't really care about open standards - just look at all the proprietary lens mounts; every attempt to make a standard one failed. Nor has Adobe's DNG gained much support over proprietary RAW formats.
And when you're handling your camera you can't really use a phone at the same time. Moreover, a good camera can last several years (of course a professional will need to replace them more often) - what is the average life of a phone and its apps? How long before the new shiny-shiny phone no longer works with a camera bought five years earlier?
I believe what they couldn't foresee was the monetization of "user-generated" content (the business then was selling internet access only) - and the content that monetizes best and is most addictive is often that which breaks the law: from the copyright infringement a lot of YouTube is built upon, to objectionable content aimed at children (and often adults too...), or whatever can generate "clicks" and "ad impressions" anyway.
Thinking that companies shielded from any responsibility will self-moderate and cripple their own business is wishful thinking. They will only act when the outrage grows too big.
It's necessary to decide who will bear the responsibility for content that breaks the law - the publishing platform or the publishing user (and the latter must then be identifiable).
Publishers may decide to bear responsibility for the content, shielding users they believe need to be kept anonymous - just like the press always did before - but being responsible they will then have the incentive to verify and moderate the content, even if it means losing some money.
Still, people are usually asked to prove they can drive a car and understand enough about it and the driving rules before being allowed to drive one. Funnily, it doesn't happen with computers.
Nobody is asked to be able to build or repair a computer, but they should be expected to know how to "drive" one, including some security basics. Or do you mean drivers should be exempt from, say, knowing how to properly park on a slope? Or when tires should be replaced?
Another clueless user who believes Exchange is just a mail server - and that "proprietary" protocols don't exist in FOSS software too. "Proprietary" is whatever is not issued by a standardization body. If developers had to wait for an approved RFC or ANSI/ISO standard for every protocol, IT would have gone nowhere. Exchange, like Notes before it, needs a specific protocol to expose its features, and the old email-only protocols are not enough. A protocol can be both proprietary and published (Exchange protocols have been published for years now, although they are not free to use).
Funny thing: how much money do universities get from their patents and copyrights, down to their sports teams? Or is it OK when the money is used to pay university people, as if companies could be charities? In Ray Stantz's words in Ghostbusters: "Personally, I liked the university. They gave us money and facilities; we didn't have to produce anything. You've never been out of college. I've worked in the private sector. They expect results."...
Why has no university written a great web conferencing/messaging application and donated it to the world? Why do people with such ideas immediately leave the university and set up their own companies to make money???
Thank heavens. But Apple was boasting about what people do with its devices - not about how much it blackmailed them into surrendering to Apple. Do people know they could end up paying more because of the Apple Tax? And since when were private companies authorized to levy taxes?
Anyway, quelle surprise, people use mobes and tablets for their businesses too. I would have been surprised if they used a Tamagotchi.
Who on Windows installs any browser from the store? Did they submit their applications to the store, or ignore it as most Windows developers do, since it's not required at all?
You get prompted to install Chrome anytime you open Google - and how much software has installed Chrome without user approval?
You mean Apple owns the phone even after it sold it to you? Anyway, MS never bricked any PC. Of course, if you delete, in unsupported ways, libraries the system deems critical, the system won't boot - with any OS. You were still free to install any other OS on your PC - something, again, Apple tries to forbid.
The "emeritus" professor looks quite clueless. You didn't need to remove IE either - just install another browser, nobody forbade or forbids it, and forget IE. Which eventually was what most people did.
The problem is that free software usually kills paid-for software, unless the latter has no real replacement.
There are now more software houses killed by Linux & co. than by Microsoft Windows. It is really impossible to make money in some areas now.
A false equivalence? People are so blinded by their hatred of Microsoft they can't see the landscape today is even worse. That's a fully artificial monopoly; there's nothing "natural" about it. MS owns the Windows platform just like Apple owns iOS and Google owns Android. What's the difference?
If you hold relevant patents, you can ask for licenses. Look at how much Apple owes Qualcomm. Any phone maker pays licenses if it has to, and those who can ask for the money - why shouldn't they? Actually, they damage their companies if they don't.
Apple doesn't license its platform to anybody. Google does, but only because its business is neither hardware nor operating systems, just hoarding user data for profit - so it's actually the most offensive of them all.
Sure, people at MS were stupid. They tried to hinder other companies' applications with dirty tricks - they should have just set up a walled-garden store and taken money from every other application, while enforcing that any competing browser be built on the IE engine....
Microsoft and other software CDs and licenses were sent to the CEO's assistant, who also kept them. IT made copies only of the ones they routinely used, and when developers needed something outside what IT supplied - e.g. IT only deployed Win95 machines for "hardware compatibility reasons" (and their games, we guessed), while we needed NT4 and its service packs - we were left to deal with that nice woman ourselves, a woman well known for liking to make "underlings'" lives as miserable as she could - and being unable to understand what we asked for, and why, didn't usually put her in a good mood.
She was also in charge of the office furniture - and she liked it to be "prison gray" - but that's another story....
The term must be inclusive. It should be "community", evidently. I don't know if a "community controller" implies inequality, though - and "controlling" others is a bad thing, of course. "Supervisor" might also be somehow linked to slavery, like "overseer" (the terms have exactly the same meaning) - because beyond owners and "masters" there were obviously those tasked with making slaves work, and hindering any escape, without "owning" them.
Maybe "community counselor"?
In other senses, "domain" should be replaced with "field" or "matter", I believe....
Then we'll have to abolish "authoritative" servers, because of course there was no democratic vote to establish them. And many people suffered, were denied freedom, and were killed under "authoritarian" governments.
Remember when the French Revolution wanted to rename things to make the world "better" - for example, the names of the months? It ended well, didn't it?
Language evolves, sure, but beware of those who want to force changes because they feel superior and able to make the world "better". It's no surprise Orwell made it a focal point of 1984.
The funny thing is that "blank" means "white" - see "blanc" in French, "blanco" in Spanish, and "bianco" in Italian. Both "white" and "blank" have their etymology in roots meaning "to shine". Empty spaces on paper are "blank" because they usually show "whitespace".
The Newspeak people should consult someone knowledgeable about languages. I think they are "linguistic racists", because they think only about English, ignoring every other language (LOL!)
Otherwise they should remove the word "album" too. It comes from the Latin "albus" (which means exactly "white") and was a white writing tablet. So please, start burning all your music and photo records....
Ah, and what about "whitepapers"???
There are several negative meanings of the word "white" too. For example, "to whitewash". White is often used for "empty", "blank". It's also connected with cold, fear, ageing, even death - ghosts are usually white. A white flag is a symbol of defeat and surrender. What about "white noise" instead of "random noise"?
Still, it's funny to see such furor in a country that still denies basic rights like unionizing, and where companies don't break the law by actively hindering it.
Maybe the problem is not the words "master" and "slave" - but the underlying society, which still likes slaves under a different name? And hiding the words is just a way to focus people on irrelevant issues instead of tackling the real ones?
Where's the US leader asking for labor reform in the US, bringing it up to the level of the most advanced and democratic societies? Yes, the Wall Street "masters" will suffer a little, poor lads. They will still be richer than most workers.
So what about "masterpiece"? The etymology of "master" has nothing to do with slavery. The Latin word "magister" (from which "master" comes) means both "leader" and "teacher". In Italian the word became "maestro", which is used for an elementary school teacher - or a great practitioner of an art (and the latter doesn't feel ashamed of the former use).
What about "headmaster"? Is mastering an art now wrong? What about an audio master? A master's degree? A ship's master? Should we burn every "Mastering...." book?
Also, avoid any use of "patron" and its derivatives - it comes from the word "patronus", which is inextricably connected to slavery in ancient Rome: although it means "protector", it was also the term used for the previous owner of a libertus, a freed slave. Avoid also any reference to Slav or Slavic - because that is where "slave" comes from - in Latin it was "servus" (feminine "serva"). So let's remove any reference to that too. "Servo controls"? They are exactly slaves.
How far should we go?
Master as an owner of slaves is just one narrow meaning of the term - the problem with the new Newspeak fanatics is their broad ignorance and consequently their very narrow focus on very simplistic ideas, the only ones they can master....
Adobe doesn't sell cameras or lenses - even if you take photographs with a phone you'll need to process them somehow - and Adobe has been busy porting Lightroom and Photoshop to mobes and tablets as well. Still, photographers are just a slice of Adobe's Creative Cloud user base.
I've heard anyway that many photographers are taking this time to go through the images they shot in previous years, making the edits they never did for lack of time.
And the lockdown impacted the sales of new phones as well.
Evidently, once enough Adobe users have switched to subscriptions or to something else, the curve will flatten - maybe this is just a sign it has begun. Most users on a subscription won't cancel it because of the lockdown (maybe more would if Adobe Cloud stopped working more often...), and while many photographers had a hard time (though many took the time to work on old images), other creatives were probably less affected.
LOL, they really think Google is going to pay for open source code it has no use for. And even for the code it does use, it will just do what's in its interest and nothing more. Do they really believe Google is different from Microsoft & co.? At least Microsoft's products are (were?) software applications; Google's products are user profiles to be used to make money.
Biting the hand that feeds IT © 1998–2020