We already had this story!
WANT NEW STORY! WANT NEW STORY!
*hides under the covers while crying loudly*
The small size of the image sensor and the lack of an adjustable aperture are also big problems. The circle of confusion doesn't go away just because you have a lot of pixels, nor does the amount of light hitting the sensor magically increase. The lack of aperture adjustment also hugely restricts what kinds of photos you can actually take; this is of course irrelevant, because most phonetographers don't know what that means.
The software seems to be pretty good at giving people "the photo they want, not the photo they took", which is fine if you are not really interested in photography.
I do wonder if the aperture problem is going to be partially solved using IR cameras. One could judge the depth of each point in the image, then apply blurring to simulate a different DoF. I would be surprised if we don't see that soon (or if it isn't being done already).
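As a rough sketch of the idea (names and parameters here are entirely my own invention, not any phone vendor's actual pipeline): blend pre-blurred copies of the image, picking each pixel's blur radius from how far its depth-map value sits from the focal plane.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_dof(image, depth, focus_depth, max_sigma=4.0, n_bands=5):
    """Crude synthetic depth-of-field for a grayscale HxW `image`
    with a matching HxW `depth` map: quantise distance-from-focus
    into bands and take each pixel from a correspondingly blurred
    copy of the image. Band 0 (the focal plane) stays sharp."""
    dist = np.abs(depth - focus_depth)
    norm = dist / (dist.max() or 1.0)            # 0 = in focus, 1 = farthest
    band = np.minimum((norm * n_bands).astype(int), n_bands - 1)
    out = np.empty_like(image, dtype=float)
    for b in range(n_bands):
        sigma = max_sigma * b / (n_bands - 1)    # blur grows with distance
        layer = (gaussian_filter(image.astype(float), sigma=sigma)
                 if sigma > 0 else image.astype(float))
        out[band == b] = layer[band == b]
    return out
```

A real implementation would blend smoothly between bands rather than hard-switching, but the principle is the same: the depth map stands in for the physics the tiny lens can't do.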
The bridge toll makes perfect sense. The bridge costs f(X) to maintain, where X is the number of cars per year. Unfortunately, f is not a linear function: even if nobody is going over the bridge, there is still expensive maintenance and inspection that must be done (not to mention recouping the cost of building it, which is completely independent of usage).
It's pretty intuitive that if fewer people are using a fixed-cost resource, the cost to each user must go up.
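To put toy numbers on that (all figures invented purely for illustration):

```python
def toll_per_car(fixed_cost, variable_cost_per_car, cars_per_year):
    """Break-even toll: a mostly-fixed yearly bill (maintenance,
    inspection, construction debt) split across however many cars
    actually cross, plus a small per-car wear cost."""
    total = fixed_cost + variable_cost_per_car * cars_per_year
    return total / cars_per_year

# Halving the traffic nearly doubles each car's share of the fixed cost:
print(toll_per_car(10_000_000, 0.05, 2_000_000))  # 5.05
print(toll_per_car(10_000_000, 0.05, 1_000_000))  # 10.05
```

The per-car variable cost barely moves; it's the fixed term divided by a shrinking X that drives the toll up.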
I've really had enough of salable products. The world is dramatically improved by the existence of non-salable products. Salability only demonstrates that somebody has convinced somebody else that the thing has value. Quite often, that event is completely independent of the thing's actual value.
This effect is not exclusive to the arts.
Crap, so this is why CD-Rs are so expensive here. That is several times the actual cost of manufacture. I can't believe that some of my money has gone to Kanye West because I wanted to install software from companies that no longer exist on computers that are slower than the microcontroller in my fridge.
:(
If this technology matures into something sophisticated and reliable, I might be looking into a new career in 15 years (or at least moving to something like safety-critical systems, where this will be a never-ever). Not because I think programmers will become irrelevant, but because it looks incredibly boring. I didn't get into software development so I could glue things together, I did it because I like writing code. If only the computer industry existed purely for my entertainment...
I agree completely. It's all very funny to go "haha, look et zee loyers unt zer seelee games", but the reality is that people employ lawyers for a reason. They know the ins and outs, the risks and benefits. They don't care if we think it's dumb; clearly this was worth their effort, and I expect they have a good reason (even if it is not the reason stated).
"The bottom line is that millennials and Gen Z especially have digital lives and it's natural to want to take digital representations of luxury brands, music and art into these worlds – and now they can – and this has value."
What? I have been able to bring digital representations of these things into the digital realm for a very long time. I still can. And that whole thing about equating NFTs to the internet...what an ass.
Yes yes, preaching to the choir, not saying anything new. I am just fablerghasted.
I want to acknowledge that the photo management software ACDSee put "performance improvements" front and center in their marketing this year. It was really nice to see somebody acknowledging that this is a feature people will pay for (and one that the software badly needed).
It is much nicer to use now, though there is still room for improvement.
I think everybody pretty much understands that this is stupid now. I have been working on some very ugly code lately, and generally I am committing files that are several hundred lines shorter. If I didn't refactor out all the cruft, it would have been much harder for me to actually fix/extend the software.
My boss has not complained about this, presumably because she's not a suit.
Yes, when I see a boat travelling across the ocean, my first thought is "damn, I need to invest more in IBM technology to help my business identify new challenges on the horizon".
I mean, I know the logic he is using to make that stretch, but the distance he has to reach for it is almost as long as the journey being made by the boat.
I don't (thank goodness), but any day of the week I would live in the United States instead of Russia or China. There are many bad things about modern western powers, but I could go to the central park right now and start putting up posters that say "Joe Biden is a dick and he eats your babies". I might get a telling off, I might get some individuals who want to pick a fight with me, but I will not be put in prison or assassinated.
At least a 1050 is OK for the actual modelling. Renders might take a lot longer, and she might eventually hit limits on scene size due to limited VRAM, but depending on the course she might rarely ever be running ray-traced renders or working with large, complex scenes. For just floating around the scene modifying things with the cheaper shader modes, the GPU doesn't usually matter very much.
Source: have done a lot of modelling (including a college course) on both a 1050ti and a 1060.
I have an R5K O2, it's what got me into SGIs. Unfortunately the cases are made from really brittle plastic that just disintegrates, so using them as a case is basically a no-no. I also don't know how people can ship them safely.
I just enjoy mine for the real-time video features. It looks like garbage :/
Actually, I use an Indy as my...daily driver?...SGI. It's easier to keep running because you can easily run it off a SCSI2SD, and hasn't fallen apart.
A: Keep it under your desk, force them to net into it.
B: If they do that, then for a whole host of reasons they should probably not keep their job (since at that point they are deliberately and knowingly bypassing a fundamental development procedure that is REQUIRED for the programs they write to be useful).
Imagine the conversation:
Me: Why is there 256 MB in the test PC? It was specced at 128 MB.
Dev: My code wouldn't work in 128 MB.
Me: But the user PCs all have 128 MB.
If they continue to argue with you, then they are provably an idiot and you really don't want to keep them around.
When developing Crysis: Warhead, Crytek built a "target PC" that was kinda low-mid end. The final game had to be playable on that machine. That's what I'm talking about. Before release, the dev is required to run their code on the target machine rather than their dev machine.
I must admit, I did do this to a teacher once. I upgraded to 32GB because I was running out of memory for some vector art. It didn't occur to me that the teacher wouldn't have that much, so he had to bring the file to school to view it...I still got a good grade, which was nice.
Eh, it sounds like he was basically on the money. I don't think a tech can reasonably be expected to predict "you only have half a board". He was right that the problem was between the mainboard (or whatever an HP minicomputer has) and the printer, it was just a little bit closer to the mainboard than he thought!
This also reminds me of DB13W3. I have a cable for converting it to VGA, which has a set of switches on it for changing which device it supports. I can't find the instructions right now to check, but I think it has something like 12 different "official" modes for different devices. Of course, most commonly you will just use SGI and SUN.
I am a member of a hackerspace. I wanted to get some of the old SGI kit working. I literally went through every single box that might have cables, taking out anything that could possibly be SCSI. This was not an organized space, so that was a lot of boxes.
I did eventually get it going, and now have quite a nice collection of very different cables. 50-pin, 68-pin, 80-pin, VHDCI, parallel-to-SCSI, internal and external, terminators, double-ended terminators...fak.