and of course it’s so predictable what movie executives are thinking now

confessions of a pixel pusher marketing

Look, there is a head floating, Zardoz-like, over the uncanny valley.

Interesting, and I think pretty smart, move on Warner’s part to put a little “Making of” out there.
Something tells me that Warner will not give Panavision the same nice vendor treatment that R&H enjoyed on this show.

zodiac on showreel and the conform

confessions of a pixel pusher misc

This article in Showreel describes the technical workflow of the movie I worked on over the last months.

Since it was written a couple of months ago it is missing the online part: the movie actually gets assembled via a Perl script and a database. It reads the Final Cut XML, asks the operator to load the camera-negative LTO3 tapes that the tape robots need, loads those via all available tape drives, and assembles the frames into one stream of DPX files. It takes about 20 hours to assemble a reel for the first time, but it can run unsupervised once the tape robot has the tapes it needs. Changes are faster, of course: only the missing material gets loaded. Doing the actual conform from disk takes less than 2 minutes.

For the last weeks I was watching split screens of on- and offline. Seeing all cuts line up was a very, very pleasant thing. I have handled terabytes of data before, and those were worth something as well, but an 85 million dollar movie is a different story. Finding all frames where they should be was a big relief: I never trust anything or anybody. Especially my own code is highly suspicious to me. This paranoia got me my first gray hair on this movie. But maybe it was also the reason why the ghost of the digital-movie disaster landed on another movie, not “Zodiac”. Whatever it was, it worked, and that makes me very happy. Having finished on schedule: priceless.
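The real tool is that database-backed Perl script, but the core idea fits in a few lines. Here is a rough Python sketch of the conform logic, purely illustrative: the function names, the frame naming scheme, and the Final Cut XML fields I poke at are my assumptions, not the actual pipeline.

```python
# Illustrative only -- the actual conform is a database-backed Perl script.
from pathlib import Path
import xml.etree.ElementTree as ET

def read_fcp_xml(path):
    """Yield (clip name, in frame, out frame) for every clip in the cut."""
    for clip in ET.parse(path).iter("clipitem"):
        yield clip.findtext("name"), int(clip.findtext("in")), int(clip.findtext("out"))

def conform(cut_xml, frame_store):
    """Return the DPX frames the cut needs but the disk does not have yet."""
    missing = []
    for name, frame_in, frame_out in read_fcp_xml(cut_xml):
        for frame in range(frame_in, frame_out):
            dpx = Path(frame_store) / f"{name}.{frame:07d}.dpx"
            if not dpx.exists():
                missing.append(dpx)   # only this goes back to the LTO3 robot
    return missing
```

Everything already on disk stays put; only the missing list goes back to the tape robot, which is why changes conform so much faster than the first pass.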

fxphd

confessions of a pixel pusher

Finally I had some time to check out fxphd and it looks pretty decent.

os x: copy fails after 4gb

Apple confessions of a pixel pusher OSX

When copying a huge file to a drive you can see OS X freak out with a cryptic error message, something along the lines of “Copy of file XYZ failed with error -NUMBER”.

One possible reason is that the target drive (maybe a shuttle drive) is formatted as “MS-DOS” and not as “OS X Journaled”. MS-DOS means FAT32, and FAT32 tops out just under 4 GB per file.
Command-I with the drive selected will tell you.
The file size shown while that error message is up on the screen will be pretty much exactly 4 GB. When you close the error the file will be gone. Of course nothing gets logged in the system log. Not sure if the Finder has a log.
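If you want to see it coming before the Finder does, the check is trivial. A minimal sketch; the volume and file paths are made up, and the constant is simply the FAT32 per-file limit:

```python
import os
import subprocess

FAT32_LIMIT = 2**32 - 1   # FAT32 cannot hold a single file of 4 GiB or more

def too_big_for_fat32(path):
    return os.path.getsize(path) > FAT32_LIMIT

# "diskutil info" prints the target volume's file system;
# if it says MS-DOS / FAT32, huge files will die at the 4 GB mark.
print(subprocess.check_output(["diskutil", "info", "/Volumes/Shuttle"]).decode())
print(too_big_for_fat32("/path/to/huge_file.mov"))
```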

OS X: great unix. Kinda …

where is it?

Apple confessions of a pixel pusher

shanghai -> anchorage -> indianapolis -> los angeles
that’s how far my new computer has traveled so far.
in less than 48 hours.

I still like my 1.5GB / 100GB iBook G4. It worked well during the first 9 months of freelance work.
Looks like I will get my 15″ MacBook Pro just in time for when the MacBooks come out.

software could easily suck less

confessions of a pixel pusher technology

Lazy people suck. Especially if they code something that I am trying to use.

Today I wanted Apple’s Shake to read DPX files that I had generated with ImageMagick.
The message I get is:

Dpx reader got an invalid or unsupported encoding value

I don’t mind the error. Fair enough. DPX files can have all sorts of flavors, and I don’t expect Shake to support them all. What is really, really stupid here is that the code finds a value it does not like, but that is all it tells you. If the coder who wrote this had any clue, he or she would have included the received value and the range of expected ones in the error. How about the actual DPX header field that this value originates from?
It takes even a moron only 2 minutes to code this, and it would save the whole user community, the coders, and the support people countless hours.
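Something along these lines would do. A minimal sketch of the idea; the field names and the list of supported encodings are my guesses, not Shake’s actual internals:

```python
# Hypothetical error reporting for a DPX reader -- not Shake's real code.
SUPPORTED_ENCODINGS = {0: "no encoding", 1: "RLE"}   # assumed supported set

def check_encoding(encoding):
    if encoding not in SUPPORTED_ENCODINGS:
        raise ValueError(
            "DPX reader: ImageElement[0].Encoding is %d, expected one of %s "
            "-- check the image element header of the offending file"
            % (encoding, sorted(SUPPORTED_ENCODINGS))
        )
```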

Stupid lazy people.

Adding -depth 10 did make ImageMagick create 10 bit DPX files, but Shake still was not happy.
I ended up patching the header of the files like this:

dpxfileheader.orientheader.XOriginalSize = 1920 ;                 // original image width in the orientation header
dpxfileheader.orientheader.YOriginalSize = 1080 ;                 // original image height
dpxfileheader.genericimageheader.ImageElement[0].Packing = 1;     // 1 = 10 bit data filled to 32-bit words (method A)
dpxfileheader.genericimageheader.ImageElement[0].Encoding = 0;    // 0 = no encoding (no RLE)
dpxfileheader.genericimageheader.ImageElement[0].DataOffset = dpxfileheader.genericheader.ImageOffset;   // point the element at the actual image data

and then it worked.

time code converter

confessions of a pixel pusher

Since I have to convert between 24 fps timecode and frame numbers a lot for the movie,
and since there don’t seem to be any decent simple tools for this, I hacked one in 2 minutes:
timecode <=> frame number

fps 24, nothing else. Again, a 2 minute hack.
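For the record, the whole conversion fits on a napkin. A quick sketch, hard-coded to 24 fps and no drop-frame, just like the tool above:

```python
FPS = 24  # 24 fps only, no drop-frame

def tc_to_frames(tc):
    """'01:02:03:04' -> absolute frame number."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def frames_to_tc(frames):
    """Absolute frame number -> 'hh:mm:ss:ff'."""
    ff = frames % FPS
    ss = frames // FPS
    return f"{ss // 3600:02d}:{ss // 60 % 60:02d}:{ss % 60:02d}:{ff:02d}"

assert frames_to_tc(tc_to_frames("01:02:03:04")) == "01:02:03:04"
```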

red

confessions of a pixel pusher technology

Big deal: the founder of Oakley (as in sunglasses) decided to bless the world with a new camera. Super 35 sized CMOS sensor, 2K @ 120 fps, 4K @ 60 fps, a 17,500 US$ price, done by the end of the year.

So they say.

I say: Bullshit.

Naked Emperor number 1.

Absolutely ridiculous. Of course it would be nice if such a device existed within these parameters. People want to believe in it, hence the hype. The hardcore fans can get a serial number reservation for a mere 1,000 US$.

I find it amazing how quickly clever marketing can get you such a fanboy following. As of today there is a CAD model of the camera body. Which also happens to be where the core competence of the company behind the thing (sunglasses!) ends. They say they will have a lens for 4,500. Of course: sunglass -> lens. About the same, right?

The core of the Red-1 is the ‘Mysterium sensor’, capable of shooting 4K and having full Super 35 size. Not much more is known about it. The real-life problem is that it is very hard to make a chip that works at this size. Yield becomes a real problem. Nikon just abandoned full-size chips in favor of the APS ones. That means they more or less left 40 years of lens buyers lying in the dust. If they could have avoided that, they would have. But the owner of Oakley has 1000 cameras, so that qualifies, right? Well, actually, it’s the other way round: what do you need that many cameras for? Oh, well.

Next phase: 4K @ 60 fps or 2K @ 120 fps. Whoa. First of all, those are big numbers. Secondly, they don’t make sense: 4K is not twice the resolution of 2K but four times as much. So the 4K mode needs double the bandwidth of the 2K one. If bandwidth were the bottleneck, then 2K should run at 240 fps. I think this little oversight shows how much the Red camera is vapor. And within 7 months it has to work? Laughable!

The Mysterium sensor (their words, not mine) is supposed to have 4520 by 2540 resolution. There are bigger and higher-res chips around. But this one can generate -so they say- 60 images a second. Let’s assume they use 14 bits per channel. The data flow would be 4520 (width) * 2540 (height) * 3 (RGB?) * 14 (bits) * 60 (fps) = 28,931,616,000 bits, or about 3.6 gigabytes per second.
‘Red’ is quick to say that you can compress this data. But at some point you have to handle this amount of data, within that little cage. Great. Mysterium DSP? In comparison, an HD 12 bit stream at 1080 24p results in 223 megabytes per second. So “RED” has to handle 16 times more data than the cameras used on major features right now. Cameras that cost 8 times more.
Great. Maybe “Red” should have started out with something easier, like a flying car or something.
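The arithmetic behind those numbers, for anyone who wants to check it; the 14 bit and 12 bit depths are my assumptions from above:

```python
def data_rate(width, height, channels, bits, fps):
    """Uncompressed data rate in bytes per second."""
    return width * height * channels * bits * fps / 8.0

red_4k_60 = data_rate(4520, 2540, 3, 14, 60)    # ~3.6e9 bytes/s  -> ~3.6 GB/s
hd_1080_24 = data_rate(1920, 1080, 3, 12, 24)   # ~2.2e8 bytes/s  -> ~224 MB/s
print(red_4k_60 / hd_1080_24)                   # ~16 times more data
```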

The third area is equally odd: a camera never lives alone. Lots of equipment makes it a system. Stuff goes in and out. Like sync, like timecode, like audio. Red performs a miracle again: every option conceivable is available. From a “Red raid” that can capture those 3.6 GB/s of data to an internal drive that records compressed. It’s all just there. Or, ahem, will be by the end of the year. Of course the camera supports all existing lenses. Not just a few, no, all.

“Red” is applied wishful thinking. If somebody were able to pull off such a leap ahead, then maybe they should have chosen another area to do it in. After all, the “Red” team knows as much about cameras as they do about any other topic you might pick.

Just for the record:
There will be no working, sellable Red camera operating at 4520 * 2540 * 60 fps with the promised 11-15 f-stops of uncompressed ‘444’ by the end of the year 2006. You would need eight 4Gb Fibre Channel interfaces just to transport that amount of data.

Crazy how gullible people are. I hope you come back here in 2007 and read this and compare it to the red-realities that will unfold.

zodiac on BlogsNow

coming to a museum near you confessions of a pixel pusher

It is interesting to see the things I do merge in this way: BlogsNow links for David Fincher’s Zodiac clip

“Zodiac” has kept me very busy since last Summer. Since it is the first major studio movie that has never seen tape [ except for archival ] there were lots of things to be written for it. I am actually still writing tools for it.

[software] raid 1 with debian

confessions of a pixel pusher

‘1000 marketing people at the bottom of the ocean’ -> a good start.

So, I needed a couple of servers. Had my trusted hardware vendor slap them together. They felt that I could use the ‘Intel server boards’. Since the price was OK, why not. And in the end they are not bad: 4x SATA, dual GigE, overall standard stuff. I had been told (by its builder and by the boot screen) that the motherboard would have a ‘RAID controller’.
“Cool”, me thinks: hardware RAID, one thing less to worry about. Turns out it’s a ‘marketing RAID’ that Intel presents here. There are drivers (no source …) that, together with the BIOS, fake a software RAID. Which is maybe OK for Windows, but certainly stupid for Linux: mdadm & Co. work really well and are equally well documented.

The next time sink was the fact that I ended up being conservative in picking the Debian ‘sub distro’.
Only after picking the future ‘etch’, which is currently the ‘testing/unstable’ stream, did things work fine. With this iso
I could set up the software RAID right in the install menu. Before that I wasted pretty much two days compiling kernels, installing LILO, compiling more kernels, etc. Google shows you years’ worth of hacks and workarounds. Just that they are all obsolete by now.