the accidental screamer

confessions of a pixel pusher linux technology

Needed to build a new NAS server with safe raid storage. It’s more or less a near-line storage solution, so I tried to go for the best price per Terabyte. Just before it disappears into what will hopefully be years of uninterrupted service I snatched its keys and took it for a spin on the weekend. So to speak. I am still tweaking things, but right now I get just a touch more than 600 MBytes a second sustained writes on xfs.

Which is actually quite awesome, considering that there is not a single SCSI disk to be found in the case. We paid a very reasonable price for the net 6 Terabytes we got. In theory this machine could record 3 streams of 1920x1080x23.98 10bit dpx frames. For 3 hours.
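For those who like to check the math, a back-of-the-envelope sketch, assuming the usual 10-bit RGB dpx packing of three channels into one 32-bit word, i.e. 4 bytes per pixel. The dd line is just the kind of thing I use to get at sustained write numbers; the path and count are made up:

dd if=/dev/zero of=/raid/testfile bs=1M count=100000 oflag=direct   # ~100GB write, page cache bypassed

echo "1920*1080*4" | bc -l                           # bytes per dpx frame, ~8.3MB
echo "1920*1080*4*23.98" | bc -l                     # ~199 MBytes/s per stream
echo "1920*1080*4*23.98*3" | bc -l                   # ~597 MBytes/s for three streams
echo "6*10^12/(1920*1080*4*23.98*3)/3600" | bc -l    # ~2.8 hours until the 6TB are full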

driving around with the handbrake firmly engaged

linux technology

What do I know about computers? I mean, really. So I built this rather big machine, to read along a couple of million weblogs. Needs storage. Sure. I get a 3ware raid controller. Works like a charm btw. There are more blogs, there is more spam, nothing surprising or new. The machine starts to have a load of a solid 95-100%. Well, linux should be able to deal with that, and it can. Today I tune another server and look at parameters. One of them is the scheduler that is used to do the actual IO. The default for my kernel was anticipatory. I changed that, so that
cat /sys/block/sda/queue/scheduler

reads now

noop anticipatory [deadline] cfq

And, what a surprise, the IO load starts to decline and the CPU is idle for 10-15% again! Of course this only makes sense on a server with lots of IO and database activity. That poor machine had to do stupid things for years.
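In case somebody wants to flip the same switch, a sketch; sda stands in for whatever device carries the load:

echo deadline > /sys/block/sda/queue/scheduler    # takes effect immediately
# to make it survive a reboot, boot the 2.6 kernel with: elevator=deadline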

As I said: what do I know about computers? An academic, yet interesting, question would be how many CPU cycles are actually wasted on things like this, compared to how many are needed.

limits: disk size and imagination

history linux technology

Some days things have the feel of a ‘techno groundhog day’. Once again I set up a computer. Once again it has a considerably sized disk system. It used to be that 30 Megabyte (no typo) Winchester disk. Today it’s that 8TB raid. And the problem remains the same: the tools choke on the size. I forget what it was twenty years ago. It was not as easy as it should have been. And that did not change. To cut to the chase of the technical knowledge that might be helpful now and will certainly be laughing stock in the future (30MB too big: hahaha):
Getting a 3ware 9550… with 16x500GB drives is a good idea. Fits in one nice case, and in a Raid 50 config you end up with 6.3TB usable capacity. For historical reasons it needs to run Fedora Core 4. Which is happy to find the array after the installer has been launched with ‘linux dd’ and a proper floppy drive (!!) has been inserted with the 9550 drivers. The next mistake one can make (and I sure did) is to let the installer automatically partition the drive it found. Knowing that big disk systems can be trouble to boot the OS from, I had already separated out an 80GB boot partition in the 3ware bios. The installer went along, formatted the whole thing and did its install. Which took some 6 hours, I would guess.
Only problem was that the poor thing could not boot from what it had made. The automatic partition manager was utterly confused by the size of the drives it found, but didn’t let that stop it from trying, and failing hours later anyway.
Manually partitioning the 80GB boot drive got me over that part. Having an OS to boot: priceless.
The data partition only started working after using parted and a crucial ‘mklabel gpt’. Only then would it accept the size of the partition correctly. Otherwise it was silently reducing it, and then it would fail to mount after a reboot.
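Roughly what that parted session looked like; the device name is made up and the exact syntax depends on the parted version, so take it as a sketch:

parted /dev/sdb
(parted) mklabel gpt                   # gpt label: this is what lets the tools handle >2TB
(parted) mkpart primary xfs 0% 100%    # one big data partition
(parted) quit
mkfs.xfs /dev/sdb1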

So much for the gory technical details.

The bigger problem is:

Disks have become bigger. Ever since computers have been around. Everybody knows this, is exposed to this, and benefits from it. The big question is: how can you write software that deals with the nuts and bolts of disk systems and not be freaking prepared for that? Of course that 30MB harddrive I dealt with 20 years ago would have been a bit overwhelmed to run a partition scheme that would be ready to hold 6 Terabytes. First question is: would it really? Sometimes people are scared of wasting 3% but waste the future of something. This side of the equation can be argued about.

There can not be ANY excuse for the way systems fail on bigger hard drives: numbers roll over, systems report -1600% free space. Shit like this is unacceptable. Tremendously stupid. If you code like that, then you should not code. Period.
Disks will be bigger tomorrow. Deal with it. At least create an error message along the lines of “Can not create partition bigger than 2TB” etc. Fail gracefully. You might not have the money to buy enough disks to test it, but you CAN put in checks for these limits. Nobody will slip in an extra 10% ‘integer boost’ to help your code out. The limits are what they are today. Shame on the authors of the tools for the lack of imagination. If it takes physical harddrives only a few years to catch up with their code, as they keep doing, then I am actually surprised that y2k did so little damage …
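For the curious: that 2TB number is no accident. A classic msdos/MBR partition table stores sector offsets and counts in 32-bit fields, and with 512-byte sectors the ceiling is:

echo $(( 2**32 * 512 ))    # 2199023255552 bytes, i.e. 2 TiB

Anything past that wraps around, which is presumably where fun readings like -1600% free space come from.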

all your content are belong to us

internet media technology

Entering

star trek closer video

into google will provide you with lots of different sources for a rather entertaining mashup. It’s interesting how content that is able to strike a chord with people will propagate into lots and lots of outlets. Virtually impossible to control.

red: great, no need to eat my hat

technology

I had not thought that RED would show images @ IBC. When they actually did, I thought I needed to get ready to eat my hat on that one. The footage itself I have not seen, just the YouTube version. Interesting choice of subjects, I would say.

Now, looking at the images of the actual device over here, I think that RED is back in the bin where I had them @ NAB: funky billionaire bubble stuff. No practical application that I could think of.

Looking at the thing I wonder what kind of camera operator they have in mind. By the time you are done you will have spent between 50 and 100K on the kit. You want to be caught with some funky lump of metal for this kind of money? I didn’t think so either.

if this works it will end the ‘format war’

media technology

Toshiba and Memory Tech announce an HD-DVD / DVD disc

They call it ‘triple layer’. It plays in DVD players and in the new HD-DVD players. Which is the end of the usual chicken and egg drama with new formats: not enough players, not enough discs. And it finally respects the consumers.

Every launch of new technology pretends that people have nothing in their home. Just empty shelf space to fill with their products. With a DVD/HD-DVD combo disc there would be room for a transition. If the studios released movies on these combo discs and did not charge more for them, then the format war would be over, and everybody, including Hollywood, would be happy.

Technically it is not possible for Blu-Ray to pull this stunt. Given the impact this would have if executed right, it is very surprising that the HD-DVD camp took so long to get going on this.

Update: Ars Technica has the better view on this matter.

geek fight!

communication internet technology

If you enjoy people getting at each other’s throats and like geek subjects (that’s all four of you) then you could follow the discussion between Joel and David.

I think it’s about Ruby on Rails. I didn’t read it. Although Joel usually writes nicely, and David really loves Ruby. He is the one that came up with the Quicktime screen capture movies that show how to do things with Rails. Better get over there quickly; usually smart people (they both certainly are) realize very quickly that fighting is not the smartest thing to do. I think they will find a peaceful agreement, or at least will leave each other alone pretty quickly. Then you have to turn to politics again for a good fight. Actually those are not as good, since they are deeply rooted in stupidity.

doa

Sony technology

“It’s Living”: Sony’s slogan for the upcoming PS3.

“It’s dead, alright”: me, saying that the PS3 might sink Sony much quicker than people think.

Sony said that they would have 2 Million PS3s at launch, 2 more by the end of the year and another 2 by the end of March ’07. Of course they think that all of them will be sold. There are more than 200 Million PS2 consoles out there. To launch a next gen product with 1% of the existing install base seems reasonable. You expect a healthy run on those precious devices. And they feature a Blu-Ray player. Standalone players retail for around one thousand dollars. The PS3 for only 600. What’s not to love?

A whole lot. It’s September, and officially Sony has not started to make the thing. They have ten weeks to produce two Million units of something that they have not made yet. At this year’s E3 they showed playable dev kits (think full size PC) and Blu-ray players inside of the nifty designed cases. It might very well be the shape and design of the box that will make the PS3 the disaster that might take Sony out within a few months. Please note the double might in the last sentence. Sony is a huge company, how could they fail so disastrously? It’s unlikely, but also possibly the truth.

The PS2 saved Sony. The win of that ‘console war’ helped to hide other disasters that the company experienced. At the height of the PS2-fueled Sony might, internal and external, Kutaragi could pretty much ask for anything in order to ‘secure’ Sony’s dominance of the gaming sector. Naturally the next generation console would be the battlefield that needed to be defended and won. Sony planned to throw its might around and simply went off and invented a new type of CPU. Over four years IBM invested 400 Million US$ into this thing. First application: millions of PS3s.

Now let’s suppose that things went ok, just not ideal, with the Cell. That happens. Actually all things that are visible to the general public indicate that that is what happened. And this might sink Sony quicker than anybody could imagine. In twelve weeks we will know more. In twelve weeks there should be millions of PS3s in homes around the globe, computing the hell out of everything. Sony’s PS3 is expected to ‘awe’ every viewer. It was the Japanese electronics behemoth itself that set this level of expectations in widely recognized speeches and announcements.

Back to the ‘what if the Cell was only ok, not perfect’ scenario. The XBox 360 had won the launch time race. Sony countered by announcing enormous numbers for their next generation console. Coupled with equally enormous prices. They also revealed a design. Being in the business of making decent looking things for years, it was a logical step to show the public how the thing would look.

Sony could be in the following simple situation right now: the thing simply does not work. Putting the Cell CPU in a case like they envisioned will melt it. The Xbox had thermal problems. The PS3 case is very small, and has no visible fans. Again: nobody has ever seen a working PS3 in public. Ten weeks before two Million should hit the street. The yields on CPUs and Blu-ray diodes are supposed to be very low. If the current Cell does not work in the current case, Sony does not have many options. They are notoriously bad at proactive crisis management. The last push of the release date came pretty much exactly when the console was supposed to be released.

Of course the likelihood of the instant Sony melt is not high. But it’s not entirely unlikely either. If Cell and case simply would not work together, that would explain why Sony has not started making the devices yet. Instead of admitting such simple and embarrassing reasons they might point to a shortage of blue laser diodes for the Blu-ray drives as a reason for shortages. Would make the thing more ‘precious’. And would leave less room for people complaining about their overheating devices.

Even if they get the 6 Million units made and sold by March ’07 I don’t think that Sony can repeat the PS2 with the PS3. The world has changed. PCs and their gfx cards will soon be much faster than any game console. Gaming consoles have excluded themselves from the pace of upgrade cycles that are possible in the PC landscape. Good and bad, but fatal for the PS3 in the mid future. If it’s not DOA, that is.

Update:
two minutes after posting this rant I came across this image of the PS3. Interesting amount of holes in the side there.

slim and cheap server

confessions of a pixel pusher linux technology

Of course 1.50 US$ a GB is ridiculous

But the whole concept of a 1U quad drive cheap-o system seems intriguing: raid cards are still expensive. They certainly deliver the best solution in many cases. But 3TB (4 x 750) of cheap ‘scratch space’ for data that can be recreated could certainly exist in a 1U box for a pretty sweet price point. Sacrifice 25% of the storage and you have safe space.
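The trade-off in numbers, plus a sketch of how the safe variant could be built with plain software raid; device names are assumed:

echo $(( 4 * 750 ))    # 3000 GB of raw scratch space
echo $(( 3 * 750 ))    # 2250 GB usable as raid5, one drive’s worth goes to parity

mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sd[bcde]1
mkfs.xfs /dev/md0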

And as long as Moore’s Law keeps deflating disk and system prices it is still the best strategy to buy as little storage as late as possible. To paraphrase Einstein: just not too late, or too little.

and the first use will be …

media technology

Lumalive by Philips.

Somebody will have the ‘creative’ and ‘original’ idea to sell advertisements on garments.