“Ohmygosh…I led an overview of some of my favorite Windows 7 features! It was totally informal.”

Yesterday I watched a 10-minute video from an HD camera tethered to a weather balloon as it climbed to 100,000 ft, then plummeted back to Earth. The video was a lot of spinning and whooshing.

I got through all 10 minutes of that video. I could not even get through the first minute of this video without feeling nauseated.

Revision 3: Live Streaming Has A Long Way To Go

I tried to watch a live stream of Revision 3’s Tekzilla yesterday, at revision3.com/watch.

It was absolutely painful!

I was under the impression that Tekzilla was going to a live format. To my dismay, they were simply broadcasting the recording of Tekzilla. This meant watching Patrick Norton redo portions of the show, over and over and over and over again, along with other tedious “behind the scenes” stuff.

Maybe this is just a baby step, testing the waters of their streaming infrastructure. But they’ve certainly got a long way to go before they’re on par with the old TSS.

This Might Explain Why Twitter Is Down So Often

Reblogging:

Joyent published an article a month or so ago about how they scaled a Facebook application to support millions of hits. The application, BumperSticker, simply serves customized images to users – online bumper stickers. It’s not hard, it’s not complex, and it handles around 20 to 27 million page views a day. That’s a good number by anyone’s standards.

But this dinky little Ruby on Rails app required the following architecture to do it:

  • 13 application servers
  • 8 static content asset servers
  • 4 MySQL databases

That’s a staggering 25 servers just to serve a bunch of images at a rate of no more than 320 hits per second.
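That ~320 hits-per-second figure is easy to sanity-check yourself: it’s just the daily page-view count spread over the 86,400 seconds in a day. A quick back-of-the-envelope sketch (the function name is mine, not Joyent’s):

```python
# Back-of-the-envelope check of the average request rate implied by
# 20-27 million page views per day.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def hits_per_second(daily_page_views: int) -> float:
    """Average hits/sec if the daily load were spread evenly."""
    return daily_page_views / SECONDS_PER_DAY

low = hits_per_second(20_000_000)   # ~231 hits/sec
high = hits_per_second(27_000_000)  # ~312 hits/sec
print(f"{low:.0f} to {high:.0f} hits per second on average")
```

Even at the top end, that averages out to just over 300 hits per second – and real traffic is bursty, so peaks run higher, but the average load per server across those 25 boxes is still only a dozen or so hits per second.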