I recently talked about how I implemented “LWB”, the software that powers this blog, using Go. Now I'm happy to announce that I've released it as open source - well, technically I released it a few weeks ago…
The code is all hosted on GitHub, along with a sample blog that looks very much like this one :-)
So after a bunch of feedback, today I added a proposal to the WHATWG wiki — hopefully it won't get crushed too hard and we can get this into the HTML5 spec soon.
The key points are the ability to monitor download and decode bitrates as well as the presentation statistics and an interesting little metric called “jitter” that can give the developer feedback on perceived framerate quality.
Anyhow, take a look and give feedback on the list.
A few weeks ago I noted that I really wanted to rewrite the blogging platform that I use for “Random Thoughts” from scratch. A number of reasons prompted this:
It's way too slow. Page load times are awful.
It's way too complicated. MovableType is written in a mixture of Perl and PHP, and as a platform it has become way too bloated for my own personal use. I have little clue how it works and little desire to learn.
It needs a full instance of MySQL behind it, with all the associated management headaches that come along with it.
But mostly I wanted to just write my own.
The first order of business was which language to use. I wanted to learn something new.
I first contemplated using Scala, and had actually gone down that route for a while. Unfortunately my brain exploded after a couple of nights coding.
Scala is pretty and all, but damn does it have a steep learning curve, even for someone moderately familiar with functional languages and very familiar with Java. I was spending way too long digging around in library code and documentation and still couldn't figure out how a simple generator pattern worked. Seriously, what is breakOut doing here?
scala> val s = Set(1, 2, 3)
s: scala.collection.immutable.Set[Int] = Set(1, 2, 3)
scala> val m: Map[Int, Int] = s.map(k => (k, k * 10))(breakOut)
Anyhow, I gave up. Too much reading, not enough coding.
I've been wanting to get my teeth into Go for a while and this seemed like the perfect project. After an initial ramp up reading the superb documentation and dicking around with a few toy samples I was off to the races. Code was figuratively dripping from my fingers.
One big choice was still to be made, though: What database system should I use to store the blog entries in?
On the face of it, there seems an obvious answer: Just use MySQL. The bindings exist and I know how to use it.
But then I stood back and thought about it.
There are only just over one thousand posts, and that's not likely to increase by a factor of ten any time soon.
The indexes needed are well known ahead of time: by date, by tag and by category.
So why not just store them all as flat files? And that's what I did. Each post is a separate file that contains a single JSON object representing the post. I load them all up at startup and pre-build all the indexes. Nice and fast and works like a charm - no dependencies needed.
The only slightly tricky thing I needed to write was a Python script that converts the MovableType export file to the JSON files.
When it comes to actually serving the pages, Go's built in http library was almost perfect, but I ended up using a framework developed by a friend of mine: Gary Burd's Twister. The library provides lots of fun stuff for URL handlers, pattern matching, default handlers including redirects, etc… Highly recommended.
I use Textile as the markup language for all my blog posts, so I needed a Go template formatter for it. Unfortunately none currently exist that I can find, so of course I had to write my own. This ended up taking more time than any other piece of the project, and it only supports the small subset of Textile that I actually use. I sincerely hope that someone else writes a full implementation of the Textile parser for Go so that I can throw my own away…
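For flavour, here's a toy converter covering a tiny, hypothetical slice of Textile - just *strong*, _emphasis_ and blank-line paragraphs. It's nothing like a real implementation (which would need HTML escaping, links, lists and much more):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var (
	strong = regexp.MustCompile(`\*([^*]+)\*`)
	em     = regexp.MustCompile(`_([^_]+)_`)
)

// textileToHTML converts a tiny subset of Textile markup to HTML.
// A real converter must escape HTML in the source text first.
func textileToHTML(src string) string {
	var out []string
	for _, para := range strings.Split(strings.TrimSpace(src), "\n\n") {
		p := strong.ReplaceAllString(para, "<strong>$1</strong>")
		p = em.ReplaceAllString(p, "<em>$1</em>")
		out = append(out, "<p>"+p+"</p>")
	}
	return strings.Join(out, "\n")
}

func main() {
	fmt.Println(textileToHTML("Hello *world*.\n\nThis is _nice_."))
}
```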
A key goal of this whole project was to reduce the latency of my blog down to as close to nothing as possible. Unfortunately the textile parser and page rendering take time, especially for some of the larger archive pages. So I needed to cache, but what to use?
An obvious choice here is Memcached, but there's only one instance of the server and it adds another moving piece. So what the hell, I might as well just cache the contents of each page render and the conversion of each post from Textile to HTML (as each post appears on multiple archive pages as well as its own page). A quick back of the envelope calculation yields a few tens of megabytes needed. No biggy.
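At this scale the cache itself is the easy part - a guarded map is plenty. A sketch of the idea (not LWB's actual code):

```go
package main

import (
	"fmt"
	"sync"
)

// renderCache memoizes expensive page renders in memory. For a
// thousand-post blog the whole thing fits in a few tens of MB.
type renderCache struct {
	mu sync.RWMutex
	m  map[string]string
}

func newRenderCache() *renderCache {
	return &renderCache{m: make(map[string]string)}
}

// get returns the cached render for key, calling render (the slow
// Textile/template path) only on a cache miss.
func (c *renderCache) get(key string, render func() string) string {
	c.mu.RLock()
	v, ok := c.m[key]
	c.mu.RUnlock()
	if ok {
		return v
	}
	v = render()
	c.mu.Lock()
	c.m[key] = v
	c.mu.Unlock()
	return v
}

func main() {
	cache := newRenderCache()
	calls := 0
	slow := func() string { calls++; return "<html>…</html>" }
	cache.get("/archive/2011/", slow)
	cache.get("/archive/2011/", slow) // second hit comes from cache
	fmt.Println("renders:", calls)    // prints: renders: 1
}
```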
As I was rewriting the blogging platform, I might as well do a complete revamp of the layout of the blog. So the design is a complete rewrite in HTML5, using all the key buzzwords including web fonts. Take a look at the colophon for more details.
Was it worth it?
Absolutely. The text you're currently reading came from the platform in one form or another, so it seems to be working.
The only real missing piece is on the authoring side. Right now I just run a python script to create each new entry and emacs to edit the content. Not a big deal for me, but it would be nice to provide API access so that tools such as MarsEdit (my favourite blog content creation tool) can be used with it.
And of course it needs a name. Software, I christen thee “Light Weight Blogging”.
When we moved into our place a few years ago I was happy to find that the place was wired with Cat5e, all run back to a central wiring closet in the house. Unfortunately though, every jack in the house was wired as a phone (RJ11) jack except for two - one in the office and one by the kitchen desk - and I really want RJ45 everywhere.
This was OK, as I just put the WiFi router in the closet and a 5 port switch in the office and all was fine.
It still bugged me though. I want to put a PC in the playroom for the kids (Hi Bila!) and an AppleTV in the bedroom. Problem is that the WiFi upstairs is spotty. I really wanted some wired ethernet for those devices.
Now I was on a mission :-)
The first problem was that none of the cabling used for telephones was labelled. Sigh. Why don't the builders do this? So I needed to trace the wiring. I picked up a cabling tester from Fry's (the variety where you stick a remote on one end and the main testing unit on the other) along with a crimp tool and various RJ45 and RJ11 blanks[1].
First step was to figure out the cabling. I wanted to avoid pulling all the sockets from the walls so I built an RJ45 to RJ11 cable (pretty simple really, the RJ11 is just missing the 2 outside wires from the standard 8 wires in the cable). This allowed me to connect the tester to the phone jacks.
Next I pulled each cable, one at a time, from the phone punchdown block in the closet, crimped on an RJ45 and attached the tester to it. Then I ran around the house testing each jack.
There were eight of them and of course the bedroom jack (which was the one I really wanted to find) was the last one. I got a lot of crimping practice.
And that was it really, all the cable runs are now labelled, the bedroom has a working ethernet jack and Netflix is available in the bedroom. A fun little project. Now I just need a real RJ45 patch panel for the wiring closet…
[1] Btw, I really liked these EZ Jack RJ45 and RJ11 connectors that I found at Vetco. They make building the cables a lot easier.
Every now and then, I think to myself “Steve, the blogging software you use on this site is horribly complicated. Why don't you write your own? It'd be a fun thing to build in some new language that you've been meaning to learn, like Scala, Ruby, etc… You could even host it on AppEngine!”
Then I start thinking about things like how comments are an absolute pain as you'd need to deal with XSS, etc… and I go off and think about something else.
Recently though, I thought “Comments are a pain. They currently look ugly, aren't threaded and I get an incredible amount of spam to deal with. Wouldn't it be great to offload that?”.
Nanotechnology has been a fascination of mine since 1986 when I first read K. Eric Drexler's wonderful book Engines of Creation. If you haven't read it, I highly recommend that you do. It's a wild ride through our probable future.
Anyhow, I was particularly struck by one of the paragraphs in the Wikipedia article that alludes to the class of bugs that all of us as programmers strive to avoid:
In a History Channel broadcast, grey goo is referred to in a futuristic doomsday scenario: “In a common practice, billions of nanobots are released to clean up an oil spill off the coast of Louisiana. However, due to a programming error, the nanobots devour all carbon based objects, instead of just the hydrocarbons of the oil. The nanobots destroy everything, all the while, replicating themselves. Within days, the planet is turned to dust.”
When I was in Boulder last week, a colleague mentioned that he had a Kindle, so I got a chance to play with one - it's a lot smaller than I imagined and not as bad looking as I had expected from reading reviews.
A few months ago, I posted a link to an article deriding the current focus on Java in schools. Here's a quote from that article:
…Computer Science (CS) education is neglecting basic skills, in particular in the areas of programming and formal methods. We consider that the general adoption of Java as a first programming language is in part responsible for this decline. We examine briefly the set of programming skills that should be part of every software professional's repertoire.
An interesting read. What's more interesting (at least to me) is that I used to agree wholeheartedly with its sentiments. However, I've recently been writing a lot more code in Java than C++ and interestingly, I'm getting to like and appreciate it more - appreciate its power, expressiveness and yes, performance.
Anyhow, today along came another article in a similar vein.
In essence, he said that today's Java-savvy college grad is tomorrow's pizza delivery man. Their skills are so easily outsourced that they're heading for near-term obsolescence.
Dewar stresses that he's not against Java itself. But the fact that Java is taught as the core language in so many colleges is resulting in a weak field of computer science grads, he says.
Later on, we are told:
“Furthermore, Java is mainly used in Web applications that are mostly fairly trivial,” Dewar says, with his characteristic candor. “If all we do is train students to be able to do simple Web programming in Java, they won't get jobs, since those are the jobs that can be easily outsourced. What we need are software engineers who understand how to build complex systems.”
Dewar obviously hasn't been out in industry very much recently. I know of quite a few very complex systems implemented in Java…
A couple of posts ago, I talked about the need for a new A/V receiver. The basic problem is that I now have a fair few input devices and two of them have HDMI outputs, whereas my TV only has one HDMI input. It'd be nice to route the video and audio signals through the same place.
Mismatch of what's being shown to what's being heard. You know the problem - you hit 'TiVo' on the universal remote and you get the TiVo coming out of the speakers and the AppleTV on the display. The Logitech remote handles that nicely with its 'help' button, but it's still a pain.
If you don't use the remote you've got to switch both the TV and the receiver.
My TV only has one HDMI input which means that the TiVo is plugged into that so that I can watch HD content (HDCP being required). This means that I can't watch HD content from the iTunes Movie store as it also requires HDCP - having the receiver route the video signal solves this problem.
I wanted outside speakers and to have different things playing inside and outside. I.e. the kids can watch Dora on the TV while we're listening to music outside courtesy of the AppleTV.
So I picked up the Denon AVR-3808CI and all my problems are solved. I also got a few bonuses into the bargain:
TiVo and AppleTV hooked up via HDMI. Xbox360, Wii and DVD player hooked up via component and sundry audio connections.
Outside speakers can play audio sources independently of what's playing in the living room.
Switching inputs is a breeze.
The unit has an ethernet port and can stream internet radio. You can also manage it via a built-in webserver. I must say though, the design of the web content served leaves a lot to be desired.
Nice GUI. As the unit is routing the video, it overlays its own GUI on top, such as audio levels and its setup/management interface.
The two spare HDMI inputs are also handy, as I suppose I'm going to have to get a Blu-ray player at some point…
Overall, pretty nice. It even met with spousal approval!
I did have one problem during setup though. The multi-zone support is nice and the idea was that I could be outside and listen to music from the AppleTV. Unfortunately it took me ages to get working.
The problem was that the Denon unit wouldn't let me send audio sourced from an HDMI input (which the AppleTV is connected to) out to the second zone. I have no idea why.
To solve the problem I simply connected the units using an optical cable and had the receiver use that as the audio source for the AppleTV rather than the HDMI audio source. At that point, it would route the audio outside.
Anyhow, highly recommended.
And after setting it all up, I realised that every piece of A/V gear except for the DVD player in the living room is now on the internet - the Denon receiver, TiVo, AppleTV, Xbox360 and the Wii.
On a related note, you might be thinking “Well, if the kids are watching the TiVo in the living room and you're sat outside supping on an adult beverage, how do you control the AppleTV, sucker? The interface for it is on the TV which is being used by the kids!”
Well, I have my iPhone. On that iPhone I have the 'Remote' application which lets me browse the content and control the operation of the AppleTV over WiFi. And I can do it whilst sat outside supping on said adult beverage!
I love programming puzzles - you get to stretch your coder muscles :-) Anyhow I really liked the first question in the 2008 Google Treasure Hunt - the Robot puzzle. Writing code to solve it was a nice little challenge as there's a couple of interesting real-world problems in there - especially if you write the code in C/C++…
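If memory serves, the puzzle asked how many distinct paths a robot could take across a large grid moving only right and down - a binomial coefficient that overflows 64-bit integers at the grid sizes given, which is where the C/C++ fun comes in. In a language with bignums it's nearly a one-liner; a Go sketch (the grid sizes below are illustrative, not the actual question):

```go
package main

import (
	"fmt"
	"math/big"
)

// paths returns the number of monotone lattice paths across an
// m x n grid moving only right and down: C(m+n-2, m-1).
func paths(m, n int64) *big.Int {
	return new(big.Int).Binomial(m+n-2, m-1)
}

func main() {
	fmt.Println(paths(2, 2))   // 2
	fmt.Println(paths(51, 41)) // far too big for uint64
}
```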
I thought I'd done pretty well until a colleague of mine pointed out that there was a, ahem, tangential way of solving it…
I'm a huge audio geek, but just prior to the house move last June, I upgraded to Vista. I've been very happy with Vista - it's worked well.
But with one exception.
I can't get any of my audio gear to run with it.
This hasn't been a problem up until now, as I've been way busy with other stuff. I've been letting my podcasts lapse and just noodling around with my guitar and other audio gear offline, not recording anything. But now, my esteemed amigo and I plan on collaborating long distance on some song writing.
This means I need to get it all running again.
The hub is the EMU 1820 which I use to get all audio into and out of the PC. At first there weren't any drivers at all; now the drivers exist, but they bluescreen my Vista box during install. I've never managed to get drivers installed for the UAD-1 either. I haven't even tried to get Cubase running.
Which leaves me with a conundrum.
Do I just downgrade to XP? I know it will all work, but I do love the UI in Vista. Do I actually use Vista though? Nope. Just a few games, Skype and Firefox. For everything else I use my MacBook Pro and my Linux box.
In the spirit of reclaiming as much free time as possible, I've moved this blog and my other sites to Media Temple.
Hopefully you didn't notice a thing change except for a decrease in latency - the ping times have dropped from 80 milliseconds to 40 (at least from my house). The servers also appear to be a lot more speedy.
The other advantage (I hope) is an increase in availability. My old server was forever getting crushed by comment spammers - Media Temple's grid stuff should be able to take the punishment.
The plan was to just copy over the entire installation from backups and just run it. But, of course, things never go to plan.
First up, this is an x64 installation and of course the ppc binaries won't run. No problem, thought I. I'll just apt-get install samba and all will be well.
No joy. The supported installation of samba in the latest version of debian is actually older than the version I was running on my old machine.
Ok, so we're building from source again.
Build, build, run.
Nope. I could variously get my desktop joined to the domain, then the NAS. When I finally got them both joined, I ran into the dreaded NT_STATUS_NO_LOGON_SERVERS problem. Again.
So, thought I, let's just create a fresh domain. This was actually the best solution. My only worry was losing access to data on the NAS due to permissions problems. Handily, the latest firmware upgrade to the Infrant NAS can give you (via an add-on) root shell access to the NAS, so I figured I could fix it up later.
I then brought up a fresh domain, joined all the machines and all was well. Pretty easy really. I fixed up all the permissions problems on the NAS by just ssh'ing in and running chown -R steve."domain users" share/* on all the shares in /c. The only other minor thing I had to do on the NAS was move my home domain share from /c/home/OLD_DOMAIN/steve to /c/home/NEW_DOMAIN/steve and then chown -R steve.nogroup /c/home/NEW_DOMAIN/steve.
Now, I want to be able to perform the rest of the setup remotely, so I install ssh and friends in order to ssh into the system.
apt-get install ssh
Joy, now I can login and perform the rest of the installation remotely rather than at the console.
The clock needs to be set right, so:
apt-get install ntpdate
Yes, I used Microsoft's time server - it's the only one I can remember off the top of my head!
Next, I need to get the network time service (NTP) running on the machine. It will be providing time services to all other machines on the network and periodically setting its own time against the root time servers.
apt-get install ntp ntp-doc
You'll need to edit /etc/ntp.conf and then run /etc/init.d/ntp restart to get it to notice the changes. Note that pretty much everything I talk about here has either a config file in /etc or its own directory of config files, also in /etc. They're pretty self-explanatory - just take a look at the config files themselves and the related documentation. Everything also has a script located in /etc/init.d to control its operation.
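For reference, a minimal NTP config along these lines works for a home setup - the pool server names are the stock Debian defaults, and the subnet is my assumption about a typical home network:

```
# Sync our own clock against the Debian pool servers.
server 0.debian.pool.ntp.org iburst
server 1.debian.pool.ntp.org iburst

# Let machines on the home subnet query us, but not reconfigure us.
restrict 192.168.1.0 mask 255.255.255.0 nomodify notrap

# Where to remember the local clock's drift between restarts.
driftfile /var/lib/ntp/ntp.drift
```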
For this reinstall, I just diff'd my backed up config files against the newly installed files to make sure there wasn't anything new that I needed to be aware of and then just copied my old files over and restarted the service.
Next up, bind - the DNS server. I have a local DNS domain in my house that all the clients have an entry in, the linux box serves up that domain and caches domain requests so that the only nameserver the client machines need to know about is this linux box.
apt-get install bind9 bind9-doc
Bind is probably the hardest thing to configure. Handily I had all my backups (yay, me!). I'll probably write up a post dedicated to that at some point, though one thing did bite me a little: if you're restoring your configuration files from backup and get an auth error when trying to restart or reload the server, just killall named and start it up fresh, as the authorization key in /etc/bind/rndc.key probably changed when you copied across the old data.
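For a taste of what's involved: serving a local domain (lacey.home here is made up for illustration, as are the addresses) needs a stanza in /etc/bind/named.conf.local of the form zone "lacey.home" { type master; file "/etc/bind/db.lacey.home"; }; plus a zone file along these lines:

```
; /etc/bind/db.lacey.home - minimal sketch of a home zone
$TTL 86400
@       IN SOA  ns.lacey.home. admin.lacey.home. (
                2008010101 ; serial
                3600       ; refresh
                900        ; retry
                604800     ; expire
                86400 )    ; negative cache TTL
@       IN NS   ns.lacey.home.
ns      IN A    192.168.1.2   ; the linux box itself
blob    IN A    192.168.1.3   ; e.g. the NAS
```

Remember to bump the serial every time you edit the file, or secondaries and caches won't pick up the change.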
At this point, edit /etc/resolv.conf and point the nameserver line at the localhost, 127.0.0.1 so that client binaries on the system itself use your shiny new nameserver.
Next up, DHCP. This is a little service that client machines use to get an IP address. In a home environment this is normally handled by your wireless or broadband router, but I prefer to have the server do it, as other useful information, such as name server and time server addresses, is also passed to the client. Configuration is fairly simple - check out the documentation.
apt-get install dhcp3-server
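A minimal sketch of what goes in /etc/dhcp3/dhcpd.conf - all the addresses and the domain name here are made up, but it shows how the name and time server information gets handed to clients:

```
# /etc/dhcp3/dhcpd.conf - minimal home subnet sketch
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;       # pool handed out to clients
  option routers 192.168.1.1;              # the broadband router
  option domain-name "lacey.home";
  option domain-name-servers 192.168.1.2;  # the linux box (bind)
  option ntp-servers 192.168.1.2;          # ditto for time
}
```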
Sweet! The base services are now all configured. At this point it's probably a good idea to reboot the server to make sure all these services come up nice and cleanly.
A Few Other Things That I Do
I like to be able to mount drives from other machines on the linux box. For example, my Infrant NAS exports a “backup” share that the server backs itself up to. I use autofs for this.
apt-get install autofs
Edit /etc/auto.master and un-comment the line for auto.net. The backup share is now available at /net/blob/c/backup. FYI, 'Blob' is the name of my Infrant NAS box…
After that it was just a matter of reinstalling my crontabs from backup and then this blog and a few other things are automatically backed up to the NAS. Cool. Safety is back…
Another thing to mention is that I use Amazon S3 to back up my photos and videos. The scripts that do that are written in ruby, so that also needs to be installed.
apt-get install ruby rubygems
I need rubygems installed as it brings with it the openssl ruby package.
The last thing (modulo Samba) that I need is dynamic DNS updating. I use dyndns.org so that I can have a friendly DNS name to connect to the server when I'm not at home. The linux box handles updating the DynDns database with whatever IP address Verizon happens to be giving me at the time of update. I use inadyn to accomplish this.
apt-get install inadyn
Unfortunately, inadyn doesn't come with any form of script to get it started, or any useful documentation whatsoever. So I just copied an existing script in /etc/init.d and got it going with a few minor modifications. Let me know if you're interested in a copy.
All in all, the entire process took me about an hour to get everything setup once the base linux system was successfully installed.
At the end of yesterday's post I was planning to take a trip to Fry's this morning to purchase a new ethernet card for my new machine. This morning's realization is that the new machine actually has two spare standard PCI slots and I have a load of PCI ethernet cards sitting in boxes and dead machines. One of those spare cards happens to be a trusty old Intel PRO/1000.
I disabled the motherboard integrated ethernet adapter, installed ye olde Intel card and rebooted.
Now that I have a working network during install, package configuration during Debian setup is working. I just installed the base desktop package as I'll manually install and configure everything else later and write up the process for my records and your reading pleasure.
It appears from reading various threads over the net that the Linux sky2 driver in conjunction with the Marvell 88E8056 Gigabit ethernet controller results in a steamy pile of poo.
I can confirm that this is indeed the case.
Again following up from my previous posts, the next step in home linux server resurrection was to attempt to install Ubuntu 7.10 x64 edition on my shiny new machine. This didn't get very far. A boot from CD ended up in a wedged machine no matter how I tried to run the install.
Hmmm, I think I'll go back to the trusty Debian distribution. I downloaded the latest stable (Etchy) net install disk. Boot. Joy! It all runs. Except the net card. Lots of kernel errors regarding the ethernet driver followed by a fatal crash.
Maybe I'll try Lenny, the “in test” release.
Tomorrow I think I'll be buying a vanilla net card for this puppy. Something made by Intel. For now I think I'll just install Etchy on the machine with the ethernet disabled and let the machine burn in.
Sigh - three days into this and it seems like the box isn't going to be up and running before my next trip. I thought this was supposed to be easy?
Continuing the saga of yesterday, I took the new machine back to Hard Drives Northwest early this afternoon, where they very helpfully plugged the machine in and watched it boot fine. They suggested that it might be a power problem - I had been powering the machine through a UPS.
So I went back home, plugged the machine directly into the wall and watched it fail to POST again.
Just a flashing cursor.
Then I had a little brainwave. It's plugged into a 24" monitor. Could that be it? Surely not - it's connected via VGA.
Disconnect video cable. Power on. Wait 10 seconds. Plug in video cable. Success!
Being an ex-graphics guy I've a ton of old video cards lying around, so I plugged in a spare ATI PCI Express card, plugged the monitor into that and I'm off to the races! The machine at least POSTs now and I can get into the BIOS.
In summary (and to capture this post in search results for others to find), connecting a Dell 2405FPW 24" monitor to the onboard Intel Extreme Graphics of an Asus P5K-VM motherboard appears to cause the motherboard to refuse to POST.
Following on from yesterday's post about my home server's meltdown, let's just say that the day didn't exactly end well. At around 11pm, the Debian install onto the second PowerMac hung when setting up some component. It was time to give up for the night.
This afternoon I decided that I'd had enough of trying to build a central service from recycled components and gave Hard Drives Northwest a call. I've had a number of good experiences with these folks in the past, so I thought I'd just customize one of their standard systems. I picked a Core 2 Duo based machine with 4GB of RAM and a 320GB drive. Not bad for just over $500.
Anyhow, on with the install.
9.22pm - Everything unboxed and plugged in. Ubuntu install disk at the ready. Boot!
10.31pm - The machine won't even POST. Just a single flashing cursor on the screen. I tried reseating the RAM, draining the CMOS. All to no avail. I can't even get it into the BIOS. Back we go to Hard Drives Northwest in the morning.
Sigh. This shouldn't be this hard. It never has been before and I've built plenty of machines. Maybe it's just that now I've turned 40, technology hates me.
Today I came home to find that the linux box that provides all services to the machines around the house was wedged. I powercycled it, but it wouldn't come up - the drive had suffered some kind of fatal failure.
Now, this machine is pretty central to operations here at Casa Del Lacey. It provides NTP, DHCP and DNS services as well as acting as the Windows Primary Domain Controller via Samba. Handily everything else can limp along without it, but it's not pleasant.
Good job everything backs up nightly to the Infrant NAS.
Anyhow, for a while I've been thinking of replacing the hardware that the server runs on. It's a PowerMac G5 running Debian linux, but the fact that it's a PowerPC machine rather than an x86 machine has increasingly become a pain.
Handily I have a Dell PowerEdge SC1420 (64bit Xeon) sitting around that's been switched off and gathering dust for almost two years.
So, with the PowerMac's hardware meltdown and the fact that I've got a reasonably good x86 box sitting around, it must be time for an upgrade. I have an Ubuntu 7.10 x64 desktop install disk in my hand, the kids are in bed and it's time to start!
9:00 pm - Disk in drive, power up!
9:42 pm - The damn thing won't power up. Das Blinkenlights claims a power supply failure. Poo. Not to fear! I have yet another spare machine. Last time I tried it there were “some issues”. Yup. Issues are still there. Poo2.
Without persistent storage, it would be hard to persist the backed up data to S3 as you'd have to do the file to S3 mapping (I think) by yourself. With this feature, you effectively create a blob of S3 storage and use it as a drive in your EC2 machine instance.
Actually, that's how I thought it worked all along, but was weirded out earlier today when I found out that it didn't.
Previously, I've been backing up all the irreplaceable household data (movies, photos, etc…) to my colo'd server using Scott Ludwig's most excellent Link-Backup script.
Unfortunately I've been running out of space on that server and am thinking about decommissioning it, so as the first step of the decommission I need to move my offsite backups somewhere else. The obvious place appears to be Amazon's S3 service as the price of storage seems very cheap - $0.15 per gigabyte. There are also transfer charges, but in the general case it's upload only, so I'm ignoring that. I have about 100GB to store, which works out at $15 per month.
The next thing to figure out is “how do I perform the backup?” I initially thought that the easiest thing to do would be to continue to use Link-Backup. All I need to do is bring up an EC2 instance every time I need to backup and then use Link-Backup as I had been doing previously.
The big problem with this was that bringing up an instance is not trivial. Relatively simple, but not trivial. Also, the EC2 tools require Java 1.5 to run and it's not available on PPC Debian linux. Sigh.
I then did a search around and found Jeremy Zawodny's excellent blog post on the subject, which provides a nice roundup of the available solutions. I decided to go with the ruby based s3sync.
Set up was pretty simple. I first created an AWS account, collected the required data and configured s3sync. I then created some scripts as suggested by John Eberly and I was off to the races!
I ran some tests to make sure I could:
Backup a small directory that had some hierarchy in it.
Restore that directory.
Make some small changes to the local directory and make sure that only the changes were synced.
Pull back a single file using the included s3cmd script.
It all worked perfectly, so now I have ~70GB of photos syncing up to S3.
The next step is to add everything else (home movies, etc…) to the backup.
The new Mac Pro features the latest Quad-Core Intel Xeon 5400 series processors based on state-of-the-art 45nm Intel Core microarchitecture running up to 3.2 GHz, each with 12MB of L2 cache per processor for breakthrough performance and power efficiency. With a new high-bandwidth hardware architecture, dual-independent 1600 MHz front side buses and up to 32GB of 800 MHz DDR2 ECC FB-DIMM memory, the new Mac Pro achieves a 61 percent increase in memory throughput.
Pretty cheap too, though it'll need a RAM (and maybe a video card) upgrade.
I realized recently that I do pretty much all my video and photo editing on my MacBook Pro these days, and it's a little slow. The desktop in my office is mainly for browsing and running iTunes. I'd like to be more productive with it - it's a nice machine, with lots of nice peripherals attached, but I think it might be time to finally switch it to the shiny newness.
As I posted a little while ago, I had finally had enough of the piece of technological rubbish that is the Comcast HD DVR box and purchased a TiVo Series 3 through Amazon. All that was left to do was actually have the two shiny cablecards installed by the “qualified” Comcast technician.
To be quite honest, I have no idea what they mean by “qualified”, as he took a couple of hours to plug them in, sitting on the phone chatting to a friend back at base while said friend kept pressing a button to send a signal to activate them.
My present and non-present technicians quite happily took ages to attempt this feat of technical brilliance and also failed to actually achieve it.
You see, it took my “qualified” guy eight cablecards to find two that actually worked, and he then left telling me that it was all working.
Well, at least it appeared to be working.
Later that day I figured out that you couldn't have both tuners set to channels 118 or above. One would work and the other would just show a black screen.
It actually took me a while to figure out which of the cards wasn't working correctly, but once I did I called up Comcast and spoke to a sharp lady who actually appeared to know what she was talking about. She had me pop the duff card out of the box and read her the physical serial number.
Two of the digits were transposed between the actual serial number and the number that they had in their database.
Hmm. So my original “qualified” technician didn't really check anything at all. I wonder if those other six duff cards were actually fine, and he simply couldn't read the serial numbers and gave up?
Anyhow, I thought that since Comcast had the right serial number they would be able to send the correct activation signal and all would be well?
I had to take the card to the local retail store and exchange it for another one. I have no idea why and couldn't get anyone to explain it to me.
I did as I was bid, wrote down the serial number of the new card and called up Comcast again.
“You'll have to bear with me as this is the first time I've done a cablecard pairing.”
After thirty minutes I was told that I was all set and the new cablecard should be activated within ten minutes.
No joy after another thirty…
So I call up again and speak to another seemingly intelligent Comcast droid who looks at my account, says “Umm, the signal was never sent. Hang on a second.”
Bing! All channels received!
Thank you, mysterious Comcast technician…
And as for the actual TiVo itself? What more can I say, this thing is exceptional. Snappy and beautiful UI, gobs of storage (with the ability to add more via USB), lots of online features, the ability to move recorded shows around the net, etc…
And so far, the best thing is Amazon Unbox. Ordering TV shows and movies via the TiVo itself or via the web and having them show up almost immediately over the internet is a thing of beauty.
At the moment, the wife and I are heavily into Heroes (which we missed the first time around). All delivered via Amazon Unbox.
Back in 2000, I moved into Nabila's town home and was astounded by the tiny TV and less than basic cable. I know that A/V systems weren't high on the lovely lady's priorities, but something had to be done! Along with me came my 32" TV, so her 27" was relegated to the bedroom. Next up - what to do about the pitiful amount of available channels?
Public access TV can only yield so much amusement.
A couple of years prior I got on the local cable company's digital cable beta program. The company was TCI (then AT&T, then Comcast). The main reason for this was access to BBC America and the fact that it was “digital”. All that really meant was crappier picture quality.
Anyhow, I set up a time for an install for digital cable and crap loads of channels at Nabila's place.
They never showed.
They didn't show again.
I'm done. Time for some drastic action.
I'd heard good things about DirecTV, so I bought one of the basic Sony units, a dish and a drill. I then set about drilling holes in Nabila's house, mounting the dish and really enjoying the new picture quality.
A few months later I heard that TiVo had teamed up with DirecTV to produce a TiVo unit that recorded the stream direct from the satellite - no re-compression which had been the main thing keeping me away from TiVo. Anyhow, I picked one up and was hooked on TiVo for the next seven years.
Prior to our move to the new house earlier this year I had an HD TiVo and loved it and the DirecTV service apart from two niggles:
As an early adopter of the DirecTiVo and HD DirecTiVo, I was the last person to see any of the new features such as folders. Things like TiVoToGo and network access totally passed us by.
Locals in HD had to come in via an antenna.
When we moved to the new place, I decided to go back to cable and see what I was missing. The channel lineup looked good; they provide the boxes; you pay month on month - no contract. If it didn't work out, I could also go back to DirecTV.
What I missed was that the HD DVR box completely sucks!
Laggy response, bugs, crappy UI, shitty scheduling, etc… And that was before they downgraded the box from Microsoft's software to the even more shitty Comcast software.
I'd had enough. I want my TiVo back. Even Nabila green lit the idea!
Today my shiny new Series 3 TiVo arrived via UPS. I lovingly unpacked the box. I caressed the classic peanut remote. I fell in love with the unit. I'm going gooey inside with the thoughts of TiVoToGo, sleek UI and cozy nights in.
It feels like a jilted lover has returned home.
On Wednesday the cable guy arrives with two cablecards to consummate the relationship.
A few nights ago I decided to upgrade the Debian installation on my home server. I backup all the important stuff nightly, so I wasn't too worried. However, I did take a lot of care following along with the upgrade process and intervening when needed.
Now, this server is pretty important to my home network. It serves time, dhcp, nameservers and most importantly it runs samba to act as the Primary Domain Controller for my home Windows domain. All the windows boxes, as well as the Infrant NAS are clients in this domain.
Main Vista client still on domain? Check. Can log in? Check. Can browse the domain? Check.
Can mount the photo share from the NAS? Nope.
No logon servers available.
Now I'm worried. No matter what I do, I can't mount a share from the NAS on any Windows or Mac box.
Ok, over to linux and dig out smbclient.
After some groveling around I figure out that it's a problem with the NAS. Leave domain. Join domain. It all works, but I just can't see the shares. I turn to Google: “Samba 3.0.23 NT_STATUS_NO_LOGON_SERVERS”. Nothing of use turns up.
I download the source to Samba 3.0.26a, the latest rev, build it and install it side-by-side. This way I can switch between the two revisions, so I decide to bring the domain up completely fresh in the new installation.
I can use “net getdomainsid” and “pdbedit -u user -v” to get all the old SIDs and RIDs, so hopefully everything will match in the new domain.
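Sketched out, the SID-preservation dance looked something like this (the SID value and username are placeholders, and I'm assuming `net setdomainsid` is available in your Samba 3 build to force the SID in the fresh install):

```shell
# Against the OLD install: record the domain SID and per-user details.
net getdomainsid
pdbedit -u steve -v

# Against the NEW install: force the same domain SID so existing
# permissions and profiles still resolve (SID is a placeholder):
net setdomainsid S-1-5-21-1111111111-2222222222-3333333333
```

With the SID carried over, machines and users keep their old RIDs and the ACLs on existing files continue to work.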
It all works perfectly - machines join the domain; users can login with the same SIDs (this is important so that all the old permissions still work).
Can I see the shares on the NAS?
Ok then. Let's downgrade the distribution's samba package to the previous version (3.0.14a).
Now we're screwed. It can't access the password database.
Ok, so the distribution's samba install is dead. I uninstall it and run with the source-built version in /usr/local/samba.
Now we're back to a state with fresh source and everything working except the NAS shares.
Bump up the debug level. Try connecting. Everything is working. No errors from Samba, but the NAS is still horked.
Dec 6 00:24:37 jeeves smbd[31622]: [2006/12/06 00:24:37, 2] auth/auth.c:check_ntlm_password(309)
Dec 6 00:24:37 jeeves smbd[31622]: check_ntlm_password: authentication for user [jpp] -> [jpp] -> [jpp] succeeded
Dec 6 00:24:37 jeeves smbd[31622]: [2006/12/06 00:24:37, 2] auth/auth.c:check_ntlm_password(309)
Dec 6 00:24:37 jeeves smbd[31622]: check_ntlm_password: authentication for user [jpp] -> [jpp] -> [jpp] succeeded
What changed and how do I get access to my data again?
Sounds familiar. What's the resolution?
After a lot of digging around I fixed it. During the upgrade of samba to .23c (now d) my PDC somehow “lost” the group mappings - this effectively left my network with no “Domain Admins” “Domain Users” or “Domain Guests” groups.
Re-creating those group mappings with the correct RIDs fixed the issue.
Ok, there's a lead. “net group” shows no groups on my machine and “net groupmap” shows no NT to unix mappings. Could the NAS be looking for the “Domain Users” group (as it exports a share for each user) and as it can't find it, failing with a weird error?
After much documentation reading, I find out that this is actually by design! Prior to 3.0.23, those very important groups were generated automatically by Samba. With 3.0.23, they go away and it's up to the admin to create them!
Doesn't sound like a minor-minor-point release change to me.
Anyhow, I came up with the following:
# net groupmap add ntgroup="Domain Admins" unixgroup=ntadmins rid=512 type=d
# net groupmap add ntgroup="Domain Users" unixgroup=users rid=513 type=d
# net groupmap add ntgroup="Domain Guests" unixgroup=nogroup rid=514 type=d
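A quick sanity check after re-creating the mappings - a sketch, assuming the unix groups above already exist:

```shell
# Confirm the NT group -> unix group mappings are back, with the
# well-known RIDs (512/513/514) at the end of the domain SID:
net groupmap list

# And confirm the domain now reports its groups again:
net group
```

If “Domain Users” shows up in both, the NAS should be able to resolve it again when exporting per-user shares.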
Can I connect to the NAS?
That was it.
Man did that suck.
What sucks even worse is the fact that a minor-minor-point release upgrade horks backward compatibility in such an unpleasant fashion and, as far as I can tell, without calling attention to it in the upgrade process. The only relevant documentation I can find is the “Note” in the Samba HOWTO docs:
Versions of Samba-3 prior to 3.0.23 automatically create default group mapping for the Domain Admins, Domain Users and Domain Guests Windows groups, but do not map them to UNIX GIDs. This was a cause of administrative confusion and trouble. Commencing with Samba-3.0.23 this anomaly has been fixed - thus all Windows groups must now be manually and explicitly created and mapped to a valid UNIX GID by the Samba administrator.
This ain't exactly calling the problem to attention…
Anyhow, hopefully this rambling post will be of some help to others in the future.
Oh, and don't get me wrong - I love samba. I just feel a little jilted :-)
Should Apple reduce its price on any Apple-branded product within fourteen (14) calendar days of the date of purchase, you may request a refund of the difference between the price paid and the current selling price. An original purchase receipt is required, and you must request your refund within fourteen (14) calendar days of the price reduction.
Which makes me wonder whether the Apple store employee didn't know about the company's own policy, or whether she was purposefully steering me at another, less expensive for Apple, policy…
I know that I go on and on about how much I like the AppleTV, but an interesting thing just happened.
On Monday night I was browsing some video podcasts on the AppleTV in the living room and settled in to watch Robert Scoble's interview with Marc Canter (very interesting, btw). At about 30 minutes in, I was getting tired. Not caused by the content, I hasten to add - it was 1am.
So I paused the playback, switched off the TV and associated AV gear and went to bed.
Fast forward to this afternoon. I was heading to the airport for a trip to Mountain View and on the way out of my house, I picked up my video iPod from its dock and slipped it into my pocket.
Once we passed 10,000 feet, I pulled out the iPod and navigated to the same podcast that I'd been watching on a different device the night before, many miles and feet of altitude away.
And it picked up where I'd left off.
Of course, I sorta thought instinctively that it would work. But to have it just do the right thing was priceless.
I'm in the process of moving house and one of the decisions I have to make is whether to stick with satellite, or move to cable.
Now, I've been incredibly happy with DirecTV, but I've had enough of using an antenna to get the locals in HD. At the moment I can't get NBC or PBS in HD due to issues with reception and I think that it's going to get worse in the new house.
Also, I'd like to move everything to Windows Media Center using CableCard - I think that's a ways off with DirecTV and it's available right now with cable. Excluding HD of course. Does anyone know what the timeline with Vista and HD CableCard access is right now?
Another question: the new house is wired for cat5 and coax. With cable, can I just distribute the incoming signal around the house and split where it's needed when a set-top or WMC box wants multiple physical inputs for multiple tuners?
Copy photos from camera to a temporary folder on my MacBook Pro.
Import photos from the tmp folder into an organized directory structure in Adobe Lightroom, e.g. /photos/2007/2007-05/2007-05-25-Zoo/RAW - renaming the files in the process as 2007-05-25-Zoo-001.jpg, etc…
Cull and process photos in Lightroom.
Export the surviving photos as full resolution sRGB to, for example, /photos/2007/2007-05/2007-05-25-Zoo/Processed.
Upload the processed photographs to Flickr - tag, title, permissions and licenses as appropriate.
Copy the whole /photos/2007/2007-05/2007-05-25-Zoo directory to my NAS.
Resync the NAS to an offsite server as a final backup.
Finally delete the photos from my camera once everything is backed up multiple ways.
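The archive-and-backup tail of the workflow is easy to script. A minimal sketch of the copy-to-NAS step, with illustrative local paths (the offsite rsync host is an assumption, shown only as a comment):

```shell
#!/bin/sh
set -e

# Fake up a processed shoot so this sketch is self-contained
# (in reality these files come out of Lightroom):
SHOOT="2007-05-25-Zoo"
SRC="photos/2007/2007-05/$SHOOT"
mkdir -p "$SRC/RAW" "$SRC/Processed"
touch "$SRC/Processed/$SHOOT-001.jpg"

# Copy the whole shoot directory to the NAS mount (path assumed):
NAS="nas/photos/2007/2007-05"
mkdir -p "$NAS"
cp -R "$SRC" "$NAS/"

# Final step - resync the NAS tree offsite (host/path assumed):
# rsync -az nas/photos/ backup@offsite.example.com:photos/
```

Only once both copies exist do the originals get deleted from the camera.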
I wanted to inform you that today is the most exciting day in the history of Infrant Technologies as we have agreed to merge our company with NETGEAR. We believe that our ReadyNAS™ line of Network Attached Storage (NAS) appliances fit in perfectly with NETGEAR's established brand and will continue to evolve positively under their banner (www.netgear.com).
The vision we had when Infrant was formed is driving the revolution we see today. With ReadyNAS™, anyone can have terabytes of storage at home or in the office with the same features and performance that not too long ago was only available to the enterprise. Infrant set the standard of adopting SATA and Gigabit Ethernet when others were still using the 40-pin IDE cable and 10/100 connections. We pioneered the use of the diskless model to lower the acquisition cost of a high-capacity NAS. We provided a software feature set that was so complete, people felt compelled to use our ReadyNAS™ not just in the home but purchasing a second unit for use in their office. And X-RAID™ is still the only RAID that lets users add capacity as easily as putting in four little screws on a disk tray.
Congratulations to Infrant! A great company with a great product seals the deal.
And thanks to Steve Kennedy, who pointed me at them in the first place when I was in the market for a NAS solution.
Many of today's video game luminaries cut their teeth on Sinclair computers, among them Dave Perry, who runs Shiny Entertainment, and Tim and Chris Stamper, who founded Rare.
In 1967 Sir Clive Sinclair pioneered the miniature TV
“Sir Clive Sinclair gave so many British people an incredible step up into the videogame industry, which in a few more years will be bigger than the music industry,” said Mr Perry, who began writing games as a school child on the ZX81 and became a professional programmer thanks to the Spectrum.
“Clive is a national hero,” said Mr Dickinson.
“He loved looking for technology ideas and often had an idea and had to wait for the technology to catch up.”
I completely agree with this.
The fact that the three 3D graphics API outfits (RenderMorphics, Argonaut and Criterion) were based in London and that so many of the games companies in the eighties were from the UK (e.g. Rare) is completely due to the fact that as kids, we had access to a bunch of cool, low cost tech that other countries missed out on.
Last week I found iGTD, a native Mac application that appears to do everything right. An uncluttered yet functional UI; keyboard shortcuts for everything; integration with QuickSilver; hierarchical projects; notes; and everything else you could wish for…
But most of all, it just works.
Highly recommended for you list-makers, GTD faithful and everyone who wants some sanity in their cluttered lives.
On Sunday during our regular “go do something fun” time, Julian and I headed to the local Apple store with the notion of picking up an Apple TV.
I'd been previously toying with the idea of using Windows Media Center everywhere in the house for video, photo and music playback, but it's become increasingly apparent that the only way of doing this is to go and buy more Xbox 360s for each location purely for the Media Center Extender functionality.
And there's no way in hell I'm doing that.
So, thought I, what about that new Apple TV thing? It's only $299 and basically acts like a Video iPod with a TV attached.
How wrong I was. It's way more than that.
So this evening I unboxed the incredibly sleek and sexy unit, plugged it into the TV, switched it on, typed in the wireless password (it's got built-in wireless), typed the code it gave me into a copy of iTunes running on my PC, and voilà! Instant gratification.
It just works.
Music, playlists, photos, podcasts and movies all playing on my main TV. It seems that as soon as the unit has synced the catalog of content, it'll stream the actual content across the wireless network while it's still syncing, so you get to play very quickly - a great design decision, as I'd have hated to wait for the sync to its local disk to finish before I could muck around with it.
And did I mention video podcasts?
I subscribed to a bunch of them and they were available on the Apple TV very fast - I was watching Diggnation before you could say “Digg this”.
Plus, you get movie previews, etc… streaming from the internets.
Oh, and the UI is as sexy as all hell, in gorgeous HD 1080i wonderment.
In summary, if you use iTunes and have a TV, you must run, don't walk, to your nearest Apple store and buy one now.
I gave up two years of my life writing about gadgets for this site. Waking up every morning at 5 AM, chewing up press releases to find the rare morsel of legitimate information, chasing down “hot tips” that ended up being photochops of iPods with reflections of genitals in the touchscreens. Oh, and the worst: fielding emails from PR parasites eager to suck away precious time in a half-hour phone meeting while the Senior Vice-President of Smoke Blowing tells me about how his company's software - based on an idea cribbed from Google - is going to change the way I look at something I didn't care about in the first place. (Inevitably, “forever.”)
Wonderful stuff. The comments aren't bad either:
I heard the sound of dozens of sphincters tightening all the way in Canada.
The HP Photosmart C6180 is a nice printer/scanner/fax combo. It's also small, works wirelessly and is fast.
There were no supplied drivers for Vista, but pretending that it's an HP Photosmart 7180 worked a treat1. The supplied software for the Mac worked great too.
Now, on one of our XP boxes I installed the drivers from the supplied CD and even after telling it not to install the usual crapware that comes along with such supplied CDs, the installation took forever! I mean, it felt like it took longer than Flight Simulator to install!
Anyhow, it's nice to have a new, modern, networked printer at Casa Del Lacey - it's replacing our aging Lexmark Z52. Not having to string any wires other than a power cable was a nice treat, and having it just DHCP for configuration over wireless (after having set up the WPA goop) worked flawlessly. Well, it did after I snooped its MAC address and added it to the DHCP and BIND configurations on the Linux server - users without an over the top geek habit will have no such hassle.
It also has a bunch of media card reader slots, so as a test I popped in an SD card from a point'n'shoot that I have hanging around, whereupon it displayed the images from the card on its little LCD. I selected one, inserted a sheet of 4x6 photo paper (after hunting around for the mini-tray that was hidden under the output slot), hit print and voilà! After a minute, a pretty good photo popped out.
Anyhow, it also supports bluetooth for some reason and after connecting a phone line, the fax part of the equation configured itself.
In summary, this is a nice little printer and at just over $200, it's recommended.
1 A generally workable trick - just guess at a compatible printer. ↩
Today we had FIOS installed at Casa Del Lacey - fiber into the home.
Verizon put the fiber up on the poles a while ago, but only recently began actually wiring up homes - I've been checking every week to see when I could get it.
Well, today was my lucky day when three guys in three trucks turned up to perform the install.
The installation basically consists of installing a fiber termination point in your home along with a battery backup. Your phone lines are switched to fiber at the same point, and the battery should give four hours of standby for the fiber driver - there be lasers in that there box…
Before entering a building, troops squirt the plastic goo, which can shoot strands about 10 to 12 feet, across the room. If it falls to the ground, no trip wires. If it hangs in the air, they know they have a problem. The wires are otherwise nearly invisible.
In other cases of battlefield improvisation in Iraq, U.S. soldiers have bolted scrap metal to Humvees in what has come to be known as “Hillbilly Armor.” Medics use tampons to plug bullet holes in the wounded until they can be patched up.
Also, soldiers put condoms and rubber bands around their rifle muzzles to keep out sand. And troops have welded old bulletproof windshields to the tops of Humvees to give gunners extra protection. They have dubbed it “Pope's glass” - a reference to the barriers that protect the pontiff.
There's been an odd rush by governments to move to RFID passports, even though there are serious concerns about how secure they really are. Over in the UK, where many RFID passports are already in use, a security researcher and a reporter were able to crack some aspects of the passport. It is, admittedly, a limited crack, but it could potentially be used to make a clone RFID chip for a counterfeit passport.
With apologies to Karen for two geeky posts in a row, I present “What I did last night”. I promise I'll post a picture of Julian in the next post.
As I alluded to in my previous post, I recently decided to scratch a nagging itch and setup a Windows Domain at Casa Del Lacey. I've been wanting to do this for a while, but it's recently come to a head with two Windows desktops, a linux box, Mac laptop, Windows laptop, network storage box and the Xbox 360.
Having separate accounts and associated passwords plus having to setup the desktop “just how I like it” when moving to a new machine, browser bookmarks being different, etc… was just becoming a pain. I was also relying on my wireless router to provide DNS services (which was flaky).
Something had to be done.
Initially I thought about heading over to Best Buy and picking up the cheapest machine capable of running Microsoft Small Business Server. I'd had such a great experience with it at SwitchGear that it seemed like the logical choice, but it's expensive, even at the company store.
But then I thought “I bet those clever Open Source folks have figured this all out”.
Install a DHCP server, and assign IP addresses to all the machines on the network.
Install the BIND9 name server, and have it serve up DNS locally for one of my domains, creating a 'home.judesoftware.com' DNS domain in the process.
Now, at this point I could have configured the magic whereby the DHCP server assigns addresses dynamically and updates the DNS server in the process, but that would have required setting up keys and trust, etc… between the two services, and I didn't have that much patience.
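For the curious, the static setup amounts to two small config fragments - a sketch, assuming ISC dhcpd and BIND9, with illustrative addresses and MAC:

```
# /etc/dhcp/dhcpd.conf (fragment)
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;
  option domain-name "home.judesoftware.com";
  option domain-name-servers 192.168.1.2;
}
host nas {
  hardware ethernet 00:11:22:33:44:55;  # MAC printed on the unit
  fixed-address 192.168.1.10;
}

; zone file for home.judesoftware.com (fragment)
$TTL 86400
@       IN NS jeeves.home.judesoftware.com.
jeeves  IN A  192.168.1.2
nas     IN A  192.168.1.10
```

Each new machine then just needs a `host` stanza and a matching A record, which is tedious but dead simple.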
Install samba and set it up as a primary domain controller.
This part was fairly easy apart from one gotcha that I'll get to later. Basically I'm running pretty much with defaults, the trick is getting the clients set up. The following is my configuration file (/etc/samba/smb.conf) with a few modifications to protect the innocent:
[global]
workgroup = THIS_DOMAIN
netbios name = THIS_SERVERS_NETBIOS_NAME
passdb backend = tdbsam
printcap name = cups
add user script = /usr/sbin/useradd -m %u
delete user script = /usr/sbin/userdel -r %u
add group script = /usr/sbin/groupadd %g
delete group script = /usr/sbin/groupdel %g
add user to group script = /usr/sbin/groupmod -A %u %g
delete user from group script = /usr/sbin/groupmod -R %u %g
add machine script = /usr/sbin/useradd -s /bin/false -d /var/lib/nobody %u
# Note: The following specifies the default logon script.
# Per user logon scripts can be specified in the user account using pdbedit
#logon script = scripts\logon.bat
# This sets the default profile path. Set per user paths with pdbedit
logon path = \\%L\profiles\%U
#logon path =
logon drive = H:
logon home = \\%L\home\%U
#logon home =
domain logons = Yes
os level = 35
preferred master = Yes
domain master = Yes
idmap uid = 15000-20000
idmap gid = 15000-20000

[homes]
comment = Home Directories
valid users = %S
read only = No
browseable = No

[netlogon]
comment = Network Logon Service
path = /var/lib/samba/netlogon
admin users = root
guest ok = No
browseable = No

[profiles]
# For profiles to work, create a user directory under the path
# shown. i.e., mkdir -p /var/lib/samba/profiles/steve
comment = Roaming Profile Share
path = /var/lib/samba/profiles
read only = No
profile acls = Yes
This is a pretty standard setup and it gets you some cool features:
A domain named THIS_DOMAIN. Create user and machine accounts and the world is wonderful.
Each user's home directory on the Linux box is magically available as H: on their Windows box when they log in.
Roaming profiles. More on this later.
So, setting up a new user on the domain is pretty easy.
root# /usr/sbin/useradd -g users -d /home/sjl -s /bin/bash -c "Steve Lacey" steve
root# /usr/bin/smbpasswd -a steve
Pretty simple, huh? The first command creates a new unix account (you can skip this step if the users already have accounts) and the second command adds the user to samba's domain users.
You'll also need to make sure that you add a samba account for root, as by default root is the domain administrator.
Next up, adding machine accounts. Except you don't need to. Just go to the Windows machine, and from the system control panel applet join the machine to the domain - you'll need to enter the domain account and password for THIS_DOMAIN\root that you created in the previous paragraph. All is good, just reboot the Windows box and log in to the domain!
And now the problem that I encountered.
Ahhh, roaming profiles. These are a wonderful thing. They enable your settings (desktop themes, start menu choices, browser bookmarks, etc…) to be cached on the server so that when you move from machine to machine your experience is exactly the same. It's a wonderful thing to behold (and interestingly wasn't enabled at Microsoft when I was there).
For me, a problem occurred because I didn't actually create the per user directory where the profile is stored. It's the only part of the process that isn't automated, which means that I didn't do it.
When I logged into the domain from a Windows box for the first time, Windows told me that it couldn't find the profile and was giving me a temporary one.
Ooops, I thought. I figured out what was wrong - I needed to create /var/lib/samba/profiles/steve and chown steve.users it.
So I did that, but Windows was stuck on the temporary roaming profile - no amount of restarting and rebooting either box could fix it.
The only way I could resolve the issue was by having the Windows box leave the domain, delete the machine account from samba, delete the normal unix account for it and then rejoin the Windows box to the domain.
For reference, the machine accounts are machinename$, but you can skip the trailing '$' when talking to samba:
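A sketch of the cleanup described above, assuming the machine is called vista (so its machine account is vista$):

```shell
# Remove the machine's samba account; the trailing '$' can be skipped:
smbpasswd -x vista

# Remove the underlying unix account that was created for the machine:
userdel 'vista$'
```

After that, rejoining the Windows box to the domain recreates both accounts and, in my case, finally picked up the roaming profile correctly.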
Following up from a previous post, I went ahead and ordered an Infrant ReadyNAS+. This is a [N]etwork [A]ttached [S]torage device that sits on your network and acts purely as a fileserver (well, this box can do a bunch of other stuff too, like stream audio to other network audio devices as well as iTunes clients).
I ordered the box empty of drives (but with 1GB of RAM), and ordered four 750GB enterprise class Seagate Barracuda drives from NewEgg. The drives arrived earlier this week, and this afternoon I picked up the Infrant box from the local FedEx depot.
Late this evening was setup time.
First impressions are wow! It is incredibly small - just big enough to house the drives and as solid as a rock. The build quality reminds me of the PowerMac.
Setup was extremely easy. I just followed the instructions and installed the drives, connected it to the network and then booted it. By the way, like all good hardware manufacturers, Infrant not only included all the mounting screws needed for four drives, but also included extras.
The unit comes with some software for PC, Mac and Linux, but you don't really need it. If you know the IP address of the box you can just navigate to http://[host]/admin in any browser to configure it. Before booting the unit I had added the box to my local DHCP and BIND servers as, handily, the MAC address for the unit is printed on the bottom. Anyhow, the IP address, however obtained, is displayed on the LCD at the front of the unit, so you can just use that.
The box supports domain membership, so I just joined it to my home domain, setup some shares and that was it. By default it creates a user share for each user in the domain which is nice. It supports sharing by CIFS, NFS, AFP, FTP, HTTP, HTTPS and RSYNC, and also supports discovery via Bonjour.
Oh, did I say home domain? Yup, I out-geeked myself last weekend and set up a Windows domain in my house, as there are now too many machines and running without “single sign-on” was becoming a pain. Did I mention that my Primary Domain Controller is Debian Linux running on a PowerMac G5? More on that in a later post…
Anyhow, the Infrant NAS provides a whole heap of other good features, but let's just say that the best one is that I now have a RAID 5 box on my network with over two terabytes of free space.
So I'm a devout user of NetNewsWire as my RSS aggregator of choice on the Mac. It's pretty damn near perfect but, of course, it's tied to your local machine. The subscription/read/unread sync to Newsgator is cool, but that's only really helped me when moving to a new machine.
Robert Scoble popped over to our office in Kirkland today for a video interview and a tour of the Google facilities. I followed around chatting with Buzz Bruggeman who came along for the ride, while Robert wandered, walked backwards and did the fun Scoble interview thing. He even interviewed the Chef!
It sure was nice catching up with him and it sounds like he's enjoying California and PodTech immensely.
His HD video camera was quite a feast for the eyes - Robert, what type is it?
He also gets the honour of causing the phone on my desk to ring for the first time, when he called from reception. I've had it over a week now and it never rings - 'tis all email… Dunno why I have the thing…
…the Sony HDR-SR1 camcorder that records in High-Definition, 1080i quality on a built-in 30GB hard drive (which loosely translates to around 2 hours of recording in highest quality). This should work out of the box with Windows (Vista) Movie Maker which has support for HDV editing (yes, my Mac does that too thanks).
Of course, that 30GB for two hours of video has to be stored somewhere and that leads me to the second bee in my bonnet.
The storage systems at Casa del Lacey are strained to the max. That coupled with backup mayhem has had me thinking recently about just biting the bullet and punting the problem to a dedicated storage box.
When you install IE7 for the first time it gives you the option to select the search provider for IE. The available search engines are listed alphabetically which means that Microsoft's own search service is listed below Google.
That is some style.
Next question: When I visit my own website from IE7, why does it show the toolbar thingy and say “This website wants to run the ‘Windows Media 6.4 Player Shim’ from Microsoft”? That makes my site sound dodgy, and I'm pretty sure it's not…
What most people don't realize is that in the early days of Direct3D many of the 3D accelerators (or sometimes “decelerators”1) were in fact very general purpose DSPs. At that point in time (mid-1995) many of the hardware guys were very surprised at the rapid mindshare growth of consumer 3D hardware. We're talking people like Evans and Sutherland, SGI, etc…
Regarding Direct3D and the following OpenGL v. Direct3D wars, one point I feel is worth making is that if it were not for Direct3D, we would not have the gaming platforms that we have today.
In 1995, 3D was stalled, no one was innovating and OpenGL was stagnant. It was only after the release of Direct3D that OpenGL started to make any headway.
In fact, at lunch during the hardware guys' day at the Aftermath event, we put a pre-release PS1 on the main video screen running Tekken in demo mode. You could see the palpable fear on most of the hardware guys' faces. “What is that?” “That's not realtime.” These guys, with very few exceptions, were missing the boat.
End of interlude, now where was I?
Many hardware startups in the Valley at that time were putting together DSP based ISA (or even the new PCI) boards to offload modem, audio and other functionality from the overwhelmed CPU.
These guys recognized a good thing when they saw it and decided that they could add 3D graphics too. It's all just vector math, isn't it?
The Chromatic Mpact part was a case in point. It was a modem, an audio card and a 3D accelerator all rolled into one! Step right up! Well, until you tried to run a few thousand triangles per second through it. Mind you, they had some very cool booths at the shows.
None of the parts of this generation performed triangle setup (apart from the 3DFX Voodoo 1) and they all sucked to one degree or another.
But a special place in my heart is taken by the Rendition Vérité 1000.
Rendition was a true entrepreneurial company built by engineers. Those guys were great. They were the only guys to have hardware ready for Comdex 1995, where we (or rather Ty Graham, our hardware evangelist) showed hardware-accelerated Direct3D for the first time.
Servan (one of the founders of RenderMorphics) and I had locked ourselves away in an office on campus after the Aftermath event and we had a week until Comdex. We finished the driver model, an API that could drive it and had the famous2 “tunnel” sample running on it.
But for a driver model we needed a driver and some hardware. Here was where Rendition really stepped up. One of the guys from Rendition came up to Redmond and basically lived with us. He was writing driver code while we were writing the driver model. Talk about bleeding edge.
The prototype, hot off the chip foundry, V1000 was mounted on a red prototype board with a fan glued to it. Unfortunately, when plugged into the bus on the machines we had, the board was upside down and every hour or so the glue would melt and the fan would fall off. Much hilarity ensued. Much pizza was eaten.
At four in the morning on the first day of Comdex, Ty walked into the office, already late for his flight to Vegas, and witnessed tunnel running at lightning frame rates. We packed up the dev machine in a flight case and off he went - by all accounts everyone was awed by this cheap piece of consumer 3D hardware.
I still have that board.
But that's not the point of this narrative.
The point is that the V1000 was a DSP-based part that stored its microcode in its onboard memory, in an address space just below VGA memory.
Which meant that any bug in the driver, especially the clipping code, caused Direct3D to render triangle fragments all over the microcode. It didn't just blue screen. Can you say hard lockup, push the big red reset button?
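To make that failure mode concrete, here's a toy sketch in Go. The memory map, sizes and function names are all my invention for illustration (not Rendition's actual layout or driver code): model the card's memory as one flat region with the microcode sitting just past the framebuffer, and watch an unclipped span fill walk straight into it.

```go
package main

import "fmt"

// Toy model of a card's onboard memory map (addresses invented for
// illustration): the framebuffer first, then the DSP microcode region
// sitting immediately after it, as the post describes.
const (
	fbStart    = 0x0000
	fbSize     = 0x1000 // 4KB "framebuffer"
	ucodeStart = fbStart + fbSize
	ucodeSize  = 0x0400
)

var cardMem [fbSize + ucodeSize]byte

// fillSpanUnclipped writes a horizontal span of pixels with NO bounds
// check against the framebuffer - the kind of clipping bug described.
func fillSpanUnclipped(offset, length int, color byte) {
	for i := 0; i < length; i++ {
		cardMem[offset+i] = color
	}
}

// corruptedUcodeBytes reports how many microcode bytes got overwritten.
func corruptedUcodeBytes() int {
	n := 0
	for i := ucodeStart; i < ucodeStart+ucodeSize; i++ {
		if cardMem[i] != 0 {
			n++
		}
	}
	return n
}

func main() {
	// A triangle fragment that should have been clipped: it starts near
	// the end of the framebuffer and runs 256 bytes past it, straight
	// into the microcode. On the real part: hard lockup.
	fillSpanUnclipped(fbSize-128, 384, 0xFF)
	fmt.Printf("microcode bytes clobbered: %d\n", corruptedUcodeBytes())
}
```

On the real hardware there was of course no bounds-checked array to save you; the write just happened, and the DSP executed whatever pixels landed in its code.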
Hugh is a brilliant cartoonist and marketing guru and was asking if anyone wanted to build a widget for blogs that could display the latest cartoon whenever someone visited.
Well, I'd recently been feeling the urge to try my hand at building a dashboard widget for OSX, so I thought I'd ignore the requested target platform (others have been solving that problem) and try something a little different by building a dashboard widget for gapingvoid.
The widget pulls the rss feed from gapingvoid whenever it is shown at a maximum update rate of once per hour - there's a button to force a refresh.
Let me know what you think.
Update: For those of you that haven't installed a dashboard widget before, here are Apple's standard instructions:
Mac OS X 10.4 Tiger is required. If you're using Safari, click the download link. When the widget download is complete, show Dashboard, click the Plus sign to display the Widget Bar and click the widget's icon in the Widget Bar to open it.
If you're using a browser other than Safari, click the download link. When the widget download is complete, unarchive it and place it in /Library/Widgets/ in your home folder. Show Dashboard, click the Plus sign to display the Widget Bar and click the widget's icon in the Widget Bar to open it.
Basically, if you're using Safari, you're set. Otherwise just follow Apple's instructions, or you can just unzip the archive and double-click the resulting widget in a Finder window.
Update 5/10/06 Version 1.1 - Minor bug fix to the widget that addresses a couple of bugs regarding how it figures out what is the latest cartoon.
I like them, but they're playing a little loosely with the truth.
For example, in Viruses they imply that Macs don't get viruses, which is obviously not true. Right now they just get a lot fewer. I wonder how many people are going to start complaining when the rate of infection increases.
And the ad about viruses is just plain STUPID. Man, are they asking for it. What happens when users who bought Macs thinking they couldn't get viruses suddenly start getting them? The Federal Trade Commission is going to love that. Can you spell Class Action Lawsuit?
Yes, when you buy a Mac, you get iLife. But when iLife gets upgraded, you've got to buy the new version if you want it. Even if you buy the new version of the OS (i.e. when upgrading to Tiger from Panther), you don't get the latest version of iLife with it.
I.e. one hardware purchase == one version of OSX == one version of iLife.
Up until now, I've been editing my posts online using the MovableType interface.
This generally sucks, due to the lack of spell checking and everything that an offline editor can give you.
On the other hand, I'm not much of a wysiwyg guy as far as editing text goes. Give me troff any day. So I use the wonderful Textile 2 plugin from Brad Choate - a great and simple markup language.
Every now and then though, I try out the 'latest and greatest' offline editor, all of which generally suck either by forcing the source of the post to be HTML (and doing a very bad job of it), or having very little functionality.
Now though, I have a MacBook sitting in front of me, so I thought I'd give MarsEdit a go.
Am I ever glad I did!
Other than the great UI, I clicked on the preview button and saw that it offered markup using Textile. Hmmm, I wonder if it can do Textile 2?
A quick web search later yielded this post by Jon Hicks, and now I am able to create posts and preview them offline.
I can even have the live preview running as I type.
Nice job guys, and great job to NewsGator for building a stable of great blogging and blog reading tools.
I just posted a comment over on the G'Day World Blog, and thought you guys might find it interesting…
I picked up a MacBook Pro because of the bootcamp announcement. I was debating what laptop to get for the new business, and wanted a Mac, but, like you, was thinking that it might be a waste if OSX just didn't do the job for me - I'm still a Windows nut at heart.
Anyhow, bootcamp works great - am doing dev work, running the usual apps and GuildWars runs perfectly :-)
I'm also using the Parallels workstation beta when booted into OSX to run Outlook and MindManager.
Anyhow - great laptop - with two caveats.
No second mouse button - you gotta ctrl-click in Parallels and install a hack for bootcamp to get one.
No delete key. Yes it says “delete” on the keyboard, but Windows sees it as backspace. That makes doing ctrl-alt-del a problem when logging in! Parallels has a menu option to send a ctrl-alt-del, but bootcamp has no option. You need to plug in a USB keyboard to get logged in, then use a key remapper like the one in the Windows resource kit to map another key to delete. I just remapped F12.
Anyhow, no regrets. The machine is ultra-sexy. It's all worth it for the backlit keyboard :-)
I've been having an internal argument with myself over what laptop to get, and after using WinTel based laptops since my first one in 1993, I decided that it was time to try the “other” operating system on my laptop for a while.
You may remember that I attempted this last year, but I've got over my previous objections to the platform after switching backwards and forwards between a PowerMac G5 and a Windows PC at home.
For Outlook I can use Entourage, for MindManager, well, I don't know. Virtual PC might have to suffice (but an Intel OSX, running a PowerPC VirtualPC, running an Intel Windows XP might be interesting…). For FeedDemon, I can use NetNewsWire.
My problem has been: what if it just doesn't work for me? Have I just wasted some cash on some really nice hardware?
Well, now I can just boot Windows XP on the MacBook. I wonder if anyone has tried a Vista beta on it?
Thinking about it, you can call an Intel based Mac running Windows a “WinTel” box too!
As I seem to be permanently on a quest to find the perfect RSS reader (well, for me anyway), I thought I'd give it a try. I wasn't holding out too much hope as the 1.x series of the product didn't really agree with me.
I am very happy to report that FeedDemon 2.0 completely blew me away.
The UI is nice and responsive, the tree view is nice (though I really wish it would support sub-folders).
I especially like the “River of News”/Newspaper implementation, and all in all it just looks and feels nice and polished.
But for me, the real wow moment happened when I selected a feed from Flickr. Above the content appeared a thumbnail pane that contained all the pictures. Clicking on them takes you to the specific Flickr photo page.
Oh, and did I mention the feature that keeps track of (and presents to you) information about which feeds you're paying the most (and least) attention to? Mind you, I'm not sure if this is a real Attention implementation. I can't seem to find an attention.xml file anywhere…
Very, very cool. Newsgator has a winner on their hands.
The first hall I wandered into was the automotive hall. Tons of bling for your vehicle. The cool thing was Kevin Bacon and his band 'The Bacon Brothers' playing at the XM Radio booth.
They completely rocked! I think I'll be picking up a CD.
The rest of the hall was devoted to in-car entertainment gear. Check out this car tricked out with a waterfall!
Then it was over to a different hall that held all the big players, including Microsoft, Intel, Samsung, etc…
Over at the Microsoft booth I bumped into a friend of mine, and fellow Microsoft blogger, Kim Pallister.
Gibson Guitars had a big tent at the show. This was pretty cool - they were showing off the new wares which included this DDR style game that was played using a guitar-like controller. I particularly liked this screenshot:
It reads “They don't really want you to play 'Freebird.' They're just heckling you.” Hehe.
Anyhow, that's it for now. Check out the Flickr group for more.
All the kids who did great in high school writing pong games in BASIC for their Apple II would get to college, take CompSci 101, a data structures course, and when they hit the pointers business their brains would just totally explode, and the next thing you knew, they were majoring in Political Science because law school seemed like a better idea.
…I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days, and create better Java code than people with five years of experience in Java, but try explaining that to the average HR drone.
The point that Joel is making is key: that CompSci degrees are supposed to be about teaching how to program, not how to program in one particular language.
Software development is an art form, a skill. It is not a rote-taught methodology.
On Monday, the HDMI output from my HD DirecTivo unit failed in an unusual manner. Basically the picture turned completely oversaturated and “cartoony” in appearance.
Initially I thought that Julian had got ahold of the TV remote control and messed with the settings, but after searching the house and finding the remote (on my TV, there is no way of messing with the settings other than with the actual remote), I figured out that the output had actually failed.
When I first acquired the unit I was using the component outputs as I had no display with an HDMI input. I was a tad concerned as I had read numerous reports of the HDMI port being DOA. Fortunately, when I hooked it up to the plasma back in March, it worked flawlessly. This week it appears that I hit the second most common problem out there.
So I called DirecTV with the expectation of getting the complete run-around. I fully expected this to be the forcing function that would move me back to cable and onto Windows Media Center.
Flawless customer support and “We'll have a new unit sent out to you tomorrow”.
So I thought for a change that I'd post about good customer support.
For the past eight months, I've been using my own home grown aggregator Katana. I wrote Katana as an experiment in C#/.Net programming and because nothing else out there did everything I wanted an aggregator to do.
Now, RSS Bandit has come on in leaps and bounds. My favourite features?
The “River of News”/Newspaper view when clicking on a feed rather than an article.
Great feed discovery.
Performance. This release is way faster than previous releases.
Support for folders. This is a key one that many other aggregators miss.
Style. Somewhere along the line, RSS Bandit got sexy…
If there's one thing it's missing, it's the ability for comments to appear in the article window, inline with the article itself. I really like this feature in Katana. I'd also like some form of visual notification (article/feed in italics for example), that there are new comments available.
Other than that, I think I'm hooked. The “River Of News” view has accelerated my blog consumption.
I don't have enough time to keep working on Katana (though I'll still be using it for Podcast downloads), so, regretfully, I'm freezing it.
What with the 1000 DVD changer (mind you, I'd rather rip them all to a terabyte server...) and the fact that DirecTV seems like it's going to be dumping TiVo like a lead balloon, I really want to switch to Media Center. Not least because I want all the TVs in my house to have access to the same media repository - the same recorded TV shows and the same audio library, plus all the home videos.
My only blocking point is HDTV. I need HBO, Showtime and Discovery in a recordable HD fashion. DirecTV gives that to me now via my HDTiVo.
Lego's customization application "Lego Digital Designer" (interestingly built by some ex-Direct3D/RenderMorphics friends of mine) has been engineered (not sure if I'd call it hacked) for some unintended purposes.
Following on from the release of GoTo, my ajax powered home page, I've just released a bunch of updates, including the most requested feature - support for adding a page to GoTo via a button link when at that page. You can now also move links around between blocks, just like you could move blocks around.
By dragging the link "Add to GoTo" to your browser links area in Firefox, or right-clicking the link in Internet Explorer and adding it to your favourites, you can automatically add the current web site to your GoTo subscriptions. They will be added to a new block under the "Top Links" section, and you can drag the items to your normal blocks from there.
You can now delete a block by clicking on the "Delete block" link.
If using Internet Explorer, you can now hit "Enter" in edit boxes to complete the operation.
Blocks and items are now sorted by creation time.
Fixed issues regarding slashes being displayed after text has been escaped for storage.
Editing an item now selects the contents rather than just giving focus.
Fixed bug in ranking (links linked to by the same user multiple times now only counted once).
Now supports all URI schemes, not just http.
Added the ability to reset your password.
Deletion of blocks now requires confirmation.
You can move items around between blocks by clicking on the "#" icon next to the item.
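As a sketch of the ranking fix in that list - counting a link once per user, however many times that user links to it - here's the idea in Go (invented names and data, not GoTo's actual code):

```go
package main

import "fmt"

// rank counts, for each URL, how many distinct users link to it. A
// user linking to the same URL several times only counts once - the
// bug the changelog entry describes fixing.
func rank(links map[string][]string) map[string]int {
	seen := map[string]map[string]bool{} // url -> set of users
	for user, urls := range links {
		for _, u := range urls {
			if seen[u] == nil {
				seen[u] = map[string]bool{}
			}
			seen[u][user] = true
		}
	}
	counts := map[string]int{}
	for u, users := range seen {
		counts[u] = len(users)
	}
	return counts
}

func main() {
	links := map[string][]string{
		"alice": {"http://example.com", "http://example.com"}, // duplicate
		"bob":   {"http://example.com"},
	}
	fmt.Println(rank(links)["http://example.com"]) // 2, not 3
}
```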
Over the past couple of weeks I've been mucking around on and off with a website idea. You can try it out on my "other" site over here.
The basic idea is that the site acts as a very fast homepage. You create blocks that you can move around, and in them you place links. It's all in an AJAX style using XMLHttpRequest stuff, so there's no page reload - everything is edited in-page and hopefully everything happens very quickly.
So there you go. If you give it a try, please drop me a line with your experiences.
I've got a bunch of other features in the pipeline for it, and I hope to keep everything stable.
The remote is a universal remote with a great feel. It's a peanut-style remote, similar in form to the TiVo remote, with a very vibrant color LCD screen. Setup is very easy - after charging, you just plug it into your computer via USB and then run the provided software, which takes you off to the Logitech website. Here you enter model numbers for all your components and set up "activities" - watching TV, playing a DVD, etc... The defaults all worked for me. Data then gets downloaded to the remote and off you go.
Setup went pretty smoothly for me, with the remote controlling every component almost flawlessly. The only problem I had was that it put my Yamaha receiver into "6CH Input" mode. So I clicked "help" on the remote and it said "Is the TV on? (yes/no) Is the TiVo on? (yes/no)". Eventually it got to asking about the receiver and finally said "Make sure 6CH is unlocked". I went over to the receiver, pressed the appropriate button, and everything was fine.
This kind of help system is incredible, especially for the times when you pointed the remote away while it was sending commands to multiple components and one is now out of sync. It'll go through a few questions and take the appropriate actions.
You can also upload bitmaps to it for favourite channels. Some good ones are over here.
For other reviews, there's another good one here on Home Theater Forum.
Overall, highly recommended - even with the hefty $250 price tag.
I'm sat in Starbucks (of course) inking this entry using my new tablet, the Tecra M4. I have to say, the character recognition is just astounding - it seems about 99% accurate, even with my awful chicken scratchings.
The 1400x1050 resolution is just great, and I'm now on a mission to find all the useful tablet utilities out there. I've been a big user of Password Safe for a while, and I've just discovered that it also makes for a very useful tablet tool. Because it acts as a wallet for passwords, and it never wants to actually display them, you just double-click on the entry and it copies the password to the clipboard. Very useful for the tablet...
Via Scoble, I came across this post by Rick Segal. Yup, things were sure different ten years ago. I was in the DirectX team at that point. How many people remember the Judgment Day event for the release of the Games SDK (DirectX 1), where we had Bill playing the part of the protagonist in Doom, the haunted house ride where various rooms were put together by different partners - including a room designed by Gwar...
What about the Pax Romana event where we had game developers dress up in togas, served them meat on huge platters and had live lions in attendance, and the party hosted by DirectX evangelism in New Orleans for Siggraph '97? Now that was interesting...
And yes, the brainwash still exists. I remember it tasting awful, but I still have a few bottles kicking around. I wonder how ten years of aging have treated it.
Interestingly, ten years later at this year's Meltdown, I'm going to be giving a presentation. I've not given a public presentation since leaving the DirectX team in 1998, so this should be fun.
Mind you, they're a lot more organized these days. They want the presentations three weeks in advance. Back in the day, we would still be writing them up to the point we hit the stage...
Caveat: This is a bit nit-picky, but important nonetheless.
I've had a few people ask me how to listen to my podcasts in iTunes - when subscribed, they get the latest one, but they can't see how to get any more.
Here's the problem: iTunes on Windows looks like a Mac application, and some of the standard Mac widgets aren't recognized by Windows users.
Once you've subscribed to a podcast, you can go to the podcasts section in iTunes and see the ones that you've subscribed to. The problem is that, for a Windows user, it's not obvious that you have to click the grey triangle to expand it, or that it's expandable at all.
Discoverability and following standard concepts for the platform are features of good UI design. Windows users are used to seeing the plus and minus symbols for expandable lists, not triangle thingies that have no obvious meaning.
A good friend of mine and fellow Microsoft guy is leaving the collective to follow his dream of becoming a game developer. The ZMan currently works as an enterprise software developer but has recently been working on increasing his skill set in the games development space - as evidenced by his Managed DirectX website and by writing articles for MSDN.
Now he's leaving the company to take a year off and develop his skills.
That takes some gonads.
My parents tell a tale of the child who said he would be a millionaire by the time he was 18 if they bought him a computer so he could write video games. Well I got the computer but the million never happened.
Now 20 years later its time to try again. My last day of full time employment as an enterprise application developer with a salary, health and retirement benefits will be July 8th. Shortly afterwards my new job as self employed game developer, living off the savings and paying for everything myself will begin and you can read all about it here.
Man has the blogosphere gone quiet. I think everyone I'm subscribed to was at Gnomedex and now we're all knackered. Considering the multiple T1s needed to keep the wifi network at the conference fed, I think the entire internet infrastructure in the Pacific Northwest is also breathing a sigh of relief. I'd love to see a map of the traffic usage over the net for the past few days...
Synchronization of subscriptions became a hot topic during Steve Gillmor, Dave Sifry and Scott Gatz's panel. The feeling was that someone needs to solve this. A get/put for OPML would be great, but the rest of the metadata is needed too, i.e. read/unread item information. Much resistance to any form of roach motel effect.
These are just my notes. Expect bad grammar and unfinished sentences. I'm sure the Channel 9 video explains this better... I haven't seen it yet.
Subscribe is big. We went browse -> search -> subscribe! There's a balance.
Longhorn will enable RSS for developers and users at a deep level.
First public showing of IE7. Every time the browser encounters a feed, a button appears - it renders RSS and provides a subscription model.
Subscriptions placed in a common feed list for the user. An application can use this. It was alluded that there'll be a common API to access feed content, be it RSS, Atom, etc...
An enclosure is an item in a feed of content.
Demo of subscribing to a feed of iCal items. An application used the platform to get that data and populate Outlook. There was a bunch of people getting hot under the collar about "why would you do that with calendar items, this way is better, etc..." I think they missed the point. The point (I think) was that an application was using the platform to ask if there were feeds containing particular item types, and then doing something with the result. The platform is dealing with (and being a central repository for) subscriptions, and performing the download, space management, etc....
Lists are different to feeds. They are extensions to RSS that identify a feed as a list and describe the content so that consumers of the feed can do rich things with it. The specification for the RSS extensions will be available under a Creative Commons license. Details on the IE Blog.
So far I've met a bunch of interesting people including Steve Mays, Leo Notenboom, and others. I also had an interesting conversation with a guy from Nickelodeon of all companies - the media is definitely waking up.
One interesting event was the fire alarm going off and the building being evacuated...
I'll let the masses comment, but an interesting tidbit is the developer platform: a 3.64 GHz P4. This seems to imply that there will be nothing out of the ordinary with the actual Intel-based Apple hardware (i.e. they're not going to be using XScale chips or other non-x86 chips). Jobs talks about moving applications being mostly just a recompile - for CPU-intensive apps I think that's underestimating the endianness issues a bit.
Overall, this looks like a nice move for Apple and Intel.
So I've been trying the Powerbook out on and off for the past week now, and though I absolutely love the hardware, I can't deal with the poor text rendering in OSX.
I bought the machine for personal use to try out the other OS. (Yeah, yeah, there's Linux too, but that runs nicely in Virtual PC...)
But no matter how cool the UI is (and it is cool, not sure how it'll stack up against Longhorn though), I can't get past the text rendering. For me, it's just too blurry. And yes, I've scoured the web for solutions and tried them all. It just doesn't stack up to ClearType, no matter what people say. And I don't buy the "it's the way anti-aliased text looks in print" argument. For a start, it's not print, it's an LCD panel, and second, I don't care. Text in Windows just looks so much better. All of course in my humble opinion...
So it's going back tomorrow.
I'd been thinking all week "why can't PC manufacturers come up with a Windows laptop design as cool as the Powerbook?" The Powerbook design has everything. It's sleek, widescreen, all the I/O ports you could want. Bluetooth, wireless. Cool, geeky lights on the power connector port and latch - not to mention the very cool backlit keyboard.
This weekend has been a busy one. First Nabila disappears with our neighbour Anna and reappears four hours later with an early Father's Day present - a new Weber gas grill. Cool, I've been wanting one for ages, and I've been grilling all weekend. I don't know how we've managed to survive in the States for eight years without one... and then the new patio set arrives.
We are now set on the summer entertainment front thanks to my lovely wife.
The other thing I did this weekend was to pick up a Powerbook. I haven't used a Mac in ages and being an ex-Unix head have wanted to try out OSX.
I also wanted to see what all the fuss was about.
I must say, the hardware is excellent. All the requisite connections are there, the startup welcome thingy is very cool in a spinning cube kind of way. But what the hell happened to the text rendering?
Text on the Powerbook really does look like poo. In an "I might have to return this thing" kind of way. There really is no comparison between the text rendering on my Windows laptop and the text rendering on the Powerbook. How could Apple get that so wrong?
Maybe it's just me. If anyone has any tips let me know quick.
I'm a big fan of the Wiki - we use an ASP.NET implementation internally (FlexWiki) for dev discussions - and have recently been checking out TiddlyWiki. This thing is an astounding feat of engineering. A self-modifiable wiki contained entirely in a single html page, with no server-side requirements. Check it out.
I have an early DirecTivo (Sony SAT-T60) and a new HD TiVo and have always been envious of all the goodness that has been available for the Series 2 standard TiVos (folders, TiVoToGo, etc...), so when I saw this post over on PVRBlog I was fairly excited. Of course, thought my inner pessimist, I'm bound to be excluded somehow.
Ok, so now I'm jonesing for a TabletPC. It's all Scoble's fault. I want a lightweight, high-powered machine. With gobs of disk space, a high-end GPU, a high-resolution display. Oh, and can it be nice and small please?
So what's the best bet? Robert! I need a buyer's guide please!
Last night I was fortunate to be invited to dinner with Mick Stanic (is it just me, or were Mick and Bono separated at birth?) from The Podcast Network and the G'Day World Podcast, along with a bunch of guys from the Media Center and Longhorn teams including Charlie Owen and Sean Alexander. Mick is a very smart fella, and lots of great conversation ensued around podcasting, Windows Media Center, and the state of technology in Australia. Very engaging.
Plus some great food at Desert Fire. I had the Bison. Yum.
So I was in Best Buy today, ostensibly to buy a $45 DirecTV receiver for the bedroom (can you believe these things are that cheap now? With retailer markup and distribution costs, DirecTV must be subsidizing that thing heavily) but mainly to give Nishal and Shyama the consummate US retail experience.
So I was wandering around while they were drooling over the video cameras and went back to the bits'n'pieces section. You know, that part of the store with the weird cables and the odd peripherals, and found the Microsoft Wireless Notebook Optical Mouse that I have been jonesing over for ages. The Microsoft company store never has these in stock. I'm assuming that they sell out as fast as they arrive - a coworker had to order from the website and incur the shipping costs for his, but I had been holding out hope that it would be there time and time again when I visited.
Now I have one.
This thing is great. It has a little USB dongle and the diminutive mouse itself. I'd never actually held the mouse and was pleasantly astounded to find out that it is heavy. I was expecting a lightweight, flimsy thing, but the weight is perfect.
But here's the kicker and the reason I've been after one.
When you're done, you unplug the USB dongle, which clips in to the bottom of the mouse, turning it off.
That simple design feature is killer. How many times have you taken an optical mouse out of a laptop bag and seen the little LED flashing away? How much battery power do you think is wasted while it is tumbling around in said bag during a flight, commute, etc...?
What a cool idea.
And that's why I've wanted one. And now I have one. And you should have one too.
Somehow, my last post about my dead home dev box seems to have disappeared. Odd. Anyhow, the posts have been light of late as I've been pretty busy at work, my sister Sue and friend Wendy are in town from the UK, and I'm getting ready for the trip down to San Francisco for the Game Developer's Conference. It's weird the conference being in SF after so many years in San Jose.
But I digress...
As my sister is here and sleeping in the office, I've been shutting down my machine at night (previously it was just always on) and then on Wednesday we had a power cut, after which it wouldn't start up. Nothing. The LEDs on the motherboard would light up when power was applied, but it wouldn't actually start. Plus there was an unpleasant capacitor-burned-out kinda smell. Time for an upgrade!
So off I go to the local PC Club, and end up picking up a new case, power supply, PCI Express motherboard, a gig of memory and a 3.2 GHz processor. Cool! I love putting machines together, so yesterday afternoon, after a fun trip with sis to the Museum of Flight (got to go inside Concorde and an Air Force One) I don the latex gloves and build the machine. After a quick test with no drives installed, everything worked fine. As an aside, I knew the case was transparent on one side, but I didn't realize that the motherboard has a very bright neon blue LED on it. So now my machine is a lowrider of a machine!
Then it was time for the drives. It was my plan to just use the two ATA-133 drives from the old machine instead of getting new SATA drives, as they're new and work just fine. I thought I'd just pop them in and I'd be done.
First problem. There's just one IDE connector on the mobo and I have four IDE devices (the drives, plus two optical drives). Handily, the mobo came with a connector that converts a SATA port to an IDE one. I thought I'd use that for the DVD burner and just ditch the other one, using the real IDE connector for the two IDE drives.
No joy. It seems that if anything is connected to a SATA port, then it must boot from SATA, so I disconnect that and my old Windows install boots from the primary IDE drive.
Except it doesn't. It seems that if the chipset has changed, Windows needs a reinstall.
OK, so I'll just reinstall Windows, no great shakes. Disconnect secondary IDE drive, use that for the DVD drive, boot Windows from CD and install, boot from IDE drive. Ackk, stupid me. Now, of course, I'm denied access to all my protected content as the machine and users have changed. Fiddle, fiddle, calcs, fiddle...
Long story short, I decided to just get a shiny new 250GB SATA drive to go along with the shiny new machine. I'll use the 130GB ATA-133 drive as a target for backups and ditch the old 80GB one.
I must say, this new machine is damn fast. Those SATA drives rock.
Talk about a confluence of ideas! I'd been thinking about this on and off for a while since Julian was born and up pops a post by Joshua Allen talking about a post by Don Box.
I too had been thinking along the lines of a declarative/functional (e.g. lisp/ml/hope/smalltalk) rather than imperative (e.g. C++, C#, Basic) language, for the very reason that it requires reasoning, planning and design - but most of all thought. (Quick recap: my dissertation was on the unification of a functional, lambda-calculus-style language with a prolog system.)
I'll add another one to the mix: ML. I like ML's mathematical look'n'feel, which comes without the obtuseness of something like lisp. Maybe it'd be Hope, which I did a lot of work with at college.
Anyhow, should be fun. At the moment I can't stop him throwing lego blocks at the dog. Stopping him dereferencing null pointers might be easier.
The ZBuffer is a great website devoted to game development on the Managed DirectX platform. It also happens to be run by a friend of mine who also happens to be another Microsoft Brit. Go check it out.
So the comment spam has pretty much died off - I only get a few every day now (and those are effectively handled by MT-Blacklist) - but I'm getting a huge amount of trackback (or ping) spam. MT-Blacklist does a pretty good job with it, but about twenty or so a day get through the defences.
And what is it with the Texas hold'em/poker spam? About 90% of the spam seems to be for poker sites.
Interesting application. Very nice rendering. It feels like there is more going on there than appears on the surface. I wonder how long it will be before Keyhole is integrated into it somehow?
You may remember that a while ago I got my DSL bandwidth upgraded to 3MBit/768KBit. When testing the line speed through DSL Reports, the upload speed had indeed changed, but the download speed, though improved, was still around 740KBit/s. I put it down to net congestion.
Today, I called Verizon thinking that maybe I'm wrong and my bandwidth is being throttled somewhere.
They put a work order in to check the line and this evening I lost all connectivity. A quick modem reboot later and I'm surfing at lightspeed!
I just downloaded a copy of Perforce for use at home and got around a 350KByte/s download rate. Cool.
Just downloaded the newly released second version of Picasa. I must say, this app is sweet, and they've done a really good job with this version. The UI is incredibly sleek.
One cool new thing is the collage creation. Select some photos, hit Create Collage, and out pops something like this:
Talking about photos, I've been playing some more with Flickr. The tags feature (that everyone seems to be picking up, including Technorati and apparently Google will be announcing something today) is great. You can even subscribe to RSS feeds of pretty much anything (by tag, person, etc...) For example, here is an RSS feed for my cellphone photos.
Around the beginning of 2003, you'll note a disturbing sharp turn in the previous trend toward ever-faster CPU clock speeds...
It has become harder and harder to exploit higher clock speeds due to not just one but several physical issues, notably heat (too much of it and too hard to dissipate), power consumption (too high), and current leakage problems...
Applications will increasingly need to be concurrent if they want to fully exploit CPU throughput gains...
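Sutter's point is easiest to see in code. Here's a minimal sketch in Go (the language this blog now runs on - the function and names are mine, purely illustrative): split a computation across goroutines so that each core can make progress independently.

```go
package main

import (
	"fmt"
	"sync"
)

// sumParallel splits the slice across `workers` goroutines - the kind
// of explicit concurrency Sutter argues applications will need in
// order to exploit multi-core throughput gains.
func sumParallel(nums []int, workers int) int {
	var wg sync.WaitGroup
	partial := make([]int, workers) // one slot per worker: no shared-counter contention
	chunk := (len(nums) + workers - 1) / workers
	for w := 0; w < workers; w++ {
		lo, hi := w*chunk, (w+1)*chunk
		if hi > len(nums) {
			hi = len(nums)
		}
		if lo >= hi {
			continue
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			for _, n := range nums[lo:hi] {
				partial[w] += n
			}
		}(w, lo, hi)
	}
	wg.Wait()
	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	nums := make([]int, 1000)
	for i := range nums {
		nums[i] = i + 1
	}
	fmt.Println(sumParallel(nums, 4)) // prints 500500
}
```

On a single-core machine this runs no faster than a plain loop; the point is that code structured this way can soak up extra cores as they arrive, which is exactly the free lunch Sutter says sequential code no longer gets.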
Personally, when I moved out here to the US from the UK I couldn't believe that you actually had to pay to receive regular calls on your cellphone. I didn't initiate the call, why should I pay? Why is this any different to a regular land line?
It was two years before I caved in and got a cellphone.
Handily, I have a plan that includes unlimited SMS by virtue of the fact that whenever I get an email, I get an SMS message that causes the phone to sync my email. That must happen hundreds of times a day...
Tomorrow I'm off to the Game Tech Conference in San Francisco. Should be fun. This conference used to go by other names including the Hard Core Game Developers Conference. I went to that one in 1999 which had a very high SN ratio, so I'm looking forward to this one.
Hopefully the hotel will have wifi, and I'll blog from there...
Some good people speaking on some interesting topics.
When I picked up my SMT5600 I thought "Yey! We've finally caught up with Europe and Japan!" Then I find myself perusing a former College friend's blog and click on a link for 3 in the UK. (Btw, why are company names always cooler outside the States?). Check out their offerings. Very cool.
Looking at their coverage map app, the middle-of-nowhere village (Hassocks) that I grew up in has full coverage. 3G services including video calls...
Will we ever catch up? Or is the US always destined to be an also-ran in the arena of mobile tech?
Yesterday my car went in for a service, so I took the opportunity to get the iPod interface installed. It works just as advertised, with the 5 playlists named BMW[1-5].* showing up effectively as CDs and pausing when the car is switched off just like a regular CD player.
I had been using an FM transmitter, but there is no single station that I can tune it to that works for my entire journey to work - I'm forever changing the frequency on the transmitter and the car. And that is just plain dangerous. Plus it sounds awful with all the bleedthrough from other stations.
It would have been nice to get one of the cool third party solutions that adds a head unit that displays tag information, but I really didn't want a new head unit that looks out of place. So I just went for the first party solution.
My only niggle now is that my music library is at home, and I sync the iPod there every night over FireWire (with iPodder downloading the new podcasts). At work and on the road I can't change the playlists or download new content, because as far as I can figure out only one instance of iTunes can work with the iPod. This is very annoying, as I'll be at a conference in a couple of weeks and want to download new podcasts to the iPod while I'm there.
So yesterday I tried ephPod with the iPod connected to my laptop via USB. This allowed me to edit playlists and create new ones, but as soon as I docked the iPod when I got back home, iTunes decided to re-download my entire music library because the iPod was "suspect"! That took ages (20 GB of music).
I now provide a mobile friendly(er) version of my blog at /mobile.shtml. That was pretty easy - you gotta love Movable Type. You'll get sent there if you come to my site with the appropriate user agent.
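The user-agent sniffing behind that redirect is only a few lines. A sketch in Go - the substring list and handler are illustrative, not the actual Movable Type configuration I use:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

// isMobileUA is a rough check against a few user-agent substrings.
// The token list here is a guess at era-appropriate mobile browsers,
// not a definitive one.
func isMobileUA(ua string) bool {
	ua = strings.ToLower(ua)
	for _, token := range []string{"smartphone", "windows ce", "symbian", "palm"} {
		if strings.Contains(ua, token) {
			return true
		}
	}
	return false
}

// mobileRedirect sends mobile clients to the mobile-friendly page
// and serves everyone else the full site.
func mobileRedirect(w http.ResponseWriter, r *http.Request) {
	if isMobileUA(r.UserAgent()) {
		http.Redirect(w, r, "/mobile.shtml", http.StatusFound)
		return
	}
	fmt.Fprintln(w, "full site")
}

func main() {
	// In a real server this would be wired up as the root handler:
	// http.HandleFunc("/", mobileRedirect)
	fmt.Println(isMobileUA("Mozilla/4.0 (compatible; MSIE 4.01; Windows CE; Smartphone)")) // prints true
}
```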
I bought the new Audiovox SMT 5600. In fact, I bought two. One for my wife, and one for me.
I've been waiting so long for the right 'phone to come along - a few years ago I got one of the first 'phones with a colour screen, but was disappointed and decided to just settle for it.
The Audiovox changed all that.
It has been incredibly popular internally here at Microsoft, and I'm sure you've seen Scoble rave about it. It just works. It does everything right.
The only thing is that I'd been really happy with Verizon as my carrier - the coverage, as they say, is just amazing. When I was in Montana earlier this year fishing with my buddies, I was the only one that had any coverage. But, unfortunately, they'd always been lagging with the new 'phone tech. So on Friday I switched to AT&T (the day before their accounting switch over to Cingular). The phone number port worked perfectly.
Now I just have to produce a Smartphone friendly front-end to my website...
I can't say enough good things about this 'phone. Go buy one.
Yesterday I got my car back from R&R Automotive in Bellevue, three weeks after the accident. I'll tell you what, they did a fantastic job - the car looks wonderful, and handily all those rock chip marks I picked up a few days after taking delivery of the car are now gone! Cool.
Next step is to get the BMW/iPod interface installed.
Talking of the iPod, the latest version of iPodder (the Podcast downloader) is a vast improvement over the last one. Last night it downloaded the latest "Dawn And Drew" show - including the album art which showed up automatically on the iPod. It's a little thing, but it gave me an "Oh, that's cool" moment.
In a "that's uncool" moment though, how come the iTunes UI becomes unresponsive when it's talking to the iPod? Come on folks, it's not that hard to get right.
So I upgraded. It was a no-brainer really once I went to the Apple store and got my hands on one.
The display is just wonderful, and that was the killer feature for me. It's not just the photo stuff (which is cool and just works), but the display, anti-aliased text and use of colour is beautiful. It's very sharp and vibrant.
iTunes got a minor upgrade to support the photo resizing and upload, but is obviously v1.0. It also uploads album art, which is viewable in the 'now playing' section on the iPod. I'd still much rather use Windows Media Player. Speaking of which, why can't Apple just support WMA on the iPod? They support it (for import) in iTunes.
Lenn Pryor (of Microsoft) obviously likes it and he talks about it in the latest Engadget podcast. There are a bunch of photos up on iPodLounge (note, their server is slow - it's getting hammered).
A lot of people are saying "Why bother? Colour and photos isn't worth it." Just go play with it.
Coming to you live from the Triple J Cafe in Kirkland. Man, are there a lot of live wireless networks around here...
I picked up an HD Tivo at the weekend. I've been wanting to get one for ages and finally got around to it. Setup was a breeze. For a detailed review, check out Andy Pennell's website.
I stopped watching HD about six months ago, as switching backwards and forwards between my DirecTivo and the HD box was just too annoying. Plus the HD box wasn't a Tivo-type unit, so I couldn't record anything - and I basically don't watch anything live anymore.
The content available now is just great - yesterday I watched the baseball in HD on Fox, then American Chopper on Discovery HD Theater. Almost everything worth watching on the major networks is now in HD, plus with Bravo, HBO, Showtime, Discovery, ESPN, etc... now in HD on DirecTV, TV life is great.
So today Chris Pirillo and Adam Curry discuss the issue in their podcasts. Chris is vehemently for a name change, and Adam is "whatever dude" ;-)
Neither comes up with an idea for a name - I still think mobcasting is a great name (found at Russell Beattie's site - note to Russell, give up on the "Oh we thought of this N years ago"). It evokes the idea of a "mob" broadcasting (i.e. the people) and encompasses the mobile idea in the same way (albeit in a platform agnostic way) as podcasting.
Although I still have the nagging thought that it isn't just mobile. I've been listening to podcasts via my Audiotron in the living room...
The name "Podcasting" has always sounded a little trite to me. It obviously gets its root from the iPod, but why imply a limit to a particular platform? There is obviously a bunch of confusion out there right now as many people have been posting "but it's not just for the iPod, it'll work with other players too". So why name the "idea" podcasting, and the apps "iPodder" or the directory "iPodder.org"? It just confuses the issue.
I like the meme that's floating around the 'sphere right now of renaming it Mobcasting. As in mobile. All digital audio players can play the podcasting content - don't artificially limit the promise of an idea with a bad name.
Anyhow, those ubiquitous cell phones are gonna be the platform this tech needs to hit.
PodCasts are generally quite long - longer than your average musical track and definitely longer than my usual drive.
Yesterday I started listening to Adam Curry's latest missive and about 20 minutes into it I arrived home, so I switched off the PocketPC. Upon resuming it though, Media Player had reset and was showing the first track in the track list. Damn. Fast forward through to where I thought I left off; listen whilst driving to Costco; buy stuff; do the fast forward dance again. Damn.
Time to dig out the Windows Mobile SDK. Handily, it should be fairly easy to host WMP in a .Net app and have it do exactly what I want.
Update 10/11/04: After updating the firmware on my PocketPC, the pause issue has disappeared, and I can resume playing from where I left off. It would be nice, though, if WMP remembered where I last stopped listening on a per-track basis, so I could skip around.
So is it PodCasting or Podcasting? The world needs to know...
Anyhow, I digress. So yesterday I did pick up the Monster Cable FM transmitter cable so I can listen to podcasts in my car from my PocketPC. I like this product as the black and silver styling match my car's interior styling exactly ;-)
Seeing as I was listening to podcasts I thought I'd try this skin for Media Player too...
After reading the article Robert Scoble linked to yesterday about getting Media Player to auto-sync to media devices, I thought I'd try it out. It all works except that I can't get it to automatically do the sync, or delete stuff from the folder I point it at on the PocketPC when it's no longer in the auto-playlist. Am I missing something? Or maybe I need to write some code ;-)
The page load speed has been noticeably slow recently, so I've just updated all my referer tracking stuff to cache the generated data, and regenerate the data every 10 minutes. This should hopefully address all the issues. Shortly, I'll be removing the links to external tracking sites. Let me know if you see any issues.
The patch fixes software flaws that could enable an attacker to crash or freeze the Apache 2 Web server, run software by utilizing Apple's Safari Web browser or expose the password store used by the network. Security information provider Secunia ranked the Kerberos threat as "highly critical," its second-highest danger rating.
Typically, Apple has just talked about its fixes to security flaws and patches as minor updates. At least now it is being a bit more open about it. I guess with Linux, you can just download the patches for whatever myriad version you're running from some site and just recompile the kernel. Sounds secure to me.
From: email@example.com (S J Lacey)
Subject: Fast/Chip memory
Date: 10 Nov 89 12:49:02 GMT
My problem basically revolves around asking AllocMem() for fast memory.
What ever I ask for (Fast/Chip/Public), I seem to get Chip memory. I
should point out that this is on an Amiga 1000 with 1Meg of ram. Ta.
Steve J Lacey.
---- Whatever I say is to be believed at your own risk...
firstname.lastname@example.org Department Of Computing
email@example.com Imperial College
180, Queen's Gate
This was written when I was working for Magnetic Scrolls whilst at university. I was porting Magnetic Windows (our runtime for the Magnetic Scrolls games) to the Amiga. The code was written on a DEC MiniVax and cross-compiled to the Amiga. This was fun stuff: it involved getting the cross compiler (based on gcc) running and writing the link loader for the Amiga, when the only debug support I had was an AmigaDos call that caused the current register status to be squirted out the serial port. Also, the Amiga didn't have a hard drive - the assembler, source, etc... were all on floppy. Once the entire development system was running we could actually start on porting the game.
Sometimes I pine for those heady, low-level days...
Found over here is some information on a great looking Smartphone coming to the US. I want one.
But my problem is that I'm a bit of a neophyte when it comes to cellular technology in the US. Can I just take my current SIM card from my Verizon phone and insert it in the new phone? Will Verizon allow it? Do networks allow you to use any old phone?
I could probably just Google for the answer, but last time I tried, the signal to noise ratio was just too low...
As you already know, I like to run as a non-admin. What were id Software thinking? The only reason the game doesn't work as a non-admin is that it tries to write the config file and save games to \Program Files\Doom 3\base\, which you don't have access to as a normal user. Would it really have been that hard to save them to \Documents and Settings\<username>\Application Data\Doom 3\? No, it wouldn't, but due to this little bit of laziness you have to be an admin, and everyone on the machine gets the same settings and save games. Sigh.
The full article requires a subscription, but Salon has this cool feature whereby if you're willing to look at a quick advert, you get full access to the site for free. I really like this feature and wish more subscription based sites would employ it.
So the comments are all working now and the style sheets are cleaned up.
The problem I have now is that the text input boxes look odd in IE. Ho hum, time to dig out some CSS docs.
I also added MTAmazon and MTBookQueue to the site. This is a cool plugin to MovableType that has a little interface where I type in the ISBN of books and it generates the images and links on the fly by talking to Amazon.com via their web api. Pretty cool. You can see my usage of this on the sidebar to the left.
Pretty much all Windows users run with administrator privileges. Stop doing that! This is the vector by which pretty much all viruses infect your machine (via email, web browsing, etc...). Because you're running as admin, when they execute they have the run of the machine.
All you have to do is take yourself out of the Administrators or Power Users group and become a Limited User (you'll also hear this called LUA, or Limited User Account).
As a limited user, you can do everything you can normally do, except stuff that you need to be admin to do (like install programs, etc...).
In the UNIX world, people wouldn't dream of running as root (the UNIX equivalent of Administrator).
If you need to do an admin type thing, log out and log back in as administrator, or temporarily increase your privileges to those of an admin via something like RunAs (right-click).
Aaron Margosis covers a lot of this from a technical standpoint in his blog.
Of course, there are issues that are still being resolved. Windows traditionally hasn't promoted the LUA thing; the soon-to-be-released SP2 solves a bunch of these issues. Also, a lot of software (unfortunately, including most games) requires you to be admin to run (copy protection, dumb programming). These are being addressed via developer education.
So, just do it. You'll thank me. And just think, as well as stopping the virus guys from nuking your machine, you'll stop little Johnny from looking through your personal stuff and finding all the porn (assuming you set the permissions on your stuff correctly ;-)
I haven't quite made up my mind on Digital Rights Management. As a software developer and musician, my gut tells me that DRM is good and piracy is bad. But hey, when I rip music (from my own CDs, bought and paid for), the first thing I do is turn all the DRM stuff off. I don't download music and I don't fileshare it either.
This is an interesting take on the whole issue, based on a talk given to Microsoft Research.