It seems NRK has started releasing their own productions via BitTorrent, and at fairly high quality (1024×576, 24 fps, H.264 @ 3 Mbit/s). Awesome. It isn’t HD, but it’s good enough for this kind of use. Lars Monsen’s Nordkalotten 365 is the first show to be released; at the moment only one episode is out, but they will release the entire series over the next couple of days.
This is really a step in the right direction: distributed downloads and high-quality video in a good format, much better than their net-TV. This first release is just to gauge the interest in this kind of content; I hope it will be successful and that we eventually get NRK’s entire media library online the same way. I recommend that everyone download this, even if you don’t like the show, just to let NRK know that we are interested. And who knows, you might like it; I bought the book some months ago.
Torrent of first episode: Nordkalotten_365_ep_1.mp4.torrent
RSS-Feed for the series: Nordkalotten365.rss
In order to sleep better at night, I’ve decided to get better at taking backups of important data. Knowing myself, it would need to be automated, and preferably offsite. I also didn’t want to depend on my own servers, since they are down from time to time; lately I’ve been fighting an ethernet switch that stops working until someone cycles the power.. very annoying, it’s getting replaced now. I had a look at Amazon S3, which seems to be fairly popular these days: $0.15 per GB per month of storage, plus a little bit for bandwidth. That’s basically free for my storage needs, especially considering the current dollar value (when are they going to rename it American pesos?). Sounds great: cheap offsite backup. However, I want to encrypt my data, since I don’t trust anyone further than I can throw them in matters like this, and it’s hard to throw someone the size of Amazon.. I had a look around and found duplicity, which does encrypted incremental backups. It’s written in Python, uses librsync, and supports loads of different destinations, for example scp, rsync, ftp and even S3! So I installed the newest version of duplicity and its dependencies (boto for S3 support; the rest can be found in most distributions), made a GPG key specifically for backup use (gpg --gen-key), and wrote a little script:
#!/bin/bash
# Amazon S3 keys (values redacted):
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
# GPG passphrase and key (values redacted):
export PASSPHRASE="..."
GPGKEY="..."
# MYSQL password (value redacted):
MYSQLPW="..."
DEST="s3+http://bucket-name"
mysqldump --all-databases --password=$MYSQLPW > /root/mysql/mysql-backup.sql
# Force a full backup twice a month.. (%-d: day of month without leading zero)
if (( $(date +%-d) % 15 == 0 )) ; then FULL=full ; else FULL= ; fi
duplicity $FULL --encrypt-key $GPGKEY --include-filelist /root/backuplist.txt --exclude '**' / $DEST
# Don't really need more than 2 months of backup.. (--force actually deletes)
duplicity remove-older-than 2M --force $DEST
Everything I want to backup is listed in the file backuplist.txt, and the script runs once every night using cron. This should cover my backup needs, if I only remember to back up my GPG key :p
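For completeness, here is roughly what the supporting pieces could look like. The paths, bucket name and schedule below are hypothetical examples, not taken from my actual setup:

```shell
# /root/backuplist.txt -- one path per line (hypothetical examples):
#   /etc
#   /home
#   /root/mysql

# crontab entry to run the script nightly at 03:00 (hypothetical path):
#   0 3 * * * /root/backup.sh

# Restoring a single file from the newest backup looks something like this
# (bucket name and paths are placeholders):
duplicity restore --file-to-restore etc/fstab s3+http://bucket-name /tmp/fstab
```

Doing a test restore like that once in a while is also a good way to verify that the backups actually work.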