Steve's blog

A blog about what's going on in Analysis UK...

The Ghost of Arduino

One advantage the Arduino has over the Netduino is plenty of high-power GPIO lines: 13 of them, each capable of 40mA, which powers 2 LEDs per IO line fairly brightly, and 6 of these can be run as PWM outputs, which makes LEDs appear to dim and brighten.

Armed with a spare Arduino, a little too much time on my hands, a ghost that needed adapting from 110V to work in the UK and a bunch of white LEDs I set about making an Arduino powered Ghost ready for Halloween.

Here's the result (first 12 seconds are dull, hold on in there):

Warning: This video contains some images of bright flashing LEDs, don't watch if you are affected by that!

 

 

.Net Micro Framework and Netduino Fun

The .Net Micro Framework has been around for some time now (it's at version 4.1), and as with all things Microsoft it's really taking off post the V3 release.

Sadly the development boards for the Micro Framework have been fairly limited in range and high in price (read: expensive!), until recently with the release of the Netduino, a .Net version of the popular Arduino platform.

This fantastic board is ideal for hobbyist electronics hackers like myself: it's cheap, easy to connect to and fun.

I've got a few projects I'm tinkering with for the Netduino and Meridian/P using the .Net Micro Framework, and I'm writing up my experiences over on my IImplement.Net blog, so if you're interested in the Micro Framework go have a look.

MSBuild AWS Task Library

Previously Alan, Alastair and I had pitched in down the pub to create a simple MSBuild-to-S3 publisher task, mainly as a way for us to learn the Amazon Web Services (AWS) API. It turns out I wasn't the only one wanting an MSBuild task for S3/AWS: the week that followed saw Roy Osherove tweet that he was looking for something similar, and Neil Robbins was looking for EC2 automation.

So with that, and a bank holiday weekend at hand, the S3 publisher task got extended. It now supports:

  • More S3 commands

    • CreateS3BucketTask

    • DeleteS3BucketTask

    • PutS3FileObjectTask

    • PutS3TextObjectTask

    • DeleteS3ObjectTask

    • SetS3ObjectAclTask

    • S3BuildPublisher

  • EC2 Automation

    • AssociateIpAddressTask

    • AttachVolumeTask

    • CreateSnapShotTask

    • CreateVolumeFromSnapshotTask

    • CreateVolumeTask

    • DeleteSnapShotTask

    • DeleteVolumeTask

    • DetachVolumeTask

    • DisassociateIpAddressTask

    • RebootEC2InstancesTask

    • RunEC2InstancesTask

    • StartEC2InstancesTask

    • StopEC2InstancesTask

    • TerminateEC2InstancesTask

  • SimpleDB

    • CreateSimpleDBDomainTask

    • DeleteSimpleDBDomainTask

    • PutSimpleDBAttributeTask

    • GetSimpleDBAttributeTask

    • DeleteSimpleDBAttributesTask

  • Simple Notification Service

    • AddSNSPermissionsTask

    • CreateSNSTopicTask

    • DeleteSNSTopicTask

    • PublishSNSNotificationTask

    • SubscribeToSNSTopicTask

    • UnsubscribeFromSNSTopicTask

  • Simple Queue Service.

    • CreateSQSQueueTask

    • DeleteSQSQueueTask

    • SendSQSMessageTask

    • DeleteSQSMessageTask

    • ReceiveSQSMessageTask

    • WaitForSQSMessageTask

If you want to see how to use the .NET SDK for AWS, have a look at the code; it gives a really simple insight, and the code for the MSBuild task library is licensed under MS-PL so you can take what you want.

Since the last blog post I've changed the build script so the files are now zipped and distributed to S3, so you can download the latest MSBuild AWS Task Library build. I am also uploading builds (not automated yet) to the download section of the SnowCode Google Code repository.

Alan pointed out there's a much better URL to svn the code from, without the rest of the SnowCode: https://snowcode.googlecode.com/svn/trunk/2010-04-TramDepot/Snowcode.S3BuildPublisher

Latest builds here: http://s3.amazonaws.com/MSBuild-S3-Publisher/MSBuild.AWS.Tasks.Release.zip

Official Downloads here: http://code.google.com/p/snowcode/downloads/list

 

 

MSBuild S3 Publisher

Recently a few of us got together in a local pub to enjoy a brief glimpse of the British sunshine, some code and the nice beer served at the Tram Depot.

The Goal: (Other than to drink beer)

We wanted to learn about the AWS API and create an MSBuild target that could copy files to S3 as part of a build. I'm currently using S3, CloudFront and EC2 for hosting and storage to support Dollars2Pounds and my various other websites, but much of the deployment still requires a variety of tools and manual steps, so an automated build for this would make life simpler.

After a lot of flapping around trying to get the Three portable WiFi hotspot to work, some food and a drop of beer, the coding began.

Creating an MSBuild task is really simple: Bart De Smet has a great custom MSBuild task cookbook blog entry which helped us get the basics in. Next we fired up the AWS SDK for .Net, which again made things very simple.
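
For reference, a custom task is little more than a class deriving from Task with an overridden Execute method. The sketch below is just the bare pattern from the cookbook (the task and property names are made up for illustration), not the actual publisher code:

using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

public class HelloTask : Task
{
    // Input property set from the MSBuild script.
    [Required]
    public string Name { get; set; }

    public override bool Execute()
    {
        // Log to the MSBuild output; return false to fail the build.
        Log.LogMessage(MessageImportance.High, "Hello {0}", Name);
        return true;
    }
}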

A few lines of code (and another beer!) later we had the basics in place: an MSBuild task that would create an S3 bucket and add an object to it. Sadly we had to stop coding at that point so we could catch the first part of Doctor Who's Weeping Angels (did I mention we are all geeks?).

I finished the S3 publisher off later that evening, so here's how to use it.

Download the code from the snowcode Google Code repository:

svn checkout http://snowcode.googlecode.com/svn/trunk/ snowcode-read-only

Note that this also contains a stack of code from the recent SnowCode event.

Navigate to the 2010-04-TramDepot/Snowcode.S3BuildPublisher folder and either open and build the Snowcode.S3BuildPublisher.sln solution or use MSBuild:

MSBuild Build.proj /p:Configuration=Release

You can then use the binary file in your own MSBuild scripts with:

<UsingTask TaskName="S3BuildPublisher" AssemblyFile="Snowcode.S3BuildPublisher.dll"/>

<UsingTask TaskName="StoreClientDetailsTask" AssemblyFile="Snowcode.S3BuildPublisher.dll"/>

 

The task library currently consists of two tasks:

StoreClientDetailsTask

Use this task to store your AWS credentials on the machine. They get stored in the registry and the secret access key gets encrypted on the way using the EncryptionContainerName. This saves you having to embed your AWS credentials in build scripts.

S3BuildPublisher

This is the main task which will copy the files to the appropriate S3 bucket, and takes the following parameters:

EncryptionContainerName – this is the container name you used when storing the client details, without this the secret key cannot be decrypted.

SourceFiles – a single filename or an MSBuild array (e.g. @(files) ) of source files from an ItemGroup.

DestinationBucket – the bucket to copy the files to, will be created if it does not exist.

PublicRead – if set to “true” the files will be marked as public readable, otherwise they will be private.
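
Putting those parameters together, a publish target ends up looking roughly like this (the container name, bucket name and file paths below are just placeholders for illustration):

<ItemGroup>
  <WebFiles Include="WebSite\**\*.*" />
</ItemGroup>

<Target Name="PublishToS3">
  <S3BuildPublisher
      EncryptionContainerName="MyContainer"
      SourceFiles="@(WebFiles)"
      DestinationBucket="my-example-bucket"
      PublicRead="true" />
</Target>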

The source contains two examples of using these tasks: the PublishData.proj file is used for debugging the tasks, and the Publish.proj file is used during a CCNET build to publish the binaries to S3 using the task itself.

It currently has some limitations: sub-folders are not supported, error handling is very lightweight and there is no choice of AWS region.

We have released this under the MS-PL license so do as you please with it. If you like it, why not join us for some more CAMDUG pub coding sessions, perhaps next time in a pub with WiFi.

Thanks to Alan for his help and driving the keyboard and Alastair for the photos and input.

 

Why you should do other stuff...

As a software developer it's all too easy to turn up to work, do what's required, go home in the evening and once a month pick up your pay. Just where does that get you?

Well, maybe you're happy doing that, but 5-10 years down the line you find you've been doing the same thing, the same way, never really pushing the boundaries or questioning whether what you are doing is the best way to do it. Net result - you've failed yourself and your employer, and if you end up needing a new job, well, we all know how picky the tech industry is about having a perfect match of acronyms on your CV.

 

 

Ever since I had my first computer I've always had a personal project, be it software or hardware. These projects are for me to solve problems I have, every now and then one of them makes it out the door and I share it with the rest of the internet.

Ten years ago I was a Lab Rat, working as a scientist testing blood gas analyzers and pH meters. In my spare time I started buying bits from the US (I'm in the UK) and trying to figure out the cost in pounds wasn't that easy (open calculator, figure out if it's divide or multiply, press buttons, hit equals). So I started Dollars2Pounds.com, mainly for myself, and the 10 other people who initially visited the site every month. That's now grown a lot, but it is still true to its roots: an easy way to calculate the cost of something in your local currency.

If I hadn't started Dollars2Pounds.com I wouldn't have touched any web programming. In 2002 I went a step further and built a beast of a site in php. Now that taught me one thing: I'm not a fan of lots of php. It's great to get started with, but once things get busy, oh boy, it's not so great.

The thing is, during the time I was building those sites my day job involved connecting up scientific instruments to computers, one way or another. I'd never have touched web based programming if I only did my day job. And you know what, that would have been a huge mistake. I've learnt so much from escaping the binds of the day to day work and trying new things.

I can't emphasise enough as a developer how eye opening it is to release a product or service yourself. It's all too easy to sit in our cubicle wearing our Dilbert curled-up ties and laugh about marketing and sales, but until you've actually tried it and seen the results you will never understand how frustrating it can be, or how amazing the feeling is to see people using YOUR product, how strange the things that your users do truly are, and the actual challenges your product faces on the internet, not just the perceived problems where you think you need to shave a microsecond off of some routine that it turns out people don't use anyway.

Fast forward 8 years, I've recently transitioned Dollars2Pounds.com and the network of other exchange rate sites from a dedicated server host onto Amazon's AWS/EC2(1).

Over the years my websites have grown: the original Dollars2Pounds.com was shortly followed by Pounds2Euro.com, Dollars2Euro.com and Dollars2Yen.com, as well as Yuan, Won and Rupee sites; combined, these result in a network of around 23 websites.

When these sites were in php I was maintaining a core set of functionality and then some individual domain-specific pages. Over the years I've put a bit of work in here and there (for the most part the sites just run themselves) and moved over to ASP.NET (thanks to listening to DotNetRocks), consolidating everything into one single codebase with just a separate configuration file, database and installer for each site.

That's been the story for about a year now. My builds were taking over 25 minutes just to build the installers (3 minutes for the main app), and the web MSI installer that comes as part of VS2008 has problems: I've found installing upgrades on a site would often not update files, so upgrading the websites involved running uninstall on 23 MSIs, installing 23 new MSIs and then saving them away to be able to automate the uninstall next time around.

I also have a CCNET project that goes and runs some basic tests on each of the 23 websites to ensure that the correct configuration got installed, the site is performing ok and is not down.

 

So here's the problem:

I was kind of happy with that. It wasn't ideal: I didn't like the 30-ish minutes between checking in a change and being able to start a roll-out, not to mention the trouble of updating 23 different websites, but it worked.

More and more I see company-specific sub-domains (e.g. analysisuk.fogbugz.com for my bug tracking) on websites and I'm a big fan of that style. It makes me feel special and unique, and I know from Spolsky's ramblings that I have my own little database for my bugs, so a bug in the FogBugz code base isn't likely to show another company or competitor my bugs.

I had always wondered how sub-domaining a website was done. Was a new IIS/Apache website created, the appropriate files copied over and a new database created in the background (which by the sounds of it is how StackExchange currently works)? How about DNS updates?

 

The Solution:

I was investigating another project that's been on my mind; it would greatly benefit from sub-domains like the analysisuk.fogbugz.com one, so I did a bit of research. Ideally it would be one website that connected to a different database, or just had an additional identifier in the database based on the subdomain (the analysisuk bit).

Well it turns out that's all fairly easy.

  • On IIS you can add bindings for each domain you want to resolve to the website (e.g. bindings of CompanyA.AnalysisUK.com and CompanyB.AnalysisUK.com for a single website will send both CompanyA and CompanyB to the one site, or you can have a single IP-based site and have the DNS records point to that IP address, so there's no need to set up bindings).
  • In ASP.NET you can get the Host server variable to tell you the full host name (possibly including the port). So you might get something like “CompanyA.AnalysisUK.com:80” (HttpContext.Current.Request.Headers["HOST"])
  • It's important to include VaryByHeader=”host” in any page caching you do, otherwise CompanyA might see the page for CompanyB and you really don't want that happening!

Bingo, multiple sub domains on one IIS website with a master database giving a host lookup to get some kind of unique database or identifier.
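
As a rough sketch, the lookup boils down to reading the Host header, trimming any port and using the result as a key into the master database. The SiteConfiguration helper below is hypothetical, just to show the shape of it:

using System.Web;

public static class SiteResolver
{
    // Resolve the current request's host name to a site identifier.
    public static string GetSiteKey(HttpContext context)
    {
        // The Host header may arrive as "CompanyA.AnalysisUK.com:80".
        string host = context.Request.Headers["HOST"] ?? string.Empty;

        int portIndex = host.IndexOf(':');
        if (portIndex >= 0)
        {
            host = host.Substring(0, portIndex);
        }

        // Hypothetical lookup against the master domain table.
        // Remember VaryByHeader="host" on any output caching too.
        return SiteConfiguration.LookupByHostName(host.ToLowerInvariant());
    }
}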

Shortly after finding this out it occurred to me that this solution would work well for Dollars2Pounds.com and its network of sites. The host variable gives the full domain name, not just the subdomain; I can't believe I didn't think of that before. This was the missing link between one way of working and another.

And so the result of a Saturday's worth of coding is a new table in the master database to give domain name resolution (e.g. Dollars2Pounds.com and Pounds2Dollars.com both resolve to the Dollars2Pounds website). Configuration was shifted from web.config files to a simple database table keyed by the website, and strings for each website are now keyed by the website and a string key.

Twenty-three separate databases merged into one database, twenty-three IIS websites merged into one website, a 30 minute build became 3 minutes, a 30 minute deploy became a 5 minute deploy, and the web server CPU baseline load went from 20%+ to about 10% just by removing those 22 extra websites.

And whilst the benefits of a quicker build and deploy sound nice, what's missing from that picture is that it also means I'm more likely to add a small feature to Dollars2Pounds and roll it out quickly, which is a massive benefit. I'd tinkered with advertising changes some time ago but never released the code.

The downside? Well, there is a small downside. I used to like to roll out changes to one of the quieter websites first (Dollars2Yen.com or something like that) to check that I hadn't missed something in testing that would stuff up the website when deployed on the production server. Now if I stuff it up, I stuff them all up!

 

What's my point?

Do something different, it might just pay off big time for your main work.

If your job is writing embedded code, go build a website in your spare time. It doesn't matter if it's a failure, you will learn and that means it's not a failure - failure is not doing it. If your job is writing websites, go grab an Arduino and play; it's fun and again, you might just learn something.

If you are an employer, take Google's advice. Let your developers have 20% time to try something totally different and radical; you might just find your main product benefits. How many big companies do we see spending a fortune to buy a small company with some technology that's different from their own? Maybe if the devs had 20% time that technology would have grown in-house and saved a fortune.

And if you are a job seeker, avoid like the plague companies that don't like you having projects in your own time, they are welcome to the 9-5 developers who go home and watch TV or who go down the pub in the evening and don't touch a computer until the next morning.

 

(1) AWS in itself backs up what I'm trying to say here. Look at Amazon, look at AWS: the bookseller who's now selling compute and bytes by the hour. Who'd have thought it, but it's massive. Again, Amazon tried something different outside of their day-to-day work. If Microsoft had done it, well, no one would be surprised; the surprise is they didn't, and now they are playing catch-up with Azure.

 

 

 

AB testing with clear results

Like me you've probably heard various voices on the internet speaking of the importance of A/B testing. I've listened and agreed and tried a few bits here and there, but I've never seen such an obvious result as with my latest Adwords.

You may have noticed I recently launched RememberTheBin.com, a reminder service so you don't forget to put the bin out. Initially I set up a Google Adwords campaign with one advert and that didn't get much interest at all so I added a second very similar ad.

Here are my two ads:

Initially these ads were getting served equally, fortunately Google has kicked in and realised it's not making money from the second one so stopped serving it so often.

The “Free SMS, Twitter and email reminders to put the bin out” ad was my first shot, and am I glad I decided to do an A/B test on it. Talk about a useless ad. No clicks whatsoever, zero, nothing, nada, zilch, and that's over about a month!

It's interesting to note how similar the text of the two ads is and how different the responses are; they both have the same keywords, cost and even the same words!

My question to you is this: Are you running AdWords, or even SEO, or specific landing pages? Have you tried AB testing? If not, go, go now and try!  I'll wait for you...

Which brings me nicely onto the SEO AB testing. That's a lot more difficult and time consuming because you want the search engines to update their index with what you want them to see. Instead invest some cash in Google AdWords and play with the AB testing through that and see what people click on, what gets served more, then use that information in your SEO campaign.

 

Introducing NukeThemPeeps.com

I've just launched a new website, NukeThemPeeps.com, it's based on microwaving those tasty little marshmallow Peeps(R).

When I first found out about putting Peeps in a microwave to blow them up the idea appealed to me. It's taken some time for me to actually get this together, but now I think it's the perfect time, launching with Snowman Peeps having just had some snow here in the UK, together with technology being on my side (YouTube, a cheap home video camera, open source blog software). So finally NukeThemPeeps.com has been launched!

Head on over to NukeThemPeeps.com and see what a marshmallow Peep looks like when it's nuked!

 

Peeps is a registered trademark of JustBorn Inc.

I Have A Dream

Tonight, after a horrible drive I returned home to a cold flat, my central heating had broken. I had no heat and no hot water.

Fortunately I'm a bit of a DIY nut, a few quick checks on the boiler, it had electric, I tried switching it off and on again, still nothing. Turning on the hot tap, the boiler made a noise, life! - but no heat. I checked the pressure gauge, low, very low but still green and no warning lights. I was all out of ideas until I decided to try and put some more pressure in the system. The pressure gauge rose and then the boiler fired up – yay, heat, lovely heat and hot water a bonus!

So how many (be honest now) readers would check the pressure and put more water in the system? I'm guessing very few, and that's fair enough; personally I'd have been pissed if I'd spent the night in the cold and then an emergency visit from the boiler repair person resulted in a 2 minute fix to add some pressure. Wouldn't it be better if you could point your computer at the boiler and have it return an error code which linked to a page on the manufacturer's website, with suggestions on how to fix the error?

This hasn't been the only time I've returned home to a problem. A few weeks ago I returned home to find the fridge door open. Everything was fine, but I was lucky. Now how many people have returned home to a flood? Or some other situation that you would like to have known about earlier on.

I have a dream, a crazy one, and one that's very unlikely to happen. I've been thinking about many of these things for a while now, so maybe, just maybe, if I tell someone it will come true; failing that we might just take a step closer, so here goes...

Every appliance in the house should have a network connection. There. Simple hey!

Now I know it's added cost, and manufacturing costs are kept as low as possible, but I would buy the next model up if it had a good network port and wasn't a blatant rip-off for what it was, as so often happens.

So what should be networked?

  • Central heating? Yes
  • Microwave? Yes
  • Cooker? Yes
  • Fridge? Yes
  • Freezer? Yes
  • Washing machine? Yes
  • Tumble dryer? Yes
  • Light switch? Yes (I'll explain later).
  • Air con? Yes.
  • TV? Yes
  • HiFi? Yes
  • Satellite receiver? Yes
  • Wheelie Bin? Well, maybe a stand for it, waste by weight? Sneaky neighbour using your bin?
  • Any more? Yes – everything (except the kettle and kitchen sink)
  • heck, maybe even the kettle and kitchen sink (how much water do you use to wash up?)

 

I'm a big fan of home automation, you've probably figured that out from my dream, but the market sector is a horrible mess and the consumer devices are generally cheap (yet expensive) and naff.

I've got lots of X-10 devices lying around the flat that are no longer used (a very disappointing and expensive hobby in the UK).

Now I've got some Home-Easy devices kicking around, and for the most part they work well, but the range is disappointingly limited. A few simple changes and I'd be over the moon, but for now, it's still naff.

So what should this network connection onto these devices do?

I'd love to see two things: first and most importantly, reporting. Let's have some reasonable sensors wired into the network, and then secondly control of the devices as an added extra where possible.

A few examples:

  • Central Heating:
  • Reporting - internal water pressure/flow, water supply pressure/flow, gas pressure/flow, flame temperature, last on, last off, send and return temperatures for radiator system, system age, firmware version and importantly error code.
  • Control – on/off times if settable. Manual override of on/off.
  • The Fridge:
  • Reporting – internal temperature, door status, light bulb status(1), alarms, last on, last off, power usage.
  • Control – operating temperature, alarm hi and lo points.
  • The Washing Machine:
  • Reporting – wash progress, efficiency (load/cost), age.
  • Control – On time, Off time. I have a Home-Easy device on a timer so my washing machine will start automatically in the morning and shut off late at night.

Now a lot of these values are currently measured one way or another but not reported, and some are not measured, such as power usage. But something like the fridge's temperature is so core to the operation of that device that it would be hard to find one without it.

How about control? Well, a lot of devices still use mechanical systems, and maybe they should stay that way, although I wonder if it might not be cheaper and easier to provide the control through the network port and some solid state electronics.

Now I'm not suggesting every device implements a full-blown web site with Ajax-styled web pages and all that; that's a recipe for disaster! (2) Instead, what I would want to see is a REST-based API: you point something at the device and it returns a list of capabilities and endpoints that you can connect to in order to query the device's sensor values.

Why REST? Well, it's very simple: just hit the endpoint, maybe with a parameter or two, and it returns you a lump of XML, which is fantastic for automation. And if you want to make a nice UI to go with it, no problem – the device manufacturers could have a skinnable application that the consumer runs on their PC or Mac (even if it's just JavaScript downloaded from the device's web server) and you've got a nice UI where you can provide much better feedback.
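
To make that a little more concrete, the sort of client I have in mind is only a handful of lines of .NET. The device address and XML element names here are pure invention, but the point is how little would be needed to poll a sensor:

using System;
using System.Net;
using System.Xml.Linq;

class ApplianceMonitor
{
    static void Main()
    {
        // Hypothetical REST endpoint exposed by the fridge.
        const string url = "http://fridge.local/api/sensors";

        using (var client = new WebClient())
        {
            // The device returns a lump of XML describing its sensors.
            XDocument doc = XDocument.Parse(client.DownloadString(url));

            foreach (XElement sensor in doc.Descendants("sensor"))
            {
                Console.WriteLine("{0}: {1}",
                    (string)sensor.Attribute("name"), sensor.Value);
            }
        }
    }
}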

Where does this lead us? Well, you end up being able to monitor what's happening in your home, make changes based on the results, and prevent systems failing, or catch failures earlier – wouldn't it have been better for me to find out my fridge was open as I was leaving the house, or that the heating had failed at lunchtime when I could have got it seen to in the afternoon? How about a warning when I go to bed that I've left the oven on or that the front door is unlocked?

Let's talk more about the fridge, because there are two other aspects that are interesting. What happens when the power fails? The fridge and freezer start warming up, and with no power (perhaps a blown fuse), chances are there's no way for the device to warn you until it's too late.

Well, the network port can provide a low level of power, enough to ensure the monitoring and reporting system can be kept alive. Corporations everywhere are currently rolling out Power-over-Ethernet for use by VOIP phones; how about consumers? Well, POE devices are coming onto the market, many of us have decent broadband, so how long before we want a nice desk phone connected to the internet giving us free calls?

So let's leverage that: a POE network switch, combined with a small UPS, to keep the power going to your fridge, freezer, the little magic box that monitors them, and your broadband router, so you can get an SMS message or email when something goes wrong; the devices keep power when the mains is out and get to tell you there's a problem.

If we're going to implement Power-over-Ethernet we can even go a step further. This is where the light switches come in. Let's do away with the old light switch (it's got problems): pull out that twin-and-earth cable feeding it and drop in a Cat 5 cable, attached to a POE switch at one end and, at the other, to a new lighting switch that has a motion sensor, a light level sensor, a microphone and possibly a speaker, and to top it off a touch screen like your iPhone.

Where does that take us? We get much better control over the lighting, the ability to control the lights in another room (left the downstairs light on after going to bed?), the ability to automatically switch off lights in rooms when the daylight is enough or there's no activity in the room, and maybe even that Star Trek intercom system where we can page another room (kitchen-to-bedroom page for the kids, anyone?).

I have a Home-Easy sensor in the kitchen and some small cabinet lights hooked up to a Home-Easy power switch; when I go in the kitchen the lights go on, and they're usually enough for my needs; when I leave, the lights go off after about a minute. With the X-10 set-up I was able to turn any other lights in the kitchen off as well, but I can't do that (3) with the Home-Easy set-up.

Did I mention the idea of a UPS on the POE system to maintain power? Well, let's push that a step further as well. All the rage nowadays is LED lighting: low-power bulbs with long lives. What happens if it's dark and the power fails? People fall over, light candles and set fire to the house? How about using LED lighting, either the odd lamp or a nice arrangement of ceiling lamps like we see with the halogen ones nowadays? And how about if they were networked as well? When the power fails, one or two of these lamps could be driven in low-power mode from the POE switch, the backlights on the light switches can come on to help the occupants find the door, and the sensors can figure out whether there's anyone in the room and save emergency power by switching off the lights in empty rooms.

 

How about if there's a fire? You can get smoke alarms that send out RF signals; you could monitor those and, in the event of a fire, switch on lights to help the occupants find their way out. Maybe the light switches could determine which rooms have people in them, or excess heat, to help out the fire brigade.

So I've diverged from the fridge, haven't I? How many times have we heard about the fridge that figures out you're out of milk and orders it for you? We see photos of the fridge with an LCD monitor built in – now that's silly, as those things get hot, it adds a lot of expense and it will no doubt provide a poor user experience because hardware people just don't get software (4).

You know, it would be great if our food had RFID tags and the fridge and freezer could work out what's in them (oh, and the bin too (5)), but really those devices should just make this information available via the network port so we can get at it from wherever we happen to be parked with our laptop or iPhone, perhaps even in the supermarket. Query your fridge to see what's missing whilst in Tesco's, anyone?

If there's ever a time to start pulling all this together it's now: the technology's there, the green movement is on us to save energy, electricity suppliers are going the way of smart meters so we can take advantage of cheaper electricity (or, if we're not careful, use more expensive electricity), broadband adoption is huge, and people are embracing the web more and more.

In fact some of this is already happening: you can DIY your own home sensor network with ioBridge, however that's more for techy geeks like me and I believe it still needs a server. I want to see that stuff built into appliances so all you have to do is connect up a network cable.

Which brings me onto another issue: the network cabling. Not everyone's like me with 8 network outlets in the kitchen. Cat 5 cable is really easy to install, it's a lot less dangerous than mains cable and it's fairly cheap; the only difficult bit is making a nice connection at either end, and that's just a case of punching 8 colour-coded wires into the 8 colour-coded terminals on a patch panel or outlet.

I'm not the first to wire up my pad for networking, and I sure won't be the last. Scott Hanselman has a great series of blog posts, Wiring the new house for a Home Network, although I think it should be said both our set-ups are perhaps a little OTT (I could make do with 1 patch panel, 1 switch and 1 UPS, which would remove 5 of the devices in the rack, but then I'd have nothing to play with).

Scott's set-up is quite large; personally I've got a small 19-inch case in the loft which is discreet and easy to manage, but only does the Cat 5 network cables. Now not everybody wants to have network cables, so you could argue that, like my Topfield 6000 PVR, it should be wireless. If you've ever tried setting up a Wi-Fi network on a device other than a PC you will know it's a hideous job; the best bet is to have a simple network connector on the fridge/freezer etc., have it default to DHCP to get its IP address and then use a Netgear wireless link to do the wireless bit. Putting a little LCD screen on the fridge just because you want Wi-Fi is going to make the costs even worse.

Hopefully in the future part of the first fit of a house will include dropping some network wiring into each room. TVs now come fitted with Ethernet connectors and more and more appliances will, so hopefully new-build houses will improve on that.

Why do I think this is never going to happen? Well, there are a few reasons:

  • The general aim of manufacturing an appliance is to keep costs as low as possible; throwing in sensors and a network port adds to that, and the cost of that extra stuff can easily be multiplied by 4 by the time the various middlemen take their cut.
  • Hardware manufacturers generally don't get software, and putting high-tech stuff into a low-tech fridge is probably something the manufacturer is not that confident at doing.
  • Software support: hardware people are generally rubbish at software, which means that if we do get the network port it's likely to be a poorly implemented website, rather like the one offered on my Toppy.
  • Does anyone other than me and a few random geeks actually want this stuff? Now the basics, maybe, and I'm sure a few people wish they could get an alert when the freezer is defrosting so they could save the food, but the overhead of networking up the appliances? Too much, maybe. Perhaps we need a simple short-range wireless set-up rather than Ethernet?

Anyway, there you have it, my crazy idea for a tech filled house.

What am I doing to move along that route and why am I sharing this dream?

Well, I've got a little ARM microcontroller running the .NET Micro Framework and a network adaptor for it; plans include the light switch I talked about, light control using LED lamps and networked temperature monitors. Oh, and somehow I want a device to track what's in the fridge and what gets put in the rubbish bin for a new website I'm working on.

Making a hardware platform to sell is really tough and expensive, so there's no hope of me doing that; sharing my ideas will hopefully help those that can and do.

How about you? What do you see in the future house? What tech would make your life better?


  1. Finally I will be able to see if the light stays on when the fridge door is closed!

  2. My Topfield satellite receiver has a web server with nice little web pages in, but it's limited to what you can do with those pages and little automation is possible via that interface.

  3. I have a small Arduino project in progress to fix that issue.

  4. And that's one of the main reasons I say we won't see this: the hardware manufacturers don't get software, and when they think they do the result is usually poor; the only good combination I've seen around is Apple.

  5. One of my wishes for ThreeTomatoes.co.uk is to detect what's going in the bin.

This blog's moved

Well, if you're reading this you've probably managed to work that out as it is! With a great deal of pain I've ported my old S9Y blog to BlogEngine.Net, mainly so it would run on a Windows server – yes, I know S9Y in php should run on a Windows server, and it was, but in a push to move to a new server I just haven't managed to get S9Y working on php 5.2+; to be fair I think it's the php install, not S9Y, but still.

So, the rants and random bits which make up my blog are now here at blog.analysisuk.com.

Some interesting points come out of this.

  1. It's really difficult to transfer from one blog platform to another. Choose wisely!

    1. Now GoDaddy appear to have finally got their S9Y import working; having wasted $9 before finding out it wasn't, I found a free way (when you've got a domain with them) to try this. I still wasn't happy, so didn't go that route.

    2. WordPress have a reasonable import, but that wouldn't do S9Y; however, I was able to export from GoDaddy's blogcast thing. In the end I still wasn't happy with a hosted WordPress blog.

    3. Eventually I went with BlogEngine.Net, which has a tool to import blogs from RSS feeds. As it turns out that doesn't work too well with S9Y either (it misses out the entire blog content, and the BlogEngine.Net web service has a bug or two), but as it's all .NET I was easily able to hack a few lines of code to get it working – hopefully I'll tidy it up and send the diff to the team as it's on CodePlex.

  2. Choose your blog's location carefully.

    1. Now one of the problems I had was that I originally chose AnalysisUK.com/blog/ for the blog location; when I moved the AnalysisUK.com site to ASP.NET, that meant I needed php and ASP.NET (MVC) working smoothly together for URL lookup. It all worked, but every time I used a web installer to update the site it would break the php settings in IIS 6.

    2. Using the subdomain blog.* appears to be much nicer as it separates off the blog and you can even point it to a hosted service.

So, the long and the short of it is: new URL, new RSS feed and even a new physical location in the cloud for this blog. I've also been shuffling a load of my other sites around and now a lot of them are running on an EC2 instance, which has been really interesting to do; that warrants a separate blog post, so I'll put one together soon.

 

 

I Met Jeff Atwood And All I Got Was This Sticker

Coding Horror sticker

What a week last week was, my brain is seriously hurting for two, no wait, three reasons.



So naturally I have to say a big thank you to everyone that presented at both the London and Cambridge conferences. If you didn't go, you missed out: huge amounts of quality content from great speakers, all jam-packed into one day. Also thanks to Carsonified for organising the events and to Neil Davidson, Red Gate's joint CEO, for pushing to bring DevDays to Cambridge – fingers crossed he does the same for the Business of Software Conference one day!

Unfortunately by the time the Cambridge DevDays was announced I had already purchased the London ticket, but at only £85 it wasn't such a problem; sadly there was a lot of duplicate content, but enough that wasn't duplicated to make both days worth the money. I was pleased to see the Android and iPhone talks in London, as well as Jon Skeet's talk (it turns out he is real and not a robot designed to test the limits of the Stack Overflow scoring system).

The real bonus for the Cambridge event was the post-DevDays party: how often can you say “I spent Friday evening in a pub with Jeff Atwood, Joel Spolsky and Ryan Carson”? And yes, free beers from Red Gate! Apparently they're a great company to work for (did I mention free beers?). I got to speak very briefly with Jeff as he was trying to escape; now most people swap business cards, but Jeff on the other hand gave out stickers – an awesome Coding Horror sticker, a testament to Code Complete and Jeff's Coding Horror blog that I've been following for years. The only question I have now is where is worthy enough to stick it – probably on the front of my dev box (I tend to keep the case and upgrade the components as it's a great case, so it will stick with me for years – sorry about the pun!).

Apparently this was Jeff's first time in the UK, so Jeff, if you're reading this, I hope you had a good experience and got to see a slightly different side to the Austin Powers version of the UK. It was great that you decided to come over.

If you didn't make it this year, it sounds like there's a really good chance they will do the same again next year. And thanks to the careers.stackoverflow.com special offer they announced, which was only available during the course of the DevDays talks, you could sign up for $29 for 3 years rather than the normal $99/year; that alone pays for the ticket very quickly (assuming you're serious about taking control of your job hunting and are looking to work for companies who love their devs rather than certain others that don't appreciate the value a good developer brings). It will be really interesting to see how the careers site changes the face of recruiting.

DevDays badge

Write-ups of the DevDays events are slowly starting to appear, so I won't bore you with my half-remembered version of the actual content. For London check out http://www.horsesizepills.com/2009/10/dev-days-london.htm or http://johntopley.com/2009/10/30/stack-overflow-dev-days-london-2009 and for Cambridge http://thom.org.uk/2009/10/31/diary-of-a-schwag-hag/. Possibly the only thing missing was that in London Joel entered wearing a Union Jack cap and a hideous London T-shirt – very tourist. I was hoping he was going to wear a cap and gown for Cambridge, but sadly not. I should mention he changed out of the tourist outfit very quickly after his first talk!

Probably the biggest let down of the two days was my HTC Magic battery not really making it past lunch. Ouch! That's what you get for trying to use Wi-Fi on your phone.

Well, I'm off to fry my brain some more with the excellent TekPub NHibernate videos. I really want to watch the GitHub ones as well, but I don't think my grey cells can cope with that on top of everything else. Huge thanks again to everyone involved in DevDays and for bringing it to Cambridge.