When the Internet of Things doesn't play nice, who can help? Tado Home Heating?

I’ve got an ongoing problem with the tado bridge not appearing to obtain an IP address from a BT Home Hub 5 or the BT Home Hub 2.

The reason I chose to go with tado over Hive or Nest is the cross-platform control - iPhone, Android and Windows Phone.

On a Home Hub 5 reset to factory defaults we see the following:




Showing the laptop called SCRATCH plugged into LAN1 and something plugged into LAN4 (that something is the tado bridge).
The DHCP table does not show the tado:





But the event log does mention it:


So it appears that all of my other devices (Xbox One, Xbox 360, PlayStation 3, Wii, Sky TV, Surface Pro 2, Surface Pro 3, iPhone 5s, Windows Phone 1020, Dell Latitude D630, home-built PC) connect quite happily to the Home Hub 5 and can route to the internet.

But not the tado bridge.

As I have an old Home Hub 2 (HH2) I thought I would try that to see if the tado bridge could get a DHCP lease.

The Home Hub 2 cannot go onto my home broadband (I have fibre instead of DSL, so the modem in the HH2 is not compatible), so I rigged up a simple network on the HH2 with two Ethernet connections: port 1 – tado, port 2 – laptop (called “SCRATCH”).

In the HH2 control panel (which is quite basic) all I can see is the following:


This shows that the tado bridge has not connected. Looking in the DHCP table on the HH2 we see this:

(the event log has nothing for DHCP)

Short story long: the tado has not obtained (and does not appear to have even requested) an IP address on either of the Home Hubs.

So far tado support have said that it must be my router, as I took the bridge to work and it did get an IP address on a different model of router (NetGear). I agree with them that there is a problem between the BT Home Hub 5 (and maybe the HH2 as well) and the tado bridge. But as all these other devices connect correctly to the Home Hubs, I believe that the problem must lie within the implementation of the IP stack on the tado.

I would like to work with tado to get this fault resolved. Looking at the above errors it seems that either the tado bridge in general (all of them) or my specific one does not like the BT Home Hub series of routers (as a final test I took the tado bridge to my father's house - he also has a Home Hub 5 - and it could not get an IP address there either).

Ideally I would like some way of setting a static IP on the tado; that would take DHCP out of the equation. If that is not possible, maybe a way to get logs from the tado so that we could see what it is attempting to do (but as there does not appear to be an interface on the tado to look at, I don’t know how this would be possible).
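In the meantime, one way to see what the bridge is attempting from the outside: DHCP DISCOVER packets are broadcast, so any machine on the same LAN segment can capture them. A rough sketch using the tracing built into Windows 7 and later (file path and timings are just examples):

netsh trace start capture=yes tracefile=C:\temp\tado-dhcp.etl maxsize=100
rem ...power-cycle the tado bridge and wait a minute or two...
netsh trace stop
rem open the .etl in Microsoft Message Analyzer and filter on UDP ports 67/68

If no DISCOVER ever appears then the bridge really isn’t asking; if a DISCOVER appears but no OFFER comes back, the Home Hub is ignoring it.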

If tado can’t work with me to resolve this I’ll have to take advantage of their no-quibble 30-day money-back guarantee and be left with old heating controls…

Lync 2010 Edge Certificates expiring. What does it look like?

I was at home logging into my Surface Pro 2 and my desktop Lync client was not signing in - but it was also not coming back with any errors. On further investigation the event log had the following error:



We are running Lync Server 2010 but I am running Lync 2013 (Office Pro Plus) as my client. Lync Server 2010 has never had that DNS name, so I knew that the problem was different to what the event log was showing. As I was external (DirectAccess on the laptop, but as per best practice Lync runs outside the VPN) the first place to look was the Lync Edge server.

Logging onto the Edge server, the first place to look is the event log, and the following was a glaring problem:



As there had been no DNS, firewall or proxy changes, that only left credentials. I connected to the Front End server and the following two errors gave some big clues:


 

So it appears that a certificate has expired. (As a side note, we use the DigiCert Discovery Tool - you need a DigiCert account - to check for any certificates on the estate that are about to expire. The reason this one was not picked up was that even though we were scanning the Edge server we were not checking port 4443; this has now been added as a change.) Anyway, checking the certs on the Edge server with the command Get-CsCertificate gave the following:



Both Internal and AudioVideoAuthentication have expired. Next, checking the certificates in the computer’s personal store, we can see the following:

 


This shows two certificates that have expired, which ties in with what PowerShell is telling us (for a good pointer on what you need on the Edge check Jeff Schertz's Blog: http://blog.schertz.name/2012/07/lync-edge-server-best-practices/).
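As a belt-and-braces check alongside a scanning tool, something like this PowerShell sketch, run locally on each server, will list anything in the computer store that is close to expiry (the 60-day window is just an example):

Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.NotAfter -lt (Get-Date).AddDays(60) } |
    Format-Table Subject, NotAfter, Thumbprint -AutoSize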

So, simple fix. Renew the certificates using the Enterprise CA and then assign (I’m not going to document how as again Jeff has done a great job of this here: http://blog.schertz.name/2012/01/simple-certificate-requests-in-lync/).

While I was here I thought I might as well tidy up the old Root CA certificate that the Edge server had imported, so I deleted it - the new certificates don’t use it, so what’s the harm........

.........This proved to be a mistake. Even though the Edge server didn’t host any certificates that needed the old Root CA, some certificates on the Front End servers could no longer be verified because they had been signed by the previous Root CA certificate; this can be seen here:

Simply downloading the old Root CA cert from the Enterprise Root CA (https://<certsrv>/certcarc.asp) and importing it into the Edge server made the certificates presented by the Front End servers immediately valid, and my remote Lync client could finally log in!
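For reference, the import itself is a one-liner from an elevated prompt (assuming the downloaded certificate was saved as oldroot.cer):

certutil -addstore -f Root oldroot.cer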

My lesson learned, don’t “tidy” until you have fixed the underlying problem!


Lync 2010 user getting locked out

Many posts about this on the internet, but having been through it today with a customer I think I have it cracked.

  1. Exit Lync on the user's desktop
  2. Unlock the user in AD
  3. Remove the user's certificate in the Lync Control Panel
  4. Remove the local machine Lync user certificate
  5. Remove cached credentials from the user's PC
  6. Delete the contents of C:\Users\%username%\AppData\Local\Microsoft\Communicator
  7. Delete the reg key HKEY_CURRENT_USER\Software\Microsoft\Communicator (steps 6 and 7 can be scripted - see the sketch after this list)

If Lync prompts for a username and password then you still have the problem and at this point dissolve in tears.

HELP: My application is not on screen OR Multiple to Single monitor pain.

Silly little post here but imagine you have a laptop that you use in your docking station. The docking station has two screens, and you run Application X on screen two (not your primary screen).

Now when you undock, Windows notices and puts all applications onto your single primary screen (the laptop screen).

When you re-dock, Windows again notices and says "Ahhhh - Bob wants Application X on screen two" and all is well with the world.

One day Bob can't sit at his usual desk and instead docks onto a different but identical docking station which only has one screen. He then loads Application X.

Windows notices that we are at a docking station that it recognises but doesn't appear to understand that there is now only a single screen, and as such loads Application X onto the non-existent screen two.

To move it back:
 
1) Give Application X focus
2) Press ALT + Space together

If Application X is maximised on screen two then select Restore first and press ALT + Space again

3) Use the cursor keys to move down to Move and press Enter
4) Press any cursor key to make Application X stick to the mouse pointer
5) Move the mouse to drag Application X onto the working screen
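If this happens to you a lot you can script the rescue instead. A minimal PowerShell sketch using the Win32 FindWindow/MoveWindow APIs ("Application X" is a placeholder for the exact window title, and the position/size are arbitrary):

Add-Type @"
using System;
using System.Runtime.InteropServices;
public class Win32 {
    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    public static extern IntPtr FindWindow(string className, string windowName);
    [DllImport("user32.dll")]
    public static extern bool MoveWindow(IntPtr hWnd, int x, int y, int w, int h, bool repaint);
}
"@
# find the window by its exact title and drag it to the top-left of the primary screen
$hwnd = [Win32]::FindWindow($null, "Application X")
if ($hwnd -ne [IntPtr]::Zero) {
    [Win32]::MoveWindow($hwnd, 0, 0, 1024, 768, $true) | Out-Null
}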

How much Azure storage am I using for my DPM backup?

Looking in your Data Protection Manager console you see that you are using X amount:

 
You then log in to the Azure Dashboard and see that the same amount is being reported:


Switching to your Azure Account details you see a different amount:

 
 
The reason is the compression and de-duplication that the Azure Backup service performs, as detailed here:

Will the number of GB I am charged for be exactly the same as the amount of data I am backing up from my on-premises server?
No. There are several factors that will impact the amount of storage you are consuming in the Backup service including, but not limited to, compression ratios, the rate at which the data changes, and the number of backup copies you elect to retain in the service.

Cochlear Implant Processor upgrades on the NHS

This blog post is mainly for my own future reference, but hopefully it will be useful for any parents out there looking for information on the "rules" for when upgrades should be performed.

Map of Medicine - Paediatric cochlear implant - postsurgery
original file: http://hearing.screening.nhs.uk/getdata.php?id=23891
Mirror: https://db.tt/yliGfcxi

Information of use:
"20  Provide device upgrade when appropriate  
Quick info: Provide device upgrade (internal or external) when appropriate:
• cochlear implant centre to review:
           -patient records
           -staffing
           -training; and
           -financial implications
• cochlear implant centre to develop and roll out a plan with manufacturers, that includes:
           -training of staff
           -updating of programming software
           -availability of stocks and spares
           -availability of funds for new processors
• provide appointments for upgrading processors
• follow programming protocol
• provide reports to:
           -local professionals
           -referrer
           -general practitioner; and
           -family
• consider sequential bilateral"

D09/S/a - NHS STANDARD CONTRACT FOR EAR SURGERY: COCHLEAR IMPLANTS (ALL AGES)
original file: http://www.england.nhs.uk/wp-content/uploads/2013/06/d09-ear-surg-coch-impl.pdf
http://www.england.nhs.uk/wp-content/uploads/2014/04/d09-ear-surg-coch-0414.pdf
Mirror: https://db.tt/NwAcxYQ7
https://www.dropbox.com/s/ufvkcam53jq0xs0/d09-ear-surg-coch-0414.pdf

Information of use:
"Upgrade or provision of new sound processors on average at 5 yearly intervals, where available, in order to ensure patient access to up to date technology to maximise their hearing performance and subsequently outcome from the intervention."

System Center Data Protection Manager backup to Windows Azure

At %dayjob% we have been using Microsoft Data Protection Manager since 2007 and we are currently at DPM 2012 SP1 level (not the new R2 release).

One of the new features is the ability to back up to Windows Azure. We had looked at cloud backup in the past with Iron Mountain, but at the time the pricing was prohibitive. Now with Azure we can store 5 GB a month in the cloud for free, so it was worth dipping our toes in.

First things first: create an account on the Windows Azure website – this takes all of 5 minutes, and after handing over credit card details for any data over the 5 GB I was away.

Next step is to create the certificate that we are going to use to prove to Azure that the DPM server is trusted. The documentation goes on about using MakeCert.exe from the Windows SDK, but as we have a domain Certificate Authority I decided to try to use that instead. The problem was that there appears to be no information from Microsoft on how to achieve this – in fact all documentation from Microsoft about getting Azure to connect to on-prem stuff is very poor IMHO.

First we have to create a certificate template that matches what the documentation requires:
http://msdn.microsoft.com/en-us/library/dn169036.aspx
http://www.microsoft.com/en-us/download/details.aspx?id=34608

• The certificate should be an x.509 v3 certificate.
• The key length should be at least 2048 bits.
• The certificate must have a valid ClientAuthentication EKU.
• The certificate must be currently valid with a validity period that does not exceed three years. You must specify an expiry date; otherwise, a default setting that is valid for more than three years will be used.

To create a template that fits the above requirements I connected to our Domain CA and opened the Certificate Templates Console.

Within here I duplicated the Web Server template - and hit my first stumbling block. When duplicating, the following screen comes up:



And being a modern man I thought – let's use Windows Server 2008…..

That was a mistake. Later, when I was attempting to use the certificate I had generated to connect to Azure, I was getting errors that the certificate specified was not associated with any backup vaults:



After checking the Agent log in Program Files\Windows Azure Backup Agent\Temp\CBEngineCurr.errlog I saw the following line:

WARNING --->System.Security.Cryptography.CryptographicException: Invalid provider type specified.

This turns out to be a problem when the software (I’m guessing the Agent) cannot understand the newer CA template versions (http://serverfault.com/questions/475525/the-private-key-for-the-certificate-that-was-configured-could-not-be-accessed).
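The underlying cause seems to be that the Windows Server 2008 and later template versions store the private key with a CNG key storage provider, which the older .NET crypto classes cannot open. You can check which provider holds a key from an elevated prompt on the DPM server - a "Key Storage Provider" in the output is the tell-tale:

rem list the machine personal store and look at the "Provider =" line for each cert
certutil -store My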

So at this point – choose Windows Server 2003 Enterprise.

Give the new Template a name and make a note of it – you’ll need this later.
Choose to Publish the Certificate in Active Directory




Under Extensions -> Application Policies add in client authentication (http://social.technet.microsoft.com/Forums/windowsserver/en-US/0e039144-1cf2-4370-a0a8-0f4e8ca4aff4/problem-issuing-web-server-certificate-with-enhanced-key-usage?forum=winserversecurity)



At this point you want to take a walk or do something else so that the template has time to replicate through Active Directory. Make a coffee or leave for the weekend - it will all depend on the size of your Active Directory estate.
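If you are impatient you can check whether the template is available yet; a sketch, run on (or pointed at) the issuing CA:

rem lists the templates the CA will issue - the name is the one you noted earlier
certutil -CATemplates | findstr /i DPMCertificate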

Now on the DPM server we want to create a certificate. To do this we are going to use certreq (http://technet.microsoft.com/library/cc725793.aspx)

I created a request.inf file with the following parameters:

[NewRequest]
; Subject should be the FQDN of the DPM server
Subject = "CN=SERVERNAME.DOMAIN.local"
ExportableEncrypted = TRUE
KeyLength = 2048

[RequestAttributes]
; the template name you made a note of earlier
CertificateTemplate = "DPMCertificate"

Note that the Certificate Template to use is the one I told you to make a note of earlier.

Now from a command line run: certreq -new
and select the .inf file you created earlier.
(If at this point you get the error “Template not found. Do you wish to continue anyway?” then either your template name is wrong or it’s not yet available on the certificate authority.)
Save the resulting request file.

From the command line run: certreq -submit
Select your certificate authority (if applicable) and then save the resulting certificate file.

This Certificate file is what we need to submit to Azure so remember where it is saved.
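Putting the whole exchange together with example file names (certreq -accept installs the issued certificate and marries it up with the private key from the request):

certreq -new request.inf request.req
certreq -submit request.req dpmcert.cer
certreq -accept dpmcert.cer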

We might as well import the certificate into the local machine personal store now so it appears here:




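Before heading to Azure it is worth sanity-checking the certificate against the requirements listed earlier; a quick PowerShell sketch (the subject is the example from the .inf above):

$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -eq "CN=SERVERNAME.DOMAIN.local" }
$cert.Version                            # should be 3
$cert.PublicKey.Key.KeySize              # should be at least 2048
$cert.EnhancedKeyUsageList               # should include Client Authentication
($cert.NotAfter - $cert.NotBefore).Days  # validity period, no more than three years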
I now switched back to my Azure account and started to provision my cloud storage. I had to research this part a little to find which would be the best region to place the store in.



After a bit of Binging I chose North Europe based on http://www.robblackwell.org.uk/2011/04/12/azure-northern-europe-is-dublin-and-western-europe-is-amsterdam.html (as an aside, you can see your fastest connection at a point in time by using http://azurespeedtest.azurewebsites.net/)

Click Create Vault.


After a bit of flashing, whizzing and popping we get a new backup vault under Recovery Services



Clicking on the vault name you created earlier takes you to this page where you can upload the certificate (Manage Certificate):



We get another nice flashy upload graphic



If your certificate is invalid for some reason you’ll get an error which will help you to correct the problem and create a new one (I saw this a lot!):



Once you have a good certificate:




Now to download the DPM agent:


The agent can be downloaded from: http://go.microsoft.com/fwlink/?LinkId=288905

Now for some install screens (I'm sure if you've read this far you know how to click Next and Finish)
Once the Agent is installed and patched to the latest version (via Windows Update) you can go on with the configuration within the DPM console - click Online and then Register:



Click the browse button and any valid certificates are shown – select the one that corresponds to the certificate you uploaded (if need be, compare the thumbprints)





Once the certificate has been matched against the certificates uploaded to Azure, your associated backup vault should become visible:

(If at this stage you are getting errors then it could be proxy authentication - check the agent error log again.)

Now we get the chance to add in a proxy server (hey Microsoft – how about asking for this earlier so we don’t get proxy authentication issues!)

Much like when we are setting up servers to back up, we get the chance to choose how much bandwidth we give over to DPM:

We now need to choose where restores will go if/when we want to restore from Azure:

At this stage you create the passphrase that DPM will use to encrypt your backups before they are sent to the cloud. You can click Generate Passphrase and Microsoft will helpfully give you a nice 36-character GUID to use, or you can generate your own by mashing the keyboard!
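As an aside, PowerShell will happily mint the same style of 36-character GUID for you:

# generates a 36-character GUID suitable for use as the passphrase
[guid]::NewGuid().ToString()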
 

Success:
 
 
You now need to add online protection to a supported data source within DPM and perform a cloud backup. Once the backup is complete you can see from the Azure Management Portal the amount of data being held:
 
This information is also available in the DPM console:
 
I hope that someone finds this information helpful. The pricing of Azure Storage makes this a very attractive option for a cold offsite backup, and I look forward to more DPM workloads being supported in the future (hint: SharePoint).
 
Comments as always are very welcome.
 
Update 22/10/2014: You no longer need to do the certificate creation according to this document: http://azure.microsoft.com/blog/2014/09/11/getting-started-with-azure-backup