Tip For Working From Home

Some people have the fortune of being able to work from home as part of their employment. Working from home, or telecommuting, can be extremely productive for a lot of people. It can also be counterproductive if you’re not experienced at it. As a regular home worker, I have learned a lot of ways to make the valuable, uninterrupted time as productive as possible.

Routine

It’s important to keep your usual morning routine (apart from the travelling to the office part, obviously) to help switch your brain to work mode. Lounging around in your dressing gown with your laptop is something for a Sunday morning; do it on a work day and you’ll struggle to associate the time with work.

My normal morning routine looks something like this:

  • 6:00 – Alarm goes off.
  • 6:05 – Have a cuppa and breakfast with the wife and kids.
  • 6:30 – Shepherd the kids into their uniforms for school.
  • 7:00 – Get a shower and get dressed for work.
  • 7:20 – Shepherd the kids into coats and shoes ready to leave.
  • 7:30 – Leave to drop the kids off at various locations.
  • 7:50 – Leave the hometown en route to work.
  • 8:20 – Arrive at work.

My “working from home” routine is pretty much the same:

  • 6:00 – Alarm goes off.
  • 6:05 – Have a cuppa and breakfast with the wife and kids.
  • 6:30 – Shepherd the kids into their uniforms for school.
  • 7:00 – Get a shower and get dressed for work.
  • 7:20 – Shepherd the kids into coats and shoes ready to leave.
  • 7:30 – Leave to drop the kids off at various locations.
  • 7:55 – Arrive back home ready for a day of work.

Since I usually wear a work-branded polo shirt and jeans for work, I just wear the same when working from home. Your mileage may vary if you have to wear a suit though.

Workspace

It’s important to have somewhere to work from while telecommuting. Sitting on your sofa with your laptop might seem like a great idea, but take it from me, you will soon get too comfortable and start slouching. Then you start to become distracted and turn the TV on. This is fine if you can work like that, but I bet your productivity is affected.

The best scenario is having a desk and a comfortable office chair to sit in, preferably in a different room to the hustle and bustle of your house, if there are other family members home. If you don’t have a desk, the dining table is a good alternative, providing enough space to spread your work out a little, with coffee cup support within arm’s reach.

Connectivity

The vast majority of work is performed on laptops these days. Unless you are simply writing a proposal that doesn’t need internet access, you need to think about connectivity.

In some cases, a straight WiFi connection will suffice for access to emails and the internet, but you might need to consider access to internal business systems. Since each company is different, contact your ICT department to find out what options are available to connect to the resources you require for your job.

The company I work for offers a device called a Teleworkers Gateway. This device plugs into my own router and gives me the option of either a “work” WiFi connection or a cabled Ethernet connection. They also offer a software VPN solution, but the Teleworkers Gateway is the simpler and more convenient option.

Welfare

The beauty of working from home is having unlimited coffee without feeling obliged to make one for everybody else in your office each time. Just make sure you stop for lunch at your normal time.

Another benefit of working from home is your bed. Seriously. Feeling a bit burnt out by lunchtime? Have a 45 minute nap. You’ll be nice and refreshed and more productive in the afternoon.

Moderation

Like all good things, telecommuting should be used in moderation if it isn’t your primary working arrangement. Working from home is great when you need to focus on a particular task without the interruptions of the office. Do it too much though, and people get used to you not being in the office and start calling, emailing and instant messaging you more, thus increasing the interruptions you were trying to avoid.

Meters Music OV-1 – My Opinion

For anybody who has been living under a rock since CES 2017, there’s a new kid on the block when it comes to prosumer headphones: the Meters Music OV-1 by Ashdown Engineering.

I recently purchased a set of these headphones after watching countless interviews with Ashdown at CES and reading some reviews of the headphones by tech websites. The reviews weren’t all 100% positive, but that’s to be expected in this day and age, and I tend to listen to different styles of music than the reviewers, so I decided to give them a try all the same.

Packaging

Opening the packaging wasn’t awe-inspiring like some high-end products I’ve received, but it was a pleasant experience all the same. The inclusion of two cables, one with an inline mic and one without, was a nice touch, and they are decently built cables. The included Micro USB charging cable, on the other hand, was just some generic cable you can probably buy from eBay for a few pence.

The headphones themselves were in the included hard case but lacked any additional packaging. The inclusion of some sticky protective plastic on the alloy cups would have been nice, and would probably have prevented the small scuff mark on the left cup. It looks like the box may have been dropped in transit and the holder for the cup has hit the cup. It’s hard to see in normal use, but I know it’s there!

OV-1 Scuff mark on cup.jpg

Aesthetic & Build Quality

The headphones are pleasing on the eye with good industrial design, and the inclusion of the VU line meters on the ear cups is cool but a bit of a gimmick. The overall look of the sweeping, single mount cup holder, coupled with the stylish, albeit plastic, hinge mechanism is a well designed and engineered look for these alloy headphones. The transition from alloy to protein leather, with its neatly stitched seams, is also well executed.

The biggest letdown from an aesthetic perspective is the use of black plastic on the hinge and hanger mechanism. Even just colour-coding the plastic to the alloy would have served their vision better, in my opinion.

The headphones feel well constructed though, and I can’t really knock the build quality, other than the slightly over-tightened hex screw holding the left ear cup in, which left a couple of burrs I had to file off to appease my OCD.

IMG_1907.jpg

A bit more consideration of the positioning of the three-way switch (EQ, Passive, Active Noise Cancelling) would have been nice. It is currently located under the ear cup holder, meaning you have to remove the headphones to change the mode. I feel it would have made the experience just that little bit more pleasing if it had been somewhere a little easier to reach during use.

Comfort

From a comfort perspective, they take a bit of adjustment and getting used to, but after a couple of days you barely notice they are on your head. During the initial honeymoon period, when they were on and off my head every few minutes for adjustment, the clamping force of the sprung-steel headband was a bit overwhelming, and didn’t ease until I had forced the headband apart a few times over the course of a day.

The headband also applied pressure to the top of my head until I found the correct height for the height-adjustable cup sliders. Unfortunately, the mechanism on the sliders is too weak, and the cups tend to lose their position whenever I stretch the headphones enough to put them over my head. This is annoying and needs to be fixed in any future release.

IMG_1906.jpg

To give Meters their due, the fake “protein” leather on the ear cups is comfortable. It isn’t sweat-proof as claimed, but sweat is a lot less noticeable, to the point that it didn’t bother me like it does on other headphones.

Noise Cancellation

In the interest of full disclosure, I am yet to try the noise cancelling feature (ANS) on any mode of transport, which is the main use case for such technology. I have, however, tried it in some server rooms, where the constant drone of server fans is drowned out to the point of near silence while wearing the headphones without an audio source connected. The slight hiss of the ANS system doing its thing is just audible, until I start playing some music, of course.

All in all I’m happy with the ANS system and would even stretch to call it the best I have used yet, although I’ve probably only experienced a total of five other noise cancelling systems in the past.

Sound Quality

I’m no audiophile, but I do appreciate great sounding music. I also listen to a wide variety of music, so I’ve put the headphones through their paces with a few genres, from Heavy Metal to UK Hardcore and everything in between.

The headphones sound great and have decent volume when listening in passive mode (or off). They offer crisp high ends, prominent mid range and deep bass. Switching the ANS on reduces volume somewhat, and dampens the high ends a little but still maintains a decent sound quality with adequate volume.

EQ mode, the final sound option, sounds terrible. There’s no point beating around the bush: the bottom end is far too overpowering and crushes the high end. I can’t find a single song that sounds good in this mode, so I just don’t use it. I seriously wish the headphones allowed the user to tune and tweak the EQ setting over the USB connection.

Summary

All things considered, I like the OV-1 headphones. They are the best sounding headphones I have used (apart from the EQ setting) and they are extremely comfortable for prolonged use at work. As a developer I often spend hours on end listening to music to drown out the bustle of the office. They also give a bit more depth to podcasts and significantly improve their usually mediocre sound quality.

I’m still not sure if I would wear them in public though. Maybe if Meters allowed you to control whether the VU meters were on or off independently of the noise cancelling feature I might be more inclined to, but as they stand, they could look a bit obnoxious in certain environments.

In conclusion: if you are an audiophile, don’t bother. If you are a run-of-the-mill prosumer who likes gadgets and great sounding music, then these headphones are good bang for your buck. Plus, they don’t feed more of your hard-earned cash into Apple’s coffers.

SharePoint 2013 or 2016 and ADFS

Ever wondered how to configure SharePoint to use ADFS for user authentication? Googled it and found it confusing? Me too! Don’t despair though… the PowerShell is pretty straightforward and it only gets easier the more often you do it…

Export ADFS Signing Certificate

First of all, log in to the ADFS server and export the signing certificate. The following PowerShell should be run as administrator and will export the certificate to c:\ADFSSigning.cer.


$certBytes=(Get-AdfsCertificate -CertificateType Token-Signing)[0].Certificate.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert)

[System.IO.File]::WriteAllBytes("c:\ADFSSigning.cer", $certBytes)

If your ADFS signing certificate was issued by a certificate authority and not self-signed by ADFS, you must ensure the entire certificate chain is trusted by SharePoint as well. I won’t cover this process here, but you can refer to another post on the topic here.

Add ADFS Relying Party Trust

While you are on your ADFS server, you may as well create the relying party trust in, you guessed it, PowerShell. But first you need to create a text file with the following contents; for ease, let’s say c:\rules.txt. These are the transformation rules for the relying party. I find this is all that is really required to start with, as User Profile Sync will grab the rest.


@RuleTemplate = "LdapClaims"
@RuleName = "SharePoint Attributes"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
=> issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress", "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn", "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname", "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname"), query = ";mail,mail,sn,givenName;{0}", param = c.Value);

Then edit the variables in the PowerShell below and execute it. Here’s a quick explanation of the variables:
$rules – The path to the rules.txt file you have just created.
$name – The name of the relying party trust.
$urn – If you don’t know what this is, just leave it.
$webapp – The URL for the first web application you are going to use. I’ll show you how to add another application later. Don’t put a trailing slash on the URL.

$rules = "c:\rules.txt"
$name = "SharePoint Site 1"
$urn = "urn:sharepoint:site1"
$webapp = "https://site1.domain.local"
$endpoint = $webapp + "/_trust/"
[string[]] $urnCollection = $urn, $webapp

Add-AdfsRelyingPartyTrust -Name $name -ProtocolProfile WSFederation -WSFedEndpoint $endpoint -Identifier $urnCollection -IssuanceTransformRulesFile $rules

You can now check in the ADFS console; your new trust should be listed under Relying Party Trusts.

Add Token Signing Certificate to SharePoint

Log in to the SharePoint server that hosts Central Administration, copy the ADFSSigning.cer file to the C drive, then open the SharePoint Management Shell as administrator. The following PowerShell will import the certificate so that SharePoint trusts it.


$path = "C:\ADFSSigning.cer"
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($path)
New-SPTrustedRootAuthority -Name "ADFS Token Signing Cert" -Certificate $cert
#Keep this window open for the next step

Again, if your ADFS signing certificate was issued by a Certificate Authority instead of being self-signed by ADFS, you must make sure SharePoint trusts all other certificates in the chain.

Create The Authentication Provider In SharePoint

To add ADFS as an authentication provider to SharePoint, use the following PowerShell in the same window in which you imported the certificate:

#Map the email address, UPN, role and SID claims from ADFS
$emailClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming
$upnClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn" -IncomingClaimTypeDisplayName "UPN" -SameAsIncoming
$roleClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming
$sidClaimMap = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/primarysid" -IncomingClaimTypeDisplayName "SID" -SameAsIncoming

#Update the following to match the details entered earlier if you changed them
$realm = "urn:sharepoint:site1"
$signInURL = "https://adfs.domain.local/adfs/ls"
$ap = New-SPTrustedIdentityTokenIssuer -Name ADFS -Description ADFS -realm $realm -ImportTrustCertificate $cert -ClaimsMappings $emailClaimMap,$upnClaimMap,$roleClaimMap,$sidClaimMap -SignInUrl $signInURL -IdentifierClaim $emailClaimMap.InputClaimType

Providing you don’t get any errors, SharePoint should now be able to use ADFS as an authentication provider.

Change SharePoint Application to Claims Based

Now you need to make one of your SharePoint web applications use ADFS for authentication. There are a number of caveats with this though:

  • You will lose access to the web application, as your permission is set on the account SharePoint knows from Windows authentication. I’ll show you how to fix this soon.
  • Your user profile will no longer be associated with your account. This is because the User Profile Service has synced your profile with Active Directory using your Windows account.
  • Search will be unable to crawl the SharePoint site, as the crawler doesn’t support claims based authentication.

I’ll show you how to fix these things in a future post, otherwise this post will end up being a monster.

For now, I’d recommend performing these steps on a newly created SharePoint Web Application of your choosing. The commands assume a site called https://site1.domain.local.

Open up the SharePoint management shell as administrator and run the following commands:

$webApp = Get-SPWebApplication -Identity "https://site1.domain.local"
$sts = Get-SPTrustedIdentityTokenIssuer "ADFS"
Set-SPWebApplication -Identity $webApp -AuthenticationProvider $sts -Zone "Default"

If you now try to access the site, you should be redirected to your ADFS sign-in page. Once you log in, you will probably get the “This site hasn’t been shared with you” message. Keep reading for a fix.

Changing More SharePoint Applications to Claims Based

The above PowerShell will certainly allow you to change another web application across to claims based authentication, but with one little issue. Upon doing so, every time you try to access the second web application you will end up back at the first one after you log in. This is because the realm of the second web app is different, and ADFS will just send you straight to the first site configured. Luckily though, there is no need to mess about with certificates for the second site.

On the SharePoint server, open the SharePoint Management Shell as admin and use the following commands to add another realm to the authentication provider:


$urn = "urn:sharepoint:site2"

$ap = Get-SPTrustedIdentityTokenIssuer
$uri = new-object System.Uri("https://site2.domain.local")
$ap.ProviderRealms.Add($uri, $urn)
$ap.Update()

Then you need to add another Relying Party Trust in ADFS to handle requests for the second SharePoint site. To do this, follow the steps in the Add ADFS Relying Party Trust section of this post, but remember to update the strings in the PowerShell commands to represent your second site.
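As a sketch, the earlier commands become the following for the second site; the name, urn and URL are placeholders you should adjust to your environment, and the rules file from earlier can be reused:

```powershell
# Run on the ADFS server. Reuses the transformation rules file created earlier.
$rules = "c:\rules.txt"
$name = "SharePoint Site 2"
$urn = "urn:sharepoint:site2"
$webapp = "https://site2.domain.local"
$endpoint = $webapp + "/_trust/"
[string[]] $urnCollection = $urn, $webapp

Add-AdfsRelyingPartyTrust -Name $name -ProtocolProfile WSFederation -WSFedEndpoint $endpoint -Identifier $urnCollection -IssuanceTransformRulesFile $rules
```

Note that the urn here must match the one you add to the provider realms on the SharePoint side.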

Regain Access to Site With Claims Based Authentication

Now that your site(s) use claims based authentication, you need to re-add yourself as a site collection administrator. The following PowerShell will set your ADFS formatted account as the secondary site collection owner. You can also do this in Central Administration, but I won’t go into that route in this post.


Set-SPSite -Identity "https://site1.domain.local" -SecondaryOwnerAlias "emailaddress@domain.local"

Updating the Signing Certificate in SharePoint

For reasons Microsoft hasn’t given, SharePoint can’t use the Federation Metadata published by ADFS to update the signing certificate when it is renewed at the end of its validity period, leaving it up to the administrator to do this manually.

The process of updating the certificate isn’t particularly complex: export the new certificate, import it on the SharePoint server, then update the Trusted Token Issuer in SharePoint to use the new certificate. The problem is that if you forget to do this in the brief window between the ADFS server renewing its certificate and the old certificate expiring, nobody will be able to log in to SharePoint. And trust me, you WILL forget at least once.

Fortunately, Jesus Fernandez has a solution over on MSDN in the form of a PowerShell script that can be scheduled to run on the SharePoint server. The script reads the aforementioned Federation Metadata from ADFS and downloads the current token signing certificate. If it differs from the one SharePoint is using, the script adds the new certificate to SharePoint and updates the Token Issuer to use it. Nifty, huh? I’d strongly recommend this as a solid option.
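If you prefer to do it by hand, the manual update amounts to repeating the earlier certificate import with the renewed certificate and pointing the existing token issuer at it. A hedged sketch, assuming the renewed certificate has been exported to C:\ADFSSigningNew.cer (a placeholder path) and the issuer was named ADFS as above:

```powershell
# Run in the SharePoint Management Shell as administrator.
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\ADFSSigningNew.cer")
New-SPTrustedRootAuthority -Name "ADFS Token Signing Cert (renewed)" -Certificate $cert
Set-SPTrustedIdentityTokenIssuer -Identity "ADFS" -ImportTrustCertificate $cert
```

The scripted approach is still the safer bet, as it removes the human element entirely.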

Workflow Manager Configuration Error

I recently attempted to configure Workflow Manager 1.0 and Service Bus 1.0 for use by SharePoint 2016, using a certificate issued by our domain CA instead of self-generated certificates. I ran into the following error though.

System.Management.Automation.CmdletInvocationException: Could not successfully send message to scope '/WF_Management' despite multiple retries over a timespan of 00:02:07.8300000.. The exception of the last retry is: A recoverable error occurred while interacting with Service Bus. Recreate the communication objects and retry the operation. For more details, see the inner exception.. ---> System.TimeoutException: Could not successfully send message to scope '/WF_Management' despite multiple retries over a timespan of 00:02:07.8300000.. The exception of the last retry is: A recoverable error occurred while interacting with Service Bus. Recreate the communication objects and retry the operation. For more details, see the inner exception.. ---> System.OperationCanceledException: A recoverable error occurred while interacting with Service Bus. Recreate the communication objects and retry the operation. For more details, see the inner exception. ---> Microsoft.ServiceBus.Messaging.MessagingCommunicationException: Identity check failed for outgoing message. The expected DNS identity of the remote endpoint was 'WFMServer1.contoso.com' but the remote endpoint provided DNS claim 'WFMServer3.contoso.com'. If this is a legitimate remote endpoint, you can fix the problem by explicitly specifying DNS identity 'WFMServer3.contoso.com' as the Identity property of EndpointAddress when creating channel proxy. ---> System.ServiceModel.Security.MessageSecurityException: Identity check failed for outgoing message. The expected DNS identity of the remote endpoint was 'WFMServer1.contoso.com' but the remote endpoint provided DNS claim 'WFMServer3.contoso.com'. If this is a legitimate remote endpoint, you can fix the problem by explicitly specifying DNS identity 'WFMServer3.contoso.com' as the Identity property of EndpointAddress when creating channel proxy.

To cut a long troubleshooting story short, the problem was with the certificate I had requested from the CA; more specifically, the DNS extension. I had the following DNS entries in the certificate:

  1. *.contoso.com
  2. WFMServer1.contoso.com
  3. WFMServer2.contoso.com
  4. WFMServer3.contoso.com

For whatever reason, WFM only seems to look at the final DNS entry when trying to add the host to the WFM farm. To confirm this, I tried the installation from all three hosts; it worked fine on WFMServer3.contoso.com, but not the other two.

I’m still not entirely sure whether it was WFM or SB causing the issue, but I fixed it by revoking the certificate on our CA and re-installing SB and WFM using a certificate with wfm.contoso.com as the Common Name and the DNS entries in the following order:

  1. WFMServer1.contoso.com
  2. WFMServer2.contoso.com
  3. WFMServer3.contoso.com
  4. *.contoso.com

 

Controlling the Startech VS421HDPIP with C#

As part of a project I have been working on to spec and install a video conferencing solution, I purchased a couple of Startech VS421HDPIP units to switch between camera inputs. I chose this unit for a few reasons: it has enough inputs for the project, it’s capable of doing picture-in-picture, it can be controlled via telnet or RS232, our supplier could get it for half the RRP, and it’s made by Startech, whose kit has never left us in an awkward situation.

When I first hooked it up it just worked, and the built-in presets were decent. Unfortunately it didn’t have presets to support the use cases I required, but I expected that from reading the manual prior to purchasing. The presets can be changed from the web interface, which is the only downside to this unit… it’s utter pants. Fortunately it supports some pretty advanced commands via telnet, so all is not lost. After playing with the telnet prompt for a while, I figured I could probably write some form of application to change the video output layout from a PC, which would certainly make it more user-friendly. The VC solution is going to be installed in a remote office, and anything that makes operation easier for the end users is a big plus. This will also allow me to have basically unlimited “presets” to cover more use cases without reconfiguration. Bonus.

Driver

I decided that the application should probably allow control of the PTZ cameras too to encompass the entire VC solution in one control panel, so a simple “driver” style API would be the best bet for controlling each component. The driver for the Startech VS421HDPIP only took two hours to write, including an application to test the class, and is incredibly simple to use. It is available to download at the end of this post. Here’s an example of how to use it.

using System.Net;

//create a new instance of the VS421HDPIP class.
VS421HDPIP Conn = new VS421HDPIP();

//set the unit's IP address and telnet port.
Conn.Ip = IPAddress.Parse("192.168.1.6");
Conn.Port = 23;

//open the connection to the unit.
Conn.Open();

//execute commands. See the list of commands below.
Conn.PowerOn();  //Power on the unit from standby
Conn.Recall(VS421HDPIP.WindowState.Fav1);  //Recall the Fav.1 preset
Conn.SetImagePosition(VS421HDPIP.InputChannel.Input2, 1420, 790); //Set the position of channel 2

//always close the connection when done. The unit only allows one session at a
//time; leaving a session open means rebooting the unit or waiting for its
//session timeout before a new connection can be opened.
Conn.Close();
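Because a dangling session forces a reboot or a wait for the unit’s timeout, one defensive pattern (a sketch using the same class as above) is to wrap the commands in try/finally so Close() always runs:

```csharp
using System.Net;

VS421HDPIP Conn = new VS421HDPIP();
Conn.Ip = IPAddress.Parse("192.168.1.6");
Conn.Port = 23;

Conn.Open();
try
{
    Conn.PowerOn();
    Conn.Recall(VS421HDPIP.WindowState.Fav1);
}
finally
{
    //the single telnet session is released even if a command above throws.
    Conn.Close();
}
```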

Unfortunately the telnet prompt doesn’t let you retrieve information from the unit. Maybe this is something Startech will add in future models, but it is what it is.

Commands

The commands I implemented are basically identical to those listed in the manual for the VS421HDPIP. I won’t go over the available parameters for each here; they are similar to those in the manual as well, so I’m sure they’d be easy to figure out with IntelliSense.

  • PowerOn() – Power on the unit from standby.
  • PowerOff() – Place the unit in standby.
  • SetOutResolution(Resolution res) – Sets the output resolution.
  • OsdDisable() – Disable the On Screen Display.
  • OsdEnable() – Enable the On Screen Display.
  • OsdHOffset(int offset) – Sets the horizontal offset of the OSD.
  • OsdVOffset(int offset) – Sets the vertical offset of the OSD.
  • OsdTimeout(int timeout) – Sets the timeout of the OSD.
  • OsdGain(int gain) – Sets the gain of the OSD.
  • SetBrightness(InputChannel chan, int level) – Sets the brightness level of an input.
  • SetContrast(InputChannel chan, int level) – Sets the contrast level of an input.
  • SetSaturation(InputChannel chan, int level) – Sets the saturation level of an input.
  • SetHue(InputChannel chan, int level) – Set the hue level of an input.
  • Mute() – Mutes the audio output of the unit.
  • Unmute() – Unmutes the audio output of the unit.
  • SetImageSize(InputChannel chan, int h, int v) – Sets the size of a channel.
  • SetImagePosition(InputChannel chan, int h, int v) – Sets the position of a channel.
  • ChannelImageOn(InputChannel chan) – Turns a channel on.
  • ChannelImageOff(InputChannel chan) – Turns a channel off.
  • ChannelPriority(InputChannel chan, int priority) – Sets a channel’s priority.
  • SetChannelLabel(InputChannel chan, string label) – Sets a channel’s label.
  • StoreCurrentConfiguration(Favorite loc) – Stores current settings to a preset.
  • MirrorOn() – Makes the unit mirror its output (rear projection).
  • MirrorOff() – Disables the output mirror feature.
  • SetOutputRotation(Rotation value) – Rotate unit output (ceiling hung projector).
  • SetFadeDuration(Fadetime time) – Sets fadetime between windows.
  • SendTelnetCommand(string command) – Send a telnet command to the unit.

Download

I have provided the driver I wrote in case it is useful to anybody. I do so with absolutely no warranty or support and you use it at your own risk.

Compiled DLL
Visual Studio Project Files