Monday 13 July 2020

Moved

Blogger's editor sucks.

As a result, I've copied all my old posts to GitHub Pages, where I can author everything in markdown.

https://alanparr.github.io/

All new posts will be at the above URL.

Friday 10 July 2020

Supporting SameSite None in .Net 4.6 or lower.

As I write this post, it is 4 days until Chromium begins enforcing the new SameSite rules again.
When they first did this in March, it caused a number of issues, including breaking website integrations with some payment gateways.
If you're on .Net 4.7.2 or higher, Microsoft supports setting SameSite to None directly. The official recommendation is that if you want to use SameSite None, you need to move up to .Net 4.7.2, which, if you are able, you should absolutely do.
However, there are those of us who are stuck on .Net lower than 4.7.2 with nothing we can do about it, and our employers want to know that their sites aren't going to start breaking come the 14th of July.
While trying to find a solution to this problem, I stumbled upon a possible workaround for those of us stuck on lower .Net versions.
var cookie = new HttpCookie("myreallyimportantcookie")
{
    Value = "myreallyimportantcookievalue",
    Secure = true,
    Path = "/",
    HttpOnly = true
};
If you inspect the cookie in the browser's dev tools, you'll see we have a cookie with the Secure and HttpOnly attributes, but no SameSite attribute.

Adding SameSite

In .Net 4.7.2, if we want to support SameSite, we simply add the SameSite attribute.
var cookie = new HttpCookie("myreallyimportantcookie")
{
    Value = "myreallyimportantcookievalue",
    Secure = true,
    Path = "/",
    HttpOnly = true,
    SameSite = SameSiteMode.None
};
Of course, we can't do this in .Net 4.6 or lower as the SameSite property doesn't exist. Instead, we can do this somewhat gross thing:
var cookie = new HttpCookie("myreallyimportantcookie")
{
    Value = "myreallyimportantcookievalue" + ";SameSite=None",
    Secure = true,
    Path = "/",
    HttpOnly = true
};
And now if we run the application, we can see the cookie has the SameSite attribute set to None.
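If you set cookies in more than one place, it may be worth hiding the hack behind a small helper so the string-append lives in exactly one spot. This is just a sketch; CookieExtensions and AddWithSameSiteNone are names I've made up, not framework APIs.

using System.Web;

public static class CookieExtensions
{
    // Piggy-backs the attribute on the cookie value; ASP.NET writes the value
    // verbatim into the Set-Cookie header, so the browser sees SameSite=None.
    public static void AddWithSameSiteNone(this HttpCookieCollection cookies, HttpCookie cookie)
    {
        cookie.Value += ";SameSite=None";
        cookies.Add(cookie);
    }
}

// Usage: Response.Cookies.AddWithSameSiteNone(cookie);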

Disclaimer #1

This solution is completely unsupported and a bit gross. The approved solution is to move to .Net 4.7.2, and if you are able, you should absolutely do that. But sometimes the real world places limits upon us, and if you are in that situation, hopefully this will get you out of a bind.

Disclaimer #2

I have only tested this in an MVC solution on .Net 4.6.1. Theoretically, it should work in versions lower than that unless the cookie value-handling semantics changed massively at some point in the past. If you want to use this, test it for yourself, and post a comment if it works, as that may help someone else.

Saturday 13 June 2020

API Head-to-head Update : AWS S3 Vs Windows Azure Table Storage Vs Rackspace Cloud Files

This is an update to my last API head-to-head from August 2014; I'm nothing if not consistent with my inconsistent posting. I've recently changed jobs to a new company that is moving to Azure but has some legacy Rackspace assets, so I thought it'd be fun to redo the test with Rackspace added. Worth noting that Rackspace's C# support is effectively non-existent: the official Rackspace SDK hasn't been updated since 2013, and this test uses the openstack.net SDK, which hasn't been updated since 2018.

The code

Initialise Provider

CloudIdentity cloudIdentity = new CloudIdentity()
{
    APIKey = "mykey",
    Username = "myusername"
};
var provider = new CloudFilesProvider(cloudIdentity);

Create container

provider.CreateContainer(containerName);

Upload file

provider.CreateObjectFromFile(containerName, testFile, blobName);

List Blobs (objects in Rackspace parlance)

provider.ListObjects(containerName).ToList();

Delete file

provider.DeleteObject(containerName, blobName);

Delete container

provider.DeleteContainer(containerName);
Can't argue with the simplicity of the code: once you've initialised the provider, everything is one line.
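For the numbers in the next section, each of those one-liners was timed with something like the simple Stopwatch harness below. The original harness isn't shown here, so the Time helper is my own sketch of it:

using System;
using System.Diagnostics;

static long Time(Action operation)
{
    // Run the operation once and report the elapsed wall-clock time in milliseconds.
    var sw = Stopwatch.StartNew();
    operation();
    sw.Stop();
    return sw.ElapsedMilliseconds;
}

// e.g. Console.WriteLine(Time(() => provider.CreateContainer(containerName)));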

The results

Worth noting that I don't have the original test file, so the file being uploaded here is just one I happened to have lying around. It is only 1KB larger, so I don't expect it will have a massive effect on the results.
Operation          S3    Azure    Rackspace
Create Container   847   668      1915
Upload 7Kb file    83    92       230
List Blobs (1)     40    17       251
Delete Blob        47    35       112
Delete Container   240   45       68
I ran the tests a few times and the numbers fluctuated, but the positions pretty much stayed the same, except that the Upload file winner switched between S3 and Azure.

Conclusion

Rackspace was slowest, but that isn't terribly surprising considering I was using a third-party library that hasn't been updated in two years. It isn't necessarily a realistic comparison, just for my own amusement.

Sunday 8 January 2017

Converting docx to PDF in Azure

As part of a recent feature, we needed to implement conversion of docx to PDF. A quick look on NuGet revealed Free Spire.Doc for .Net. This looked like a great use case for Azure Functions, so the member of the team implementing the feature quickly built it as an Azure Function, but when we deployed it, it didn't work. After a bit of googling, the cause turned out to be that, due to the sandboxed environment of Azure App Service, assemblies requiring access to GDI don't work. Further Googling with Bing revealed that pretty much every .Net docx to PDF conversion library uses GDI, so we couldn't just switch to a different library.
We needed full access to Windows to do this, which meant a VM. The cheapest Windows VM in Azure is a Basic A0 at less than £9 a month, much cheaper than the commercial document conversion services I found, which were at least £20 a month and had really weird APIs that would have been pretty tricky to integrate into our application.
I implemented a Windows service using Topshelf and the original Free Spire.Doc code for the actual conversion and installed this on to the VM. It simply polls an Azure Storage Queue for a message and deserializes the body to the following class.
private class ConversionMessage
{
    public string SourceBlobContainer { get; set; }
    public string SourceBlobName { get; set; }
    public string DestinationBlobContainer { get; set; }
    public string DestinationBlobName { get; set; }
    public string ConversionType { get; set; }
}
This simply contains the Azure Blob container and name for the source document, and the destination container and name for the converted document. There is also a ConversionType property, which only has one valid value currently; I added it to facilitate adding other conversions in the future. When a message is received, the service converts the document with Free Spire.Doc and puts the converted document in the destination container. Below is all the code for doing the conversion and saving it.
private void Convert(ConversionMessage message)
{
    Console.WriteLine(message);

    var inputBlob = GetBlobReference(message.SourceBlobContainer, message.SourceBlobName);
    var outputBlob = GetBlobReference(message.DestinationBlobContainer, message.DestinationBlobName);

    if (message.ConversionType == "docxtopdf")
    {
        LogInfo("Beginning conversion, type: docxtopdf");
        ConvertDocxToPdf(inputBlob, outputBlob);
    }
    else
    {
        LogError($"Invalid conversion type {message.ConversionType} received");
    }
}

private CloudBlockBlob GetBlobReference(string container, string blobName) =>
    _blobClient.GetContainerReference(container).GetBlockBlobReference(blobName);

private void ConvertDocxToPdf(CloudBlockBlob inputDoc, CloudBlockBlob outputDoc)
{
    using (var inputStream = new MemoryStream())
    using (var outStream = new MemoryStream())
    {
        // Pull the source docx down from blob storage and rewind the stream.
        inputDoc.DownloadToStream(inputStream);
        inputStream.Seek(0, SeekOrigin.Begin);

        var doc = new Spire.Doc.Document();
        doc.LoadFromStream(inputStream, FileFormat.Docx);

        // Convert to PDF in memory, rewind, and upload to the destination blob.
        doc.SaveToStream(outStream, FileFormat.PDF);
        outStream.Seek(0, SeekOrigin.Begin);
        outputDoc.UploadFromStream(outStream);

        LogInfo("Conversion successful");
    }
}
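That covers the conversion itself. The polling loop that drives it isn't shown above, but with the classic WindowsAzure.Storage SDK it looks roughly like the sketch below; the _queue field, the cancellation token, and the 5-second back-off are my assumptions rather than the real service code.

private void PollQueue(System.Threading.CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        // _queue is a CloudQueue pointing at the conversion queue.
        CloudQueueMessage msg = _queue.GetMessage();
        if (msg == null)
        {
            // Nothing waiting; back off before polling again.
            System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5));
            continue;
        }

        var message = Newtonsoft.Json.JsonConvert.DeserializeObject<ConversionMessage>(msg.AsString);
        Convert(message);

        // Only delete the message once the conversion has completed without throwing.
        _queue.DeleteMessage(msg);
    }
}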
Holding all of this together is an Azure Function. This function is really simple: it gets called whenever a docx file is created in Azure Blob Storage, creates the conversion message, and puts it on the queue for the Windows service on the VM to pick up.
public static void Run(CloudBlockBlob myBlob, CloudQueue queue, TraceWriter log)
{
    log.Info($"ConvertWordQuoteToPdf function processed: {myBlob.Name}");

    var filename = System.IO.Path.GetFileNameWithoutExtension(myBlob.Name);
    var cm = new ConversionMessage();
    cm.SourceBlobContainer = "docs";
    cm.SourceBlobName = $"{filename}.docx";
    cm.DestinationBlobContainer = "docs";
    cm.DestinationBlobName = $"{filename}.pdf";
    cm.ConversionType = "docxtopdf";

    var msg = new CloudQueueMessage(Newtonsoft.Json.JsonConvert.SerializeObject(cm));
    queue.AddMessage(msg);
}
One of the coolest things about this, in my view, is that it required no changes to the main application at all; we just reacted to the creation of the docx file that it was already doing.

Sunday 1 January 2017

Automating repetitive tasks with Azure Functions.

Since the announcement of Azure Functions at Build 2016, I've been looking for an excuse to use them and I finally found it.
Whenever we release a new version of our software, the actual process for building and committing to source control is really simple and takes just a few minutes. Then I get to spend at least 30 minutes on our internal job management system telling it about the new release and deprecating the old one. I then copy and paste the message from the release commit, enter it into said system, then reformat it a bit and send it out to various internal users as release notes.
It's just wrong when the admin after a release takes more time than the actual release, plus I hate doing boring repetitive tasks that I know a computer could do for me.

Enter Azure Functions. Here's how I wanted everything to work:

Source control calls a function when we commit. This function determines whether the commit is a release by checking for a specific string in the message, then puts a message on a queue for another function, which enters the release details straight into the database of our internal system (which we host in Azure). This system doesn't have an API, so we'll do it the old-fashioned way with raw SQL.
If this is successful, it then puts messages onto two more queues: one is picked up by another function which posts the message into Slack, the other goes off to a pre-existing web job which emails the release notes out.

Setup

I created a function app in the Azure portal and hooked up a repository in my Bitbucket account to the function app.
I'm going to need to access a SQL database, so I put the connection string in my app settings, same as for any other App Service app.

Let's see what the code looks like.


This first function simply takes a small JSON payload over an HTTP POST; the structure is as below.

{
    "Revision": "c1b49afddh7c",
    "Author": "Author Name",
    "Created_At": "2016-09-27T11:54:58+00:00",
    "Log": "Updated version to cpm 1.9.40\n\nAdded fluffy bunny controller",
    "Branch": "develop",
    "Project": "cpm"
}
#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"

using System;
using System.Net;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage.Table;

public static async Task<object> Run(HttpRequestMessage req, ICollector<CommitMessageTableEntity> commitLogTable, ICollector<CommitMessage> releaseQueue, TraceWriter log)
{
    string jsonContent = await req.Content.ReadAsStringAsync();
    log.Info(jsonContent);
    var data = JsonConvert.DeserializeObject<CommitMessage>(jsonContent);
    
    log.Info(data.Log);
    
    var te = new CommitMessageTableEntity();
    te.Set(data);

    try
    {
        commitLogTable.Add(te);
    }
    catch (System.Exception ex)
    {
        log.Info("An error occurred: " + ex.Message);
        return req.CreateResponse(HttpStatusCode.OK, "An error occurred, see log for details.");
    }

    if(te.IsRelease) {
        log.Info("This is a release, adding to queue so it gets added to Radius.");
        releaseQueue.Add(data);
    }

    return req.CreateResponse(HttpStatusCode.OK, "Success");
}

We log this to an Azure Storage table. I've no use for this currently, but it costs practically nothing and is an easy way to check whether the function was called if I have any problems in the future.
Then, if the commit was a release commit, as defined by the commit message beginning in a specific way, we put a message on a storage queue for the next function to be triggered. Note that I'm just adding to an ICollector in both cases; there are no explicit references to Tables or Queues, which is one of the things I really like about Functions.


Here is the function.json; this defines all my inputs and outputs. Note how the output queue binding shows up as an ICollector<CommitMessage> parameter on my Run method, this is the power of the WebJobs SDK at work.

{
  "bindings": [
    {
      "webHookType": "genericJson",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "table",
      "name": "commitLogTable",
      "tableName": "commitlog",
      "connection": "SourceIntegrationSA",
      "direction": "out"
    },
    {
      "name": "releaseQueue",
      "queueName": "releasequeue",
      "connection": "SourceIntegrationSA",
      "type": "queue",
      "direction": "out"
    }
  ],
  "disabled": false
}

The next function is the one that does all the work. There is a large amount of SQL in this function; the key for me was to get this up and running quickly, and using SQL does that. Once it has proved its worth, I'll tidy it up a bit with all the time it saves me!
I haven't included my code here as it is very specific to me; all it does is use Dapper to insert a record for the new release and deprecate the previous one.
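For anyone curious about the shape of it, here's a generic sketch of that Dapper code. The connection string name, table, and column names are all made up; data is the deserialized release message and version is a value extracted from the commit log.

using Dapper;
using System.Configuration;
using System.Data.SqlClient;

using (var conn = new SqlConnection(ConfigurationManager.ConnectionStrings["ReleaseDb"].ConnectionString))
{
    conn.Open();

    // Deprecate the current release for this project...
    conn.Execute(
        "UPDATE Releases SET Status = 'Deprecated' WHERE Project = @Project AND Status = 'Current'",
        new { data.Project });

    // ...then insert the new release as current.
    conn.Execute(
        "INSERT INTO Releases (Project, Version, Notes, Status) VALUES (@Project, @Version, @Notes, 'Current')",
        new { data.Project, Version = version, Notes = data.Log });
}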
To import Dapper from NuGet, I added a project.json, see below.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Dapper": "1.50.2"
      }
    }
  }
}

With this in place, whenever I redeploy my function, the Dapper NuGet package will be downloaded for me to use.

Right at the end, once everything has been done, we pop a simple message onto a queue to be picked up by another function that sends it to Slack, and put the release notes from the commit message on another queue to be emailed out to the internal staff members who need to know.


The Slack function is really simple. It uses a Slack client class I found on the internet somewhere and just passes the message along.
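If you'd rather not depend on a class from the internet, posting to a Slack incoming webhook is only a few lines with HttpClient anyway. A sketch, assuming an async context and with a placeholder webhook URL:

using System.Net.Http;
using System.Text;
using Newtonsoft.Json;

using (var client = new HttpClient())
{
    // Slack's incoming webhooks accept a simple JSON payload with a "text" field.
    var payload = JsonConvert.SerializeObject(new { text = message });
    var content = new StringContent(payload, Encoding.UTF8, "application/json");
    await client.PostAsync("https://hooks.slack.com/services/XXX/YYY/ZZZ", content);
}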

Sunday 3 January 2016

Windows 10 Mobile: An update

Even though my previous post was only published 6 days ago, it was written a week before that, so I’ve now got 3 weeks of usage under my belt and felt the need to post an update.

The biggest issue I’ve encountered over the last few weeks, and which I forgot to include in the previous post, was battery life. It was suffering quite a bit in daily use, to the point where regular use of email, Readit, Facebook, and Twitter was rendering the battery dead before dinner time. To combat this, I reduced the telemetry level, which seemed to have no effect. I was on the verge of quitting my bold experiment and retreating back to WP8.1 when Readit released an update with improved performance, which also seems to have solved my battery problems, so it seems the issues weren’t with the OS itself, just a single app. It’s a poignant lesson in how a single regularly used app can unwittingly completely alter your perspective of a platform.

My other issue, which I did highlight in my previous post, is performance. I’m glad to say that shortly after that post, this seemed to spontaneously improve dramatically. There were no new builds in this time period; I don’t know if there was some indexing going on that was taking an age to complete and chomping CPU cycles, but all seems well now. I think the aforementioned performance issues in Readit may have also played a part in my negative impressions.

I am still loving the ability to reply to a text without unlocking. My security concerns aside, it is really convenient!

I’ve just started working on my first app, a basic music/podcast playing app with Band control integration (as MS have deigned to only update Band 2 with music controls, not Band 1, I’m doing my own!). While the model is quite different to what I am used to as a Win32 desktop and web developer, it is quite consistent, and I’m slowly starting to get my head around things. Debugging my app on the physical phone has been completely painless, although I wish I could say the same about using the emulator!

Just 3 days ago, I was seriously contemplating going back to 8.1. I am glad to say that I am now pretty happy with WM10. It’ll be interesting to see if Microsoft keep the same pace of development with WM10 after release as they have with Windows 10 desktop, or if they repeat their previous mistake of releasing a new version and then dropping focus and putting their efforts in somewhere else.

Realistically, Microsoft are never going to unseat Android and iOS from the 1 and 2 spots, but there is still a chance of carving out a decent market share as a third player, if they don’t screw it up again. This is their chance to waste; I hope they don’t.

Monday 28 December 2015

Windows 10 Mobile: Impressions after a week.

I’ve been keeping a close eye on Windows 10 Mobile, and I haven’t really liked what I’ve seen. The navigation is a clear lift from Android, and not in a good way, and paradigms like the Pivot control, which made WP8.1 unique, seem to have disappeared. But I’ve never been one for sticking my head in the sand and clinging to the current version of something because the new version looks scary and different, so rather than wait for WM10 to be released, I decided screw it, I’d get what is essentially the RTM build on my Lumia 1020 by joining the Insider Preview programme.

This was about a week ago, and these are my impressions after using WM10 on my daily driver for a week.

The Upgrade

The upgrade itself went very smoothly. It took about an hour, and everything was exactly where I left it when it came back, even down to the Start layout, which wasn't preserved when upgrading 8.1 to 10 on the desktop. Kudos for attention to detail, though: I had the old neutered Office app pinned to my start screen, and the upgrade downloaded the new Excel, Word, and PowerPoint apps and put them in a tile group in the same spot where the old Office app was pinned. Not a massive feat of software engineering, but a nice touch.

Overall, the upgrade is much like going from Windows 8.1 to 10 on the desktop, generally a non-event with a few niggles that I'm confident will disappear over time.

Navigation

It’s still screwy, although not as bad as I expected. The mail app, for example, has the old ellipsis menu at the bottom of the page AND the new hamburger menu at the top of the page. This is just plain confusing, and I hope apps will settle on using one method of navigation over time, even if it is the hamburger. I'm yet to decide whether the "hold down the Start button to bring the screen down so you can access controls at the top of the phone with one hand" feature is a nasty hack to get around the idiotic decision to follow Android and put all navigation at the top of the phone away from the user's hand, or a clever trick to get around the idiotic decision to follow Android and put all navigation at the top of the phone away from the user's hand.

Performance

My initial impression of performance was that it was generally comparable to Windows Phone 8.1 on the same device, slower in some areas but faster in others. After a few more days of working with it, however, it is definitely slower overall. Loading the main apps I use, such as Mail, Twitter, and Readit, can take seconds. Once they’ve loaded, performance is about the same as on WP8.1; the only issue is initial load time.

Miscellaneous

The settings app is immeasurably better. The old one was just a completely unorganised list of options, with no sane grouping and a number of really useless names that don’t help you figure out where to find the setting you want. The new one follows the same layout and groupings as Windows 10, so that’s one advantage of the OSes sharing a larger amount of functionality these days.

The tiles are larger than on Windows 8, however the “Use More Tiles” option makes them too small in my opinion. Guess you can’t please everyone!

The ability to reply to texts without unlocking your phone is really convenient, although I’m slightly concerned about the security issues with such a feature.

Conclusion

My overall impression is that this initial release is a little rough. Obviously, keep in mind that even though build 10586 is RTM, I’m still running the Insider Preview, so there may be extra telemetry enabled and reduced optimisations hampering performance. Is it as smooth as WP8.1? No. However, I still prefer it to Android even in its current state, and given that Windows 10 has improved from its already pretty stable condition upon release, if Microsoft keep up the same pace with WM10 as they did with desktop post-release, then I’m confident the rough edges will be gone fairly soon.

Will WM10 finally make Windows Phone a mainstream consumer phone OS? Again, no, I highly doubt it. But there are a lot of benefits to the shared code, features, and manageability of Windows 10 and its various derivations, including WM10, that may look very tempting to businesses, especially as that space is being rapidly vacated by Blackberry. There is room for a new mainstream business phone, and WM10 may just have a chance there.