I’ve been experimenting with different scenarios on Service Fabric, and one came to mind immediately: uploading large, multi-part files. I got a working prototype going in short order. I still need to fully vet it to ensure that it scales, but it “works on my machine”. 😊 You can get my code from GitHub.
Since the files that I’m uploading are large blobs, I opted for a stateless service that handles the traffic and then saves the data into Azure Blob storage. I thought it might be interesting (at least from a theoretical perspective) to try to get it working using stateful Actors: one actor per upload. However, an actor just doesn’t seem the right place to store large quantities of binary data. I could possibly use Actors to represent the transmission and store metadata such as percent complete and file information, and possibly even the chunks of bytes received, to get really fancy. The actor could essentially allow a client to ‘restart’ an upload in case it failed to complete for whatever reason. It might be an interesting evolution of this little lab.
In my Stateless Service, I’ve defined two Service Instance Listeners: one that hosts static HTML (WebCommunicationListener) and one that handles the HTTP requests that upload files according to the Kendo UI specifications (i.e. FileUploadListener).
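Registering the two listeners happens in the stateless service’s CreateServiceInstanceListeners override. A minimal sketch, assuming the listener class names and constructor shapes from this lab:

```csharp
// Sketch: registering two named listeners on a stateless service.
// WebCommunicationListener and FileUploadListener are the two custom
// ICommunicationListener implementations described in this post.
protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
{
    return new[]
    {
        // Serves the static HTML test page
        new ServiceInstanceListener(ctx => new WebCommunicationListener(ctx), "WebListener"),

        // Handles the multi-part file upload POSTs
        new ServiceInstanceListener(ctx => new FileUploadListener(ctx), "UploadListener")
    };
}
```

Giving each listener a distinct name is what lets a single stateless service expose both endpoints side by side.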
Static HTML Content
I knew that even though I was planning on uploading files to this handler from a Raspberry Pi, I wanted to test it using an HTML page. So I added a WebCommunicationListener that serves a static HTML page from the physical file system using OWIN. Note the following NuGet packages required:
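The static file hosting itself is a few lines of OWIN startup code. A sketch, assuming the Microsoft.Owin.StaticFiles and Microsoft.Owin.FileSystems packages and a wwwroot folder relative to the service’s working directory:

```csharp
// Sketch: OWIN startup that serves wwwroot from the physical file system.
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var options = new FileServerOptions
        {
            EnableDefaultFiles = true, // serve index.html at "/"
            FileSystem = new PhysicalFileSystem(@".\wwwroot")
        };

        app.UseFileServer(options);
    }
}
```

The WebCommunicationListener’s OpenAsync then just calls `WebApp.Start<Startup>(url)` with the endpoint URL from the service manifest.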
The bolded packages are the ones you need to install; the others are pulled in as dependencies of the bolded packages.
I loaded a fully licensed copy of Kendo UI into wwwroot and configured it to my liking.
I pointed the upload control at an endpoint that I would define later:
Finally, testing the static HTML content, I saw the expected result below. OK, web client: CHECK. Now onto the HTTP listener that is going to make the upload work.
File Upload Listener
My OpenAsync was vanilla enough.
I simply set up a new Task to process the request asynchronously.
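In outline, it looks something like this (a sketch built on HttpListener; in the real code the port comes from the endpoint configuration in ServiceManifest.xml rather than being hard-coded):

```csharp
// Sketch: an ICommunicationListener OpenAsync built on HttpListener.
public Task<string> OpenAsync(CancellationToken cancellationToken)
{
    var url = "http://+:8083/";          // port would come from ServiceManifest.xml
    this.listener = new HttpListener();
    this.listener.Prefixes.Add(url);
    this.listener.Start();

    // Kick off a background task that pulls requests off the listener
    Task.Run(() => this.ProcessRequestsAsync(cancellationToken), cancellationToken);

    return Task.FromResult(url);
}

private async Task ProcessRequestsAsync(CancellationToken cancellationToken)
{
    while (!cancellationToken.IsCancellationRequested)
    {
        HttpListenerContext context = await this.listener.GetContextAsync();

        // Hand each request off to its own task so the accept loop keeps running
        var ignored = Task.Run(() => this.HandleRequest(context));
    }
}
```

HandleRequest is where the response is written back; the simplest version just writes a “Hello” string to `context.Response.OutputStream` and closes it.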
This is the real magic:
Deploying this code allowed me to hit the URL endpoint I set up (i.e. http://localhost:8083/save) and see a simple “Hello” string sent back. I’m handling requests and sending back a response. This is the basic anatomy I needed. Now to process the request.
I then added a method to upload the data to Blob Storage. I knew that I’d need a stream of the data and a filename (which should be included in the request somewhere).
So I started off by interrogating the request object I get on the HttpListenerContext and found an input stream. That looked like a winner. So I hooked into that, not knowing where I’d find the filename.
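The blob upload method itself is straightforward with the classic WindowsAzure.Storage SDK. A sketch; the connection string and container name are placeholders of mine:

```csharp
// Sketch: pushing a stream into Azure Blob storage.
private async Task UploadToBlobAsync(Stream content, string fileName)
{
    var account = CloudStorageAccount.Parse(this.connectionString);
    var client = account.CreateCloudBlobClient();
    var container = client.GetContainerReference("uploads");
    await container.CreateIfNotExistsAsync();

    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
    await blob.UploadFromStreamAsync(content);
}
```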
It worked like magic. At least, files were making it into blob storage.
I’ve implemented file upload handlers in .NET before. Good old ASHX. Previously I’ve always been accustomed to the Content of the Request being fully parsed. This is not the case when using HttpListenerContext in Service Fabric. All you get on the request object is an InputStream. I initially wrote this stream straight out to blob and got some funny results. Here’s what it looks like:
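For reference, a raw multipart/form-data body looks something like this (an illustrative example, not my actual capture): the file bytes are wrapped in boundary markers and part headers, which is exactly the junk that ended up in my blobs.

```
------WebKitFormBoundaryAbc123
Content-Disposition: form-data; name="files"; filename="photo.jpg"
Content-Type: image/jpeg

<raw binary bytes of the file>
------WebKitFormBoundaryAbc123--
```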
I recognized “Content-Disposition” and “Content-Type” from my days of ASP.NET and writing ASHX HTTP handlers to do largely the same task, but like I said, they were always on a parsed Content object. To parse this content I realized I had two options: parse it myself or find a library to do it. I quickly found this on StackOverflow. The accepted answer recommended a Multipart Parser that was available on CodePlex. I was about halfway through the implementation when I noticed the next answer (with a similar number of upvotes, but not the accepted answer), which used out-of-the-box libraries from Microsoft (one of which is included in .NET 4.5). I proceeded to rip out the previous implementation and replace it with the one using System.Net.Http and System.Net.Http.Formatting.
Here’s what you need:
- Included in .NET 4.5
- For .NET 4 get it via NuGet
I proceeded to upvote the answer. I could tell that I had a winner.
I proceeded to take the sample implementation from StackOverflow. It was relatively elegant, taking in a delegate that just happened to have the same method signature as my original process method. Fancy that! 😊
Now instead of just using the Request object’s raw input stream as the parameter to my processing method I pass my processing method as a delegate to the MultipartParser.
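One way to wire that up with System.Net.Http looks like this (a hedged sketch; ParseMultipartAsync and processFile are my own names, and the delegate shape matches my blob-upload method above):

```csharp
// Sketch: parsing the raw InputStream with System.Net.Http's multipart
// support, then handing each file part to a processing delegate.
private async Task ParseMultipartAsync(
    HttpListenerRequest request,
    Func<Stream, string, Task> processFile)
{
    var content = new StreamContent(request.InputStream);

    // Re-attach the Content-Type header; it carries the multipart boundary
    content.Headers.ContentType = MediaTypeHeaderValue.Parse(request.ContentType);

    // Returns a MultipartMemoryStreamProvider with one HttpContent per part
    var provider = await content.ReadAsMultipartAsync();

    foreach (HttpContent part in provider.Contents)
    {
        // The filename arrives on the Content-Disposition header, quoted
        string fileName = part.Headers.ContentDisposition.FileName.Trim('"');
        using (Stream stream = await part.ReadAsStreamAsync())
        {
            await processFile(stream, fileName);
        }
    }
}
```

The key trick is copying the request’s Content-Type header onto the StreamContent; without the boundary, ReadAsMultipartAsync has nothing to split on.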
Kendo thinks it worked.
File made it to Azure Blob Storage and I was able to validate it was just the file content. Brilliant.