Feeling Suave

[SuaveIO logo]

Well, it's been a crazy few months and I'm back to picking up my blog. I'm rethinking how I handled the blog last year, and will instead be rebooting it as a place to catalog projects that I work on. This is mostly so I won't lose track of what I learn, since my current gig doesn't open source much of anything.

As of late I've been splitting my time between developing microservices and web applications in .NET and developing Chef cookbooks. I've done some of the Chef development outside of work, as I didn't want reusable cookbooks to go to waste. I'm super enthused, as my web deployment cookbook actually had someone contribute to it, and it has been getting use from folks other than me. WebDeploymentToolkit is currently just a resource provider and documentation on how to use it; however, I'm hoping to eventually extend it to also use regiis encryption. I'm also thinking of helping out the IIS cookbook Opscode has with some refactoring, just to reduce the warnings that get thrown at me. Back at work, in addition to my development, I've been teaching courses in Chef to help inspire my coworkers to develop their own cookbooks, as a lot of folks don't have any development in their background. Given the increasing amount of both discussion and use of it, I believe I'm infecting my coworkers with knife skills.

Outside of work I've been focused on data science. The podcasts Talking Machines and Partially Derivative have been great, inspiring me to learn as much as I can. As a fun side project I've been attempting to make a retail analysis engine to provide some simple reporting. My first task: sales forecasting. Now before I give the link, I'll note that anyone with a statistics background will find it laughable, and anyone with a strong algorithmic background may be overwhelmed with nausea. Suffice to say InventoryForcast is a far cry from Jet.com's pricing engine, but it has been a great way to start recovering my math skills. After many iterations, I'm now looking at F# to write the independent service that performs the analysis and calculations, and I'll just use something like MVC for the front end. Starting off with F# has been nothing short of earth-shattering; getting out of my object-oriented comfort zone is enlightening. I'm falling in love with Suave (featured above), a super-lightweight, functionally written web server. The HTML Type Provider also appears to be the bee's knees, as I'm figuring out how to scrape data sources with it, which means at some point I'll be able to build new and interesting data sources! Sadly I didn't get to most of my development wishlist from last year, but I'm starting fresh and hoping to stay sharp this year.
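The forecasting started as simply as it sounds. Here is a minimal sketch of the kind of naive baseline involved, in Ruby rather than the F# I'm moving toward, with illustrative names rather than InventoryForcast's actual code:

```ruby
# Naive sales forecast: predict the next period as the average of the
# last `window` periods. A deliberately simple statistical baseline.
def moving_average_forecast(sales, window: 3)
  recent = sales.last(window)
  recent.sum.to_f / recent.size
end

monthly_units = [120, 135, 128, 142, 150, 147]
# Forecast for next month: the mean of the last three (142, 150, 147).
puts moving_average_forecast(monthly_units)
```

Anyone with a statistics background can see exactly why this is laughable, but it's also exactly the sort of thing that makes recovering the math fun.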


A Multitude of Adventures

So my updates to the project I just started have been sparse so far, due to my trying to find time for it while on a honeymoon with my wife. I dare say I am pleased, however, with what I have gotten done so far: building a model of sales based on the AdventureWorks database. Browsing through the data (although I probably shouldn't be), I am impressed with the quality and wealth of test data they have crammed into it. It has made me rethink a few things, and I am now making a better quality application that would function as more than a proof of concept. I am more or less done with the ability to list and delete sales; I still have to test adding them, and then repeat the same process for the line items of sales. Once that is complete I'll have a dynamic warehouse and display for data, so I can start on a controller for the actual business analytics dashboard, which is the purpose of this exercise.

I had planned on spending my Sunday and Monday working on this project (along with unpacking and general recovery); however, as we were traveling back toward home my wife fell prey to appendicitis. She pulled through the surgery with flying colors, and only now do I find myself with enough time to stop worrying about her and instead sit on my computer while I watch her sleep. The trip has otherwise been magnificent, though exhausting in trying to see everything. Since this happened shortly after crossing the Canadian border, we are up in Buffalo and will sadly be making the trip home tomorrow, which hopefully won't give her too much discomfort.

Personal Interference, Sample Database, and Data Analysis

So this time I'll blame my personal life for taking up all my free time, as only now, on my honeymoon, do I seem to find time to update my blog. That aside, a lot has changed since my last post: I've started writing Chef cookbooks for work to provide a service-oriented architecture for our applications. We have extended the automation system with more deployment types, and I've been teaching my coworkers both its function and how to extend it. My coworkers are now all proficient enough to onboard an application, and we are extending our number of deployments weekly. I personally have about 20 applications in the pipeline at the moment (.NET and Java), all of which will be deployed using the system and Chef. I'm also hunting around for a modern SSO mechanism like OAuth or SAML, as most of our applications leverage custom solutions, which I would like to curb by offering intro courses to our developers.

In my personal life I've been fiddling with machine learning off and on, learning because it has been of great personal interest to me. I've since been inspired to create a business analytics dashboard oriented toward the retail space. I've started scaffolding a bit, thinking that I'd use WebAPI, SignalR and R.NET to run the numbers, though I may skip R.NET for the time being given that my immediate task doesn't require it. I've been given a methodology a business has been using for predicting their needed inventory stock, and they are hoping to extend it into a more real-time application that doesn't require manual updates.


I've torn the application apart three times now due to how I keep changing my thoughts on the exact execution, but I think things will come together soon as I settle on an ideal architecture. I grabbed an AdventureWorks sample database from Microsoft (http://msftdbprodsamples.codeplex.com/releases/view/55330), which should make for a good starting base for the sort of data I'll deal with. To make it near real time, I'm thinking I'll make the data analysis event-driven and store reporting calculations in the database. By doing so I shave off compounded calculation times, and I can later leverage a similar architecture when running larger data sets or more complex queries. A minimum viable product is the goal of this exercise, however, so I am cutting a few corners on the first run and will extend it in iterative releases.
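The event-driven idea boils down to updating stored aggregates as sale events arrive, rather than recomputing everything at report time. A rough in-memory sketch (names are illustrative, and the real store would be the database rather than a hash):

```ruby
# Each incoming sale event updates a running aggregate, so report
# reads become O(1) lookups instead of full recalculations.
class SalesAggregator
  def initialize
    @totals = Hash.new(0.0) # product_id => running revenue
  end

  # Called once per sale event.
  def record_sale(product_id, quantity, unit_price)
    @totals[product_id] += quantity * unit_price
  end

  def revenue_for(product_id)
    @totals[product_id]
  end
end

agg = SalesAggregator.new
agg.record_sale(:bike, 2, 500)
agg.record_sale(:bike, 1, 500)
puts agg.revenue_for(:bike) # => 1500.0
```

The same shape scales up later: swap the hash for database rows and the method call for an event handler, and the reporting dashboard only ever reads precomputed numbers.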

I’ll update more on the project, most likely this week once I’m done setting up my mobile development environment. (I’m traveling abroad at the moment, so I’m using my laptop which has traditionally been an implement for web browsing.)

Cooking all weekend

So I've wrapped up most of my Learning Chef book, and I have to say it was very good at getting me started. It took about two dedicated weekends to get through most everything in detail (including lab setup, etc.), which I think is pretty good. To be fair, I have used and played with Chef before in a Unix setting, but beyond writing recipes and simple tasks I've not done much. I've started translating the information over for Windows administration, and with the supplement of Chef's own documentation I've been doing pretty well. I've been fiddling with the IIS cookbook to try and configure and set up a Windows box, and instead of making my own box I've been using opentable/win-2012r2-standard-amd64-nocm from Vagrant's website. The box has been treating me well, though I've run into some issues with the IIS cookbook, which appear to stem from the base windows cookbook.

       [2015-08-30T10:42:43-07:00] WARN: Using an LWRP provider by its name (WindowsFeatureDism) directly is no longer supported in Chef 12 and will be removed.  Use Chef::ProviderResolver.new(node, resource, action) instead.

The fact that the base windows cookbook uses DISM for installing Windows features caught me off guard, and it appears not to support out-of-order installations or dependency resolution. The fact that DISM names and PowerShell names vary also caught me off guard, since I'm used to leveraging PowerShell for feature installation. Oddly, that isn't one of the supported providers the cookbook can leverage, so I've forked the cookbook to try and add a PowerShell provider. So if you take a look at the repo you'll see my first cookbook is windows! I've actually got several local, but I'm looking forward to trying to extend one by Opscode, as I think that will give me a way better understanding of making complex cookbooks. With the realization of the number of cookbooks I'll likely be writing, I made an organization for my cookbooks: https://github.com/wildbillcat-cookbooks which is where you can find my windows fork.
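On the recipe side, the goal is to keep the same resource and just swap the backing provider. A sketch of what usage might look like once the fork lands (`:windows_feature_powershell` is my assumed name for the new install method, not something the cookbook ships today):

```ruby
# Hypothetical usage of the forked windows cookbook: install IIS via
# PowerShell's Install-WindowsFeature instead of DISM, using the
# PowerShell feature name rather than the DISM one.
windows_feature 'Web-Server' do
  action :install
  install_method :windows_feature_powershell
end
```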

Tomorrow I'll hopefully get started implementing Chef at work, with my first project this week being an IIS web farm so we can use a shared hosting environment for our applications. I also plan on experimenting with Microsoft's Web Deploy to see if we can leverage Chef as part of our deployment workflow. Chef has put so much hope on the horizon that I'm pretty excited about the possibilities it unlocks for our application management. Combined with the deployment pipelines in TFS, we should be hitting a completely automated cloudy horizon some time this year. I look forward to seeing the contrast!

Starting From Scratch

Well, my AOCMDB project is going to fall by the wayside for a while. Work has had me overwhelmed the past few months with internal changes and a lot of moving targets. I've implemented a new orchestration system using MSBuild and PowerShell for deployment automation, which is very extensible, allowing for the addition of more deployment types with minimal effort. Since it is mostly PowerShell, we can move it to an alternative build system down the road, as I see Jenkins on our horizon.

It has been live for a few weeks now; onboarding applications to it has been a little bumpy, but I look forward to running reports later this year to see how many more deployments we perform, along with their average deployment time. I'm still extending the number and types of deployments it can do, and am looking into integrating it with Chef, which we are also currently implementing. By this time next year I don't think our setup will be recognizable compared to what it used to be, as I'll hopefully have driven all of our cloud initiatives into production, and our application developers will be focused on newer application architectures.

Presently for work, my primary focus has been our application deployment automation and Oracle database automation. However, I'm now focusing all of my free time on Chef, and to a lesser extent CoreOS. I'm hoping to set up a CoreOS cluster and start using Docker in production as well, with our first production service being Artifactory. I'm looking at setting it up with Fleet, but I think Kubernetes will be the end state.

With these going, I'm feeling like I'm falling out of touch with the current web stack, so I decided to start developing a small web application for managing and sharing e-books. I found that O'Reilly has a really nice sharing license on their ebooks (http://shop.oreilly.com/category/ebooks.do) which encourages letting my coworkers sneak peeks at my collection. Unfortunately I couldn't find anything to moderate a private ebook collection that you could host yourself and offer community lending with. This project will essentially be a bucket list of technologies I want to get up to speed on, and will hopefully serve as good instructional material in the future when the tech goes live. There are a lot of tech stacks I want to play with, so for my own reference I'm listing them here:

  • epub.js (http://futurepress.github.io/epub.js/)
  • Angular 2.0 (https://angular.io/)
  • ASP MVC 6 (http://www.asp.net/vnext/overview/aspnet-vnext)
  • TypeScript (Which Angular 2 is written in)
  • SignalR
  • BootStrap

Essentially I'm thinking of organizing it like a gallery of eBooks: users can add their own, then allow other users to "rent" them. Using Angular 2 I can make it an SPA (or at least portions of it), and using epub.js I can offer a web viewer. With SignalR I can monitor connections live, tracking each reader's connection to an e-book in order to optimize rental leases on epub files, and maintaining a timer on how long an inactive viewing lease can last. (I'm leaning toward 30 minutes.) That said, I could allow that to be set at a per-book level, with just a generic default, so users can customize it. This would also enable users to take back their books by force.
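The lease bookkeeping itself is simple. A rough Ruby sketch of the inactivity timer (the names and the injectable clock are my own illustration, not the eventual SignalR-side code):

```ruby
# A viewing lease expires after `timeout_minutes` with no activity;
# SignalR heartbeats or page turns would call `touch` to keep it alive.
class Lease
  def initialize(timeout_minutes: 30, clock: -> { Time.now })
    @timeout = timeout_minutes * 60 # seconds
    @clock = clock                  # injectable for testing
    @last_activity = @clock.call
  end

  def touch
    @last_activity = @clock.call
  end

  def expired?
    @clock.call - @last_activity > @timeout
  end
end
```

A per-book timeout is then just constructing each Lease with that book's configured value instead of the 30-minute default.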

I've been craving learning some machine learning stuff as well, so I'm thinking of trying out Apache Mahout. (http://mahout.apache.org/) Essentially it makes it easier to leverage a lot of machine learning techniques, including leveraging Hadoop and MapReduce. Considering the amount of data analysis we perform at work, I think in the future we could easily leverage this tech to more efficiently scale up our data crunching, at least once we have our cloud infrastructure automated.

It’s good to have my head above water again and working on my external interests. I’m excited to start the new project, although splitting my time between that, Chef and CoreOS/Docker will leave me wanting for time. But I’m excited for all of it, so hopefully I’ll make great strides!

Keeping Trendy

My session on MVC 5 & Angular went pretty well. It ended up being more of a show-and-tell of some of my code, and then a quick demonstration of how easy it is to implement a very simple application with Angular. Essentially I just tied the content of an input box to a div, but the fact that it takes all of 3 lines of code seemed to impress. The actual room that I do it in is quite nice, but the screen is terrible; it's old and has a 4:3 ratio that feels like 800×600 resolution, which had me collapsing toolbars left and right. Based on the impressions I got from the meeting, I've decided to split it into two courses. The first will be on MVC5 and EF6, to show how to get started developing web applications, since there are a large number of people that don't have web development backgrounds. The second will be on using Angular with SignalR in an MVC application, which was the big interest from the crowd. I am hoping to orient both as crash courses that show how quickly you can get up and running with the technology, to make the appeal of using it obvious. There are a lot of sessions I want to do for the benefit of developers; I'm thinking of doing one on event-driven programming next, to show how to leverage delegates to create a subscriber model. Functional programming is also on my to-do list, so I'm thinking of fiddling with both F# and Node.js to see which I think would be easier to pitch in one of these sessions.
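For reference, the whole input-to-div demo really is just a few lines of markup once angular.js is loaded. This is a sketch from memory (the CDN version is illustrative), not the exact session code:

```html
<!-- AngularJS two-way binding: whatever is typed appears in the div -->
<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.3.15/angular.min.js"></script>
<div ng-app>
  <input type="text" ng-model="message" placeholder="Type here">
  <div>{{message}}</div>
</div>
```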

Outside of my crusade to get our developers to make more modern applications, I've been working on integrating our custom TFS workflows with Microsoft Release Management. I've been leveraging the REST API and have us set up to be solely a vNext deployment shop. To get started I made use of the release template and PowerShell script made available by Microsoft (http://blogs.msdn.com/b/visualstudioalm/archive/2014/10/10/trigger-release-from-build-with-release-management-for-visual-studio-2013-update-3.aspx) but ran into an interesting issue: the XAML by default will trigger a release regardless of whether or not the compile step succeeds, which I hadn't thought to check before using it on a test environment. Granted, since it was a test environment it was hardly a serious issue, but it did cause me to modify the script to detect whether the compile had failed before allowing it to continue. I stashed my script here (https://github.com/wildbillcat/PowershellAutomation/blob/master/ReleaseManagment/InitiateReleaseFromBuild.ps1) for my own reference, and for anyone else who may be implementing vNext templates as well.

We have a few contractors from Incycle Technology working for us who complimented me on having our setup completely vNext, which made me feel great. In that same vein, I'm thinking of scrapping my idea of setting up Azure Pack, and instead setting up Server 2016 running Azure Stack, since launching new services seems to take so long. I want to keep the configuration management automated in some way, so I'm thinking of finally getting around to brushing up on Chef this weekend. That way I can quickly spin up and tear down the new environment, so when the next version of Server actually hits RTM I can quickly spin it up, have the infosec guys start tearing it apart, and go live as fast as possible. I also want to get up to speed on Chef as we may start leveraging it for Release Management as an alternative to PowerShell, since the RM software supports it, and I've got my fingers crossed that our configuration management will be done with it. It's kind of funny being a Microsoft fan lately, as more and more it seems like Microsoft is attempting to pitch itself as the new Sun Microsystems. The more Oracle closes down Java, the more Microsoft tries to position .NET as the obvious alternative. Since .NET 5 is open source, I'm all for it: if something goes awry you can always defect to Node, Ruby, or Python, but all of those come with a steeper curve for a Java dev.

A Coursera course on financial analysis has caught my eye, so I thought I'd give it a try to see if I might improve my overall knowledge of the field I am serving. (https://www.coursera.org/course/financialanalysis) It's been a while since I've done something on Coursera, and I've noticed they are now really pushing to sell the certificates. I'm tempted, if only because it appears to come with a capstone project to apply the knowledge, which seems interesting. But at the same time, saving my money for something like my Security+ certification (likely the next certification I'll go out for) seems more alluring, along with F# Deep Dives, published by Manning. Granted, this just comes from my frugality; if I had more cash to burn I'd likely just buy in, no questions asked, as the cost is nominal. In the same learning vein, I also started a contest at work to try and promote sharing of knowledge around PowerShell, since all of our technology teams use it. I am offering small 3D-printed toys as prizes, with the hope that it spurs people to collaborate and share, if only a little bit. I'm trying to promote using Pester, the BDD testing framework, good code commenting, and designing modular/reusable code through functions and parameters.


This week at work has been pretty crazy; I've been scraping together our Release Management environment to integrate it with our TFS environment. My coworker and I have come to an agreement on how we think it should be implemented, which means we might actually start getting deployments up and running inside the next month! I've also been doing a fair amount of programming to automate our SQL deployments. I've completed the component that verifies the contents of the package to ensure nothing in there should be flagged for review, and I'm now onto the portion that does the heavy lifting.

Aside from that, I've decided to start giving a monthly segment at work on new technologies to help improve our developers' skill sets. I'm starting with MVC5 and Angular.js web applications, with the hope that it will inspire our developers to get rid of the client footprint. Our developers have a lot of applications that use grids to view and edit information, so I'm thinking of using the MVC Grid, and then an example using NG Grid. I'm likely going to start working it up this weekend, leveraging Entity Framework. I'm considering trying to connect to an Oracle DB, and perhaps giving samples with SignalR and D3.js if I can find the time. In the meantime I'll probably whip up an Azure environment to start testing with a new GitHub repo.

Microsoft's Build conference has also been going on, which I've been watching. The event has been pretty cool, the highlight in my opinion being the HoloLens. I can see myself living in a near-empty apartment with just a bed and a couch, then building the rest of my environment virtually. Given that I can just stretch a video across my wall, there's no need for a TV. Attach photos to the walls, plants and organic videos throughout, and I can have a view even in the basement. Docker and .NET vNext look great for both development features and deployment, although explaining the benefits of Docker for our business use may be a battle. But paired with Azure Pack, I have high hopes for our environment in the future. I look forward to catching up on MSDN and Build this weekend, since there's so much going on I can hardly process it all.

Now, just to take a step back from the frothing fanboy act, there were a few notable things that were also worrying. The machine learning (while technically speaking very cool) is also that sort of terrifying where statistical evaluation of habits results in what feels like a violation of privacy. Granted, we all aren't that unique, so perhaps it's just the angst of having to admit that which inspires my discomfort. The second thing was the carrier billing across all platforms, which I'm sure will lead to all sorts of issues as kids run up bills without a credit card.

REBOOT: Windows 10 Technical Preview (x64) and Visual Studio 2015 CTP

So it's been a while since I've updated my blog, having gotten caught up with a lot of things at work and in my personal life. But having now righted the ship, I'm back on track giving bland reviews of what I'm doing with my time. In this vein I also rethought what I'm doing with this blog: trying to actively document my technical activities as tutorials is way too burdensome to be realistic, along with being potentially condescending to anyone who reads through this looking for information, given you probably have some idea of what you're doing.

So while I reboot this blog, I've also decided to redo one of my computing environments! (Hence the title, surprise surprise.) I've started using Visual Studio 2015 CTP, which has been looking brilliant, and have set up a new project (Application Oriented Configuration Management Database) to play around in it. As can be seen from my code history, I was also trying out ASP.NET 5 (MVC 6) with Entity Framework 7. Overall it looks great and similar in function, but I spent so much time on small nuances that it was proving unproductive overall, considering I'm hoping to have this project live in a production environment this year. The straw that broke the camel's back was my fight to get code migrations working, so I scrapped it for now and moved to a classic MVC 5 with Entity Framework 6.

Visual Studio 2015 itself has been great. I love the new NuGet explorer; I will definitely be getting the developers at my workplace onto it as soon as possible. Easily browsing versions, updates, etc. is fantastic. This is especially useful in our enterprise environment, as we run an internal NuGet feed and have many version-dependent applications. Most of our developers are already irked at being moved to NuGet in general, and making them handle things using the NuGet command line in order to get required DLL versions has been a major adoption sticking point.

My next favorite feature has been the suggestion light bulb, which pops up everywhere. This was especially handy moving my code from MVC6/EF7 to MVC5/EF6, as it kept suggesting where I needed to change references, etc. It does seem to throw more errors about references, which has been somewhat perplexing, as I'll have something appropriately referenced with NuGet but the error will still pop up until I rebuild a second time.

I'm about to start the install of Windows 10, so I'll be signing off here. But before I do, one thing I wanted to also note was the excellent NewHaven.IO meetup on pair programming I went to last Thursday. (http://www.meetup.com/newhavenio/events/220445501/) It was excellently run and really inspiring, even though the entire stack was Ruby instead of Windows. But seeing how effectively it worked for them was great, and I hope we can get similar initiatives started at AIG. My thinking is that with the advent of ASP.NET 5 I can launch an educational campaign at work for both new technologies and practices, to get us using technologies that play nicer with DevOps deployments.

Getting Started!

So it's time to get started using the lab. I assume you are using the same layout as my second post on lab creation, Moving Target. My development machine is actually a Windows 7 virtual machine with RSAT, MMF 4.0, and Visual Studio Ultimate. Being a virtual machine isn't actually important; I personally use Windows 8.1, but to more closely match my present work environment I wanted to make sure I was running Windows 7, so my experience would more easily translate into my professional life. One other configuration note I didn't realize at first: install your copy of Visual Studio onto the build controller. This gives the build controller the correct software with which to compile whatever you make, which will save you from troubleshooting why it doesn't build in the beginning.

I've been working the past few days on getting my web application, TFSWorkItemTracker, up and running. It's pretty cool so far: I've implemented SignalR with AngularJS on an MVC5 application for near-real-time updates on a TFS server. I'll later be using this application for an automated IIS deployment, so that I can push this application from development to production. I'll note that since I have more experience as a Git user, I am using Git as my version control in these exercises.

Let's get started! I would tell you to head to your empty TFS server and make a new Team Project to keep our code in. But in TFS 2013 Update 4 there is no New button for project collections, even if you hit the gear in the top right corner to bring up the TFS administration page. Instead, you have to use Visual Studio, so let's do just that!

Open the Team Explorer and hit the plug to manage your TFS connections: [1-TeamExplorer] Click Servers to manage the TFS server list: [2-TeamExplorer] Click Add and enter the URL of your TFS server. The Add Server dialog will have all the default settings for TFS prefilled; if you customized the path or port number, be sure to correct it accordingly. [3-TeamExplorer] [4-TeamExplorer] Hit OK and close the dialogs. Now that the connection to TFS is set up, go to File->New->Team Project… and enter a name for your Team Project and hit Next. If you are following along exactly, use TFSWorkItemTracker. I'm a Scrum fan, so I stuck with that process template, but you can use whatever works for you (or in the future we may make one). [2-TeamProject] Then select Git as your version control system (again, to match what I am doing) and hit Finish to skip the confirmation page.

[3-TeamProject] Visual Studio will then work on fabricating your project, usually taking a few moments in my experience; if you have SharePoint integration this could take longer. [4-TeamProject] Now hit the plug to get back to the connections page: [1-TeamExplorer2] Click Clone under Local Git Repositories and enter the path to a Git repo you want to work on (I selected my own), then click Clone. [2-TeamExplorer2] Once it is done cloning, double-click the project under Local Repositories: [3-TeamExplorer2] Click Changes, then select Open Command Line: [4-TeamExplorer2] In the command line, add the TFS server to the Git remotes and push the repo up onto the server.


Now click the plug and open the TFS project: [6-TeamExplorer2]

Finally, hit Clone This Repository and you're ready to start building and deploying it. [7-TeamExplorer2] [8-TeamExplorer2] Now, doing all the screenshots for all this was pretty tedious, so I think in future installments I'm going to rethink the format. Stay tuned.

Moving Target

So my neat little phases instead turned into a two-phase project for lab construction, with Phase 1 containing 7 virtual machines, configured as follows:

  • TestAD01
    • Server 2012 R2 Datacenter, 1024MB RAM, 60GB Hard Drive
    • Active Directory / DNS Roles (Domain: “test.internal”)
  • SQLServer
    • Server 2012 R2 Datacenter, 1024MB RAM, 60GB Hard Drive
    • SQL Server 2012 Enterprise SP2 x64 (For TFS Data Tier)
  • TFS2013
    • Server 2012 R2 Datacenter, 2048MB RAM, 124GB Hard Drive
    • Team Foundation Server 2013 Update 4
  • TFSBuild
    • Server 2012 R2 Datacenter, 1024MB RAM, 124GB Hard Drive
    • Team Foundation Server Build Service
  • RMServer
    • Server 2012 R2 Datacenter, 1024MB RAM, 127GB Hard Drive
    • Release Management for Visual Studio 2013 with Update 4
  • SQLServer2
    • Server 2012 R2 Datacenter, 1024MB RAM, 60GB Hard Drive
    • SQL Server 2014
  • IISServer
    • Server 2012 R2 Datacenter, 1024MB RAM, 60GB Hard Drive
    • No Roles (Eventually this will have the IIS role installed; however, I am looking to see whether, using DSC and the RM server, we can automate the configuration of the VM)

All the VMs are practically ready to go, with just updates now running! This weekend I can start making some build definitions and configuring the Release Management server. I'll document the various means of build automation, etc. in the coming weeks as I come to each one. I'm not going to bother describing step by step how to set up the lab, since it's all straightforward, but I will note that my Team Foundation Server 2013 instance has both Reporting and SharePoint disabled, since they aren't needed for continuous integration.

Phase two likely won't happen this weekend, since this already gives me a lot to work with. However, I presently imagine I'll be looking at setting up a Chef server to use for configuring Linux VMs with Release Management, and a NuGet server (with deployments to it automated through Release Management). There will also likely be a lot of other deployment scenarios that I'll think of as I get further entrenched at work and start rolling up my sleeves.