@Shazwazza

Shannon Deminick's blog all about web development

Examine and Azure Blob Storage

February 11, 2020 04:52

Quite some time ago - probably close to 2 years - I created an alpha version of an extension library to Examine to allow storing Lucene indexes in Blob Storage called Examine.AzureDirectory. This idea isn’t new at all and in fact there’s been a library to do this for many years called AzureDirectory, but it previously had issues and it wasn’t clear exactly what its limitations were. The Examine.AzureDirectory implementation was built using a lot of the original code of AzureDirectory but has a bunch of fixes (which I contributed back to the project) and different ways of working with the data. Also, since Examine 0.1.90 still worked with Lucene 2.x, this library had to be made compatible with the older Lucene version.

… And 2 years later, I’ve actually released a real version 🎉

Why is this needed?

There are a couple of reasons. Firstly, Azure web app storage runs on a network share and Lucene absolutely does not like its files hosted on a network share; this brings all sorts of strange performance issues among other things. The way AzureDirectory works is to store the ‘master’ index in Blob Storage and then sync the required Lucene files to the local ‘fast drive’. In Azure web apps there are two drives: the ‘slow drive’ (the network share) and the ‘fast drive’ which is the local server’s temp storage with limited space. By syncing the Lucene files to the local fast drive it means that Lucene is no longer operating over a network share. When writes occur, it writes back to the local fast drive and then pushes those changes back to the master index in Blob Storage. This isn’t the only way to overcome this limitation of Lucene; in fact Examine has shipped a workaround for many years which uses something called SyncDirectory that does more or less the same thing, but instead of storing the master index in Blob Storage, the master index is just stored on the ‘slow drive’. Someone has actually taken this code and made a separate standalone project with this logic called SyncDirectory which is pretty cool!
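To make the ‘fast drive’ idea concrete, here’s a minimal sketch (this is not the Examine.AzureDirectory implementation itself, just the local half that the Blob Storage sync builds on) of opening a Lucene index from the machine-local temp storage instead of the network share:

// A minimal sketch, assuming Lucene.Net 2.x/3.x APIs: open the index from the
// machine-local temp path ('fast drive') rather than the network share.
using Lucene.Net.Store;

public static class FastDriveExample
{
    public static FSDirectory OpenLocalIndex(string indexName)
    {
        // %TMP% on an Azure web app points at the local (non-shared) disk
        var localPath = System.IO.Path.Combine(
            System.IO.Path.GetTempPath(), "ExamineIndexes", indexName);
        System.IO.Directory.CreateDirectory(localPath);
        return FSDirectory.Open(new System.IO.DirectoryInfo(localPath));
    }
}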

Load balancing/Scaling

There’s a couple of ways to work around the network share storage in Azure web apps (as above), but in my opinion the main reason why this is important is for load balancing and being able to scale out. Since Lucene doesn’t work well over a network share, it means that Lucene files must exist local to the process it’s running in. That means that when you are load balancing or scaling out, each server that is handling requests will have its own local Lucene index. So what happens when you scale out further and another new worker goes online? This really depends on the hosting application… for example in Umbraco, this would mean that the new worker will create its own local indexes by rebuilding the indexes from the source data (i.e. the database). This isn’t an ideal scenario, especially in Umbraco v7 where requests won’t be served until the index is built and ready. A better scenario is that the new worker comes online and then syncs an existing index from master storage that is shared between all workers … yes! like Blob Storage.

Read/Write vs Read only

Lucene can’t be written to concurrently by multiple processes. There are some workarounds here and there to try to achieve this by synchronizing processes with named mutex/semaphore locks, and even AzureSearch tries to handle some of this by utilizing Blob Storage leases, but it’s not a seamless experience. This is one of the reasons why Umbraco requires a ‘master’ web app for writing and a separate web app for scaling, which guarantees that only one process writes to the indexes. This is the setup that Examine.AzureDirectory supports too, and on the front-end/replica/slave web app that scales you will configure the provider to be read-only, which guarantees it will never try to write back to the (probably locked) Blob Storage.
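As a purely illustrative aside (this is not something Examine.AzureDirectory does), the ‘named mutex’ style of cross-process synchronization mentioned above looks roughly like this:

// Illustrative sketch only: a machine-wide named mutex so that only one process
// on the server writes to a shared Lucene index at a time.
using System;
using System.Threading;

public class IndexWriteLock
{
    public static void WriteWithLock(Action writeToIndex)
    {
        using (var mutex = new Mutex(initiallyOwned: false, name: @"Global\MyLuceneIndexWriter"))
        {
            if (!mutex.WaitOne(TimeSpan.FromSeconds(10)))
                throw new TimeoutException("Another process holds the index write lock.");
            try
            {
                writeToIndex(); // ... perform the Lucene writes here ...
            }
            finally
            {
                mutex.ReleaseMutex();
            }
        }
    }
}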

With this in place, when a new front-end worker goes online it doesn’t need to rebuild its own local indexes; it will just check that indexes exist (by making sure the master index is there) and then continue booting. At this stage there’s actually almost no performance overhead. Nothing actually happens with the local indexes until an index is referenced by this worker, and when that happens Examine will lazily sync just the Lucene files that it needs locally.

How do I get it?

First thing to point out is that this first release is only for Examine 0.1.90 which is for Umbraco v7. Support for Examine 1.x and Umbraco 8.x will come out very soon with some slightly different install instructions.

The release notes of this are here, the install docs are here, and the Nuget package for this can be found here.

PM> Install-Package Examine.AzureDirectory -Version 0.1.90

To activate it, you need to add these settings to your web.config:

<add key="examine:AzureStorageConnString" value="YOUR-STORAGE-CONNECTION-STRING" />
<add key="examine:AzureStorageContainer" value="YOUR-CONTAINER-NAME" />

Then for your master server/web app you’ll want to add a directoryFactory attribute to each of your indexers in ExamineSettings.config, for example:

<add name="InternalIndexer" type="UmbracoExamine.UmbracoContentIndexer, UmbracoExamine"
      supportUnpublished="true"
      supportProtected="true"
      directoryFactory="Examine.AzureDirectory.AzureDirectoryFactory, Examine.AzureDirectory"
      analyzer="Lucene.Net.Analysis.WhitespaceAnalyzer, Lucene.Net"/>

For your front-end/replica/slave server you’ll want to use the read-only directoryFactory instead, like:

<add name="InternalIndexer" type="UmbracoExamine.UmbracoContentIndexer, UmbracoExamine"
      supportUnpublished="true"
      supportProtected="true"
      directoryFactory="Examine.AzureDirectory.ReadOnlyAzureDirectoryFactory, Examine.AzureDirectory"
      analyzer="Lucene.Net.Analysis.WhitespaceAnalyzer, Lucene.Net"/>

Does it work?

Great question :) With the testing that I’ve done it works and I’ve had this running on this site for all of last year without issue but I haven’t rigorously tested this at scale with high traffic sites, etc… I’ve decided to release a real version of this because having this as an alpha/proof of concept means that nobody will test or use it. So now hopefully a few of you will give this a whirl and let everyone know how it goes. Any bugs can be submitted to the Examine repo.

 

 

Web Application projects with Umbraco Cloud

January 8, 2020 05:12

This is a common topic for developers when working with Umbraco Cloud because Umbraco Cloud simply hosts an ASP.Net Framework “Website”. The setup is quite simple: a website is stored in a Git repository and when it’s updated and pushed to Umbraco Cloud, all of the changes are live. You can think of this Git repository as a deployment repository (which is very similar to how Azure Web Apps can work with git deployments). When you create a new Umbraco Cloud site, the git repository will be pre-populated with a runnable website. You can clone the website and run it locally with IIS Express and it all just works. But this is not a compile-able website, it’s not part of a Visual Studio project or solution, and if you want that there are numerous workarounds that people have tried and used, but in my personal opinion they aren’t the ideal working setup that I would like.

Ideal solution

In my opinion the ideal solution for building web apps in .NET Framework is:

  • A visual studio solution
    • A compile-able Web Application project (.csproj)
    • Additional class library projects (as needed)
    • Unit/Integration test projects (as needed)
    • All dependencies are managed via Nuget
  • Git source control for my code, probably stored in GitHub
  • A build server, CI/CD, I like Azure Pipelines

I think this is a pretty standard setup for building websites but trying to wrangle this setup to work with Umbraco Cloud isn’t as easy as you’d think. A wonderful Umbraco community member, Paul Sterling, has written about how to do this a couple of times, here and here, and there are certainly a few hoops you’d need to jump through. These posts were also written before the age of Azure YAML Pipelines which luckily has made this process a whole lot easier.

Solution setup

NOTE: This is for Umbraco v8; there are probably some other edge cases you’ll need to discover on your own for v7.

Setting up a Visual Studio solution with a web application compatible with Umbraco Cloud is pretty straightforward and should be very familiar. It will be much easier to do this starting from scratch with a new Umbraco Cloud website, though it is more than possible to do this for an existing website (i.e. I did this for this website!); most of those details are just migrating custom code, assets, etc… to your new solution.

I would suggest starting with a new Umbraco Cloud site that has no modifications to it but does have a content item or two that renders a template.

  • Create a new VS solution/project for a web application running .NET 4.7.2
  • Add this Nuget.config to the root folder (beside your .sln file)
    • <?xml version="1.0" encoding="utf-8"?>
      <configuration>
        <packageSources>
          <add key="NuGet" value="https://api.nuget.org/v3/index.json" />
          <add key="UmbracoCloud" value="https://www.myget.org/F/uaas/api/v3/index.json" />
        </packageSources>
      </configuration>
  • Install the Nuget package for the same Umbraco version that you are currently running on your Umbraco Cloud website. For example, if you are running 8.4.0 then use Install-Package UmbracoCms -Version 8.4.0
  • Install Forms (generally the latest available): Install-Package UmbracoForms
  • Install Deploy (generally the latest available):
    • Install-Package UmbracoDeploy
    • Install-Package UmbracoDeploy.Forms
    • Install-Package UmbracoDeploy.Contrib
  • Then you’ll need to install some additional Nuget packages that are required to run your site on Umbraco Cloud. This is undocumented, but Umbraco Cloud adds a couple of extra DLLs that are required when it creates a website.
    • Install-Package Serilog.Sinks.MSSqlServer -Version 5.1.3-dev-00232
  • Copy these files from your Umbraco Cloud deployment repository to your web application project:
    • ~/data/*
    • ~/config/UmbracoDeploy.config
    • ~/config/UmbracoDeploy.Settings.config
  • You then need to do something weird. These settings need to be filled in because Umbraco Deploy basically circumvents the normal Umbraco installation procedure and if you don’t have these settings populated you will get YSODs and things won’t work.
    • Make sure that you have your Umbraco version specified in your web.config like: <add key="Umbraco.Core.ConfigurationStatus" value="YOURVERSIONGOESHERE" />
    • Make sure your connectionStrings in your web.config is this:
      • <connectionStrings>
            <remove name="umbracoDbDSN" />
            <add name="umbracoDbDSN"
                 connectionString="Data Source=|DataDirectory|\Umbraco.sdf"
                 providerName="System.Data.SqlServerCe.4.0" />
        </connectionStrings>

But I don’t want to use SqlCE! Why do I need that connection string? In actual fact Umbraco Deploy will configure your web application to use SQL Express LocalDB if it’s available on your machine (which it most likely is). This is why when running Umbraco Cloud sites locally you’ll see .mdf and .ldf files in your App_Data folder instead of SqlCE files. LocalDB operates just like SQL Server except the files are located locally; it’s really SQL Server under the hood. You can even use SQL Server Management Studio to look at these databases by connecting to the (localdb)\umbraco server locally with Windows Authentication. It is possible to have your local site run off of a normal SQL Server database with a real connection string, but I think you’d have to install Umbraco first before you install the UmbracoDeploy nuget package. Ideally UmbracoDeploy would allow the normal install screen to run if there was no Umbraco version detected in the web.config, but that’s a whole other story.
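If you want to poke at that database from code rather than Management Studio, a quick sanity check looks something like this (a sketch assuming the (localdb)\umbraco instance name mentioned above and the System.Data.SqlClient client):

// A minimal sketch: connect to the LocalDB instance that Umbraco Deploy uses with
// Windows Authentication and list its databases, just to prove it's regular SQL Server.
using System;
using System.Data.SqlClient;

public class LocalDbCheck
{
    public static void Main()
    {
        using (var conn = new SqlConnection(@"Data Source=(localdb)\umbraco;Integrated Security=True"))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT name FROM sys.databases", conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        }
    }
}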

That should be it! In theory your web application is now configured to be able to publish a website output that is the same as what is on Umbraco Cloud.

Installation

At this stage you should be able to run your solution; it will show the typical Umbraco Deploy screen to restore from Cloud.

image

In theory you should be able to restore your website and everything should ‘just work’

Working with code

Working with your code is now just the way you’re probably used to working. Now that you’ve got a proper Visual Studio solution with a Web Application Project, you can do all of the development that you are used to. You can add class libraries, unit test projects, etc… Then you commit all of these changes to your own source control like GitHub. This type of repository is not a deployment repository, this is a source code repository.

How do I get this to Umbraco Cloud?

So far there’s nothing too special going on but now we need to figure out how to get our Web Application Project to be deployed to Umbraco Cloud.

There’s a couple ways to do this, the first way is surprisingly simple:

  • Right click your web application project in VS
  • Click Publish
  • Choose Folder as a publish target
  • Select your cloned Umbraco Cloud project location
  • Click Advanced and choose “Exclude files from the App_Data folder”
  • Click Create Profile
  • Click Publish – you’ve just published a web application project to a website
  • Push these changes to Umbraco Cloud

The publish profile result created should match this one: https://github.com/umbraco/vsts-uaas-deploy-task/blob/master/PublishProfiles/ToFileSys.pubxml

This of course requires some manual work but if you’re ok with that then job done!

You should do this anyway before continuing since it will give you an idea of how in sync your web application project’s output is with the Umbraco Cloud website; you can then see what Git changes have been made and troubleshoot anything that might seem odd.

Azure Pipelines

I’m all for automation so instead I want Azure Pipelines to do my work. This is what I want to happen:

  • Whenever I commit to my source code repo Azure Pipelines will:
    • Build my solution
    • Run any unit tests that I have
    • Publish my web application project to a website
    • Zip the website
    • Publish my zipped website artifact
  • When I add a “release-*” tag to a commit I want Azure Pipelines to do all of the above and also:
    • Clone my Umbraco Cloud repository
    • Unzip my website artifact onto this cloned destination
    • Commit these changes to the Umbraco Cloud deployment repository
    • Push this commit to Umbraco Cloud

Luckily this work is all done for you :) and with YAML pipelines it’s fairly straight forward. Here’s how:

  • Go copy this PowerShell file and commit it to the /build folder of your source code repository (our friends Paul Sterling and Morten Christensen had previously done this work, thanks guys!). This PS script essentially does all of that Git work mentioned above: the cloning, committing and pushing of files. It’s a bit more verbose than just running these git commands directly in your YAML file but it’s also a lot less error prone and handles character encoding properly along with piping the output of the git command to the log.
  • Go copy this azure-pipelines.yml file and commit it to the root of your git source code repository. This file contains a bunch of helpful notes so you know what it’s doing. (This pipeline file does not run any tests, etc… that exercise will be left up to you.)
  • In Azure Pipelines, create a new pipeline, choose your Git source control option, choose “Existing Azure Pipelines YAML file”, select azure-pipelines.yml file in the drop down, click continue.
  • Click Variables and add these 3:
    • gitAddress = The full Git https endpoint for your Dev environment on Umbraco Cloud
    • gitUsername = Your Umbraco Cloud email address
    • gitPassword = Your Umbraco Cloud password - ensure this value is set to Secret
  • Click Run!

And that’s it! … Really? … In theory yes :)

Your pipeline should run and build your solution. The latest commit you made is probably the azure-pipelines.yml file, which didn’t contain a release-* tag, so it’s not going to attempt to push any changes to Umbraco Cloud. So the first thing to do is make sure that your pipeline is building your solution and doing what it’s supposed to. Once that’s all good then it’s time to test an Umbraco Cloud deployment.

Deploying to Umbraco Cloud

A quick and easy test would be to change the output of a template so you can visibly see the change pushed.

  • Go ahead and make a change to your home page template
  • Run your site locally with your web application project and make sure the change is visible there
  • Commit this change to your source control Git repository
  • Create and push a release tag on this commit. For example, the tag name could be: “release-v1.0.0-beta01” … whatever suits your needs, but based on the YAML script it needs to start with “release-“

Now you can sit back and watch Azure Pipelines build your solution and push it to Umbraco Cloud. Since this is a multi-stage pipeline, the result will look like:

image

And you should see a log output like this on the Deploy stage

image

Whoohoo! Automated deployments to Umbraco Cloud using Web Application Projects.

What about auto-upgrades?

All we’ve talked about so far is a one-way push to Umbraco Cloud but one thing we know and love about Umbraco Cloud is the automated upgrade process. So how do we deal with that? I actually have this working on my site but want to make the process even simpler so you’re going to have to be patient and wait for another blog post :)

The way it works is also using Azure Pipelines. Using a separate pipeline with a custom Git repo pointed at your Umbraco Cloud repository, this pipeline can be configured to poll for changes every day (or more often if you like). It then checks if changes have been made to the packages.config file to see if there have been upgrades made to either the CMS, Forms or Deploy (in another solution I’m actually polling Nuget directly for this information). If an upgrade has been made, it clones down your source code repository and runs a Nuget update command to upgrade your solution. Then it creates a new branch, commits these changes, pushes it back to GitHub and creates a Pull Request (currently this only works for GitHub).
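To give a rough idea of the kind of check involved (a sketch, not the actual pipeline code – the file path and the stored ‘last known version’ are assumptions for illustration), detecting a CMS upgrade from packages.config could look like this:

// A minimal sketch: read the UmbracoCms version from packages.config and compare it
// against the last version we saw, to decide whether Umbraco Cloud has auto-upgraded.
using System;
using System.Linq;
using System.Xml.Linq;

public class UpgradeCheck
{
    public static void Main()
    {
        var doc = XDocument.Load("packages.config");
        var cmsVersion = doc.Descendants("package")
            .Where(p => (string)p.Attribute("id") == "UmbracoCms")
            .Select(p => (string)p.Attribute("version"))
            .FirstOrDefault();

        var lastKnownVersion = "8.4.0"; // hypothetical value persisted between pipeline runs
        if (cmsVersion != null && cmsVersion != lastKnownVersion)
            Console.WriteLine($"CMS upgraded to {cmsVersion} - update the solution and create a PR.");
    }
}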

This same solution can be used for Deploy files in case administrators are changing schema items directly on Umbraco Cloud so the /deploy/* files can be automatically kept in sync with your source code repository.

This idea is entirely inspired by Morten Christensen, thanks Morten! Hopefully I’ll find some time to finalize this.

Stay tuned!

How I upgraded my site to Umbraco 8 on Umbraco Cloud

November 12, 2019 06:33

I have a Development site and a Live site on Umbraco Cloud. You might have some additional environments but in theory these steps should be more or less the same. This is just a guide that hopefully helps you; it’s by no means a fail-safe step-by-step guide, and you’ll probably run into some other issues and edge cases that I didn’t. You will also need access to Kudu since you will most likely need to delete some leftover files manually, you will probably also need to toggle the debug and custom errors settings in your web.config to debug any YSODs you get along the way, you will need to manually change the Umbraco version number in the web.config during the upgrade process, and you might need access to the Live Git repository endpoint in case you need to roll back.

… Good luck!

Make sure you can upgrade

Make sure you have no Obsolete data types

You cannot upgrade to v8 if you have any data types referencing old obsolete property editors. You will first need to migrate any properties using these to the non-obsolete version of these property editors. You should do this on your Dev (or lowest environment): Go to each data type and check if the property editor listed there is prefixed with the term “Obsolete”. If it is you will need to change this to a non-obsolete property editor. In some cases this might be tricky, for others it might be an easy switch. For example, I’m pretty sure you can switch from the Obsolete Content Picker to the normal Content Picker. Luckily for me I didn’t have any data relying on these old editors so I could just delete these data types.

Make sure you aren’t using incompatible packages

If you are using packages, make sure that any packages you are using also have a working v8 version of that package.

Make sure you aren’t using legacy technology

If you are using XSLT, master pages, user controls or other weird webforms things, you are going to need to migrate all of that to MVC before you can continue since Umbraco 8 doesn’t support any of these things.

Ensure all sites are in sync

It’s very important that all Cloud environments are in sync with all of your latest code and that there’s no outstanding code that needs to be shipped between them. Then you need to make sure that all content and media are the same across your environments since each one will eventually be upgraded independently and you want to use your Dev site for testing along with pulling content/media to your local machine.

Clone locally, sync & backup

Once all of your cloud sites are in sync, you’ll need to clone locally – I would advise starting with a fresh clone. Then restore all of your content and media and ensure your site runs locally on your computer. Once that’s all running and your site is basically running and operating like your live site, you’ll want to take a backup. This is just for peace of mind; when upgrading your actual live site you aren’t going to lose any data. To do this, close VS Code (or whatever tool you use to run your local site) and navigate to ~/App_Data/ and you’ll see the Umbraco.mdf and Umbraco_log.ldf files. Make copies of those and put them someplace. Also make a zip of your ~/media folder.

Now to make things easy in case you need to start again, make a copy of this entire site folder which you can use for the work in progress/upgrade/migration. If you ever need to start again, you can just delete this copied wip folder and re-copy the original.

Create/clone a new v8 Cloud site

This can be a trial site, it’s just a site purely to be able to clone so we can copy some files over from it. Once you’ve cloned locally feel free to delete that project.

Update local site files

Many people will be using a Visual Studio web application solution with Nuget, etc… In fact I am too, but for this migration it turned out to be simpler in my case to just upgrade/migrate the cloned website.

Next, I deleted all of the old files:

  • The entire /bin directory – we’ll put this back together with only the required parts; we can’t have any old leftover DLLs hanging around
  • /Config
  • /App_Plugins/UmbracoForms, /App_Plugins/Deploy, /App_Plugins/DiploTraceLogViewer
  • /Umbraco & /Umbraco_Client
  • Old tech folders - /Xslt, /Masterpages, /UserControls, /App_Browsers
  • /web.config

If you use App_Code, then for now rename this to something else. You will probably have to refactor some of the code in there to get it working, and for now the goal is to just get the site up and running and the database upgraded. So rename it to _App_Code or whatever you like so long as it’s different.

Copy over the files from the cloned v8 site:

  • /bin
  • /Config
  • /App_Plugins/UmbracoForms, /App_Plugins/Deploy
  • /Umbraco
  • /Views/Partials/Grid, /Views/MacroPartials, /Views/Partials/Forms – overwrite existing files, these are updated Forms and Umbraco files
  • /web.config

Merge any custom config

Create a git commit before continuing.

Now there’s some manual updates involved. You may have had some custom configuration in some of the /Config/* files and in your /web.config file. So it’s time to have a look in your git history. That last commit you just made will show all of the changes overwritten in any /config files and your web.config file so now you can copy any changes you want to maintain back to these files. Things like custom appSettings, etc…

One very important setting is the Umbraco.Core.ConfigurationStatus appSetting, you must change this to your previous v7 version so the upgrader knows it needs to upgrade and from where.

Upgrade the database

Create a git commit before continuing.

At this stage, you have all of the Umbraco files, config files and binary files needed to run Umbraco v8 based on the version that was provided to you from your cloned Cloud site. So go ahead and try to run the website; with any luck it will run and you will be prompted to login and upgrade. If not and you have some YSODs or something, then the only advice I can offer at this stage is to debug the error.

Now run the upgrader – this might also require a bit of luck and depends on what data is in your site, if you have some obscure property editors or if your site is super old and has some strange database configurations. My site is super old, from v4, and over the many years I’ve managed to wrangle it through the version upgrades and it also worked on v8 (after a few v8 patch releases were out to deal with old schema issues). If this doesn’t work, you may be prompted with a detailed error specifically telling you why (i.e. you have obsolete property editors installed), or it might just fail due to old schema problems. For the latter problem, perhaps some of these tickets might help you resolve it.

When you get this to work, it’s a good time to make a backup of your local DB. Close down the running website and the tool you used to launch it, then make a backup of the Umbraco.mdf and Umbraco_log.ldf files.

Fix your own code

You will probably have noticed that the site now runs, you can probably access the back office (maybe?!) but your site probably has YSODs. This is most likely because:

  • Your views and c# code needs to be updated to work with the v8 APIs (remember to rename your _App_Code folder back to App_Code if you use it!)
  • Your packages need to be re-installed or upgraded or migrated into your new website with compatible v8 versions

This part of the migration process is going to be different for everyone. Basic sites will generally be pretty simple, but if you are using lots of packages or custom code or a Visual Studio web application and/or additional class libraries, then there’s some manual work involved on your part. My recommendation is that each time you fix part of your site you create a Git commit. You can always revert to previous commits if you want and you also have a backup of your v8 database if you ever need to revert that too. The API changes from v7 –> v8 aren’t too huge, you’ll have your local site up and running in no time!

Rebuild your deploy files

Create a git commit before continuing.

Now that your site is working locally in v8, it’s time to prep everything to be sent to Umbraco Cloud.

Since you are now running a newer version of Umbraco Deploy you’ll want to re-generate all of the deploy files. You can do this by starting up your local site again, then opening the command prompt and navigating to the /data folder of your website. Then type in:

echo > deploy-export

All of your schema items will be re-exported to new deploy files.

Create a git commit before continuing.

Push to Dev

In theory if your site is working locally then there’s no reason why it won’t work on your Dev site once you push it up to Cloud. Don’t worry though, if all else fails, you can always revert back to a working commit for your site.

So… go ahead and push!

Once that is done, the status bar on the Cloud portal will probably be stuck at the end stage saying it’s trying to process Deploy files… but it will just hang there because it’s not able to. This is because your site is now in Upgrade mode since we’ve manually upgraded.

At this stage, you are going to need to log in to Kudu. Go to the cmd prompt, navigate to /site/wwwroot/web.config and edit this file. The Umbraco.Core.ConfigurationStatus is going to be v8 already because that is what you committed to git and pushed to Cloud, but we need Umbraco to detect that an upgrade is required, so change this value to the v7 version you originally had (this is important!). While you are here, ensure that debug = true and customErrors = Off so you can see any errors that might occur.

Now visit the root of the site, you should be redirected to the login screen and then to the upgrade screen. With some more luck, this will ‘just work’!

Because the Deploy files couldn’t be processed when you first pushed (because the site was in upgrade mode), you need to re-force the deploy files to be processed, so go back to the Kudu cmd prompt, navigate to /site/wwwroot/data and type in:

echo > deploy

Test

Make sure your dev site is working as you would expect it to. There’s a chance you might have missed some code that needs changing in your views or other code. If that is the case, make sure you fix it first on your local site, test there and then push back up to Dev and then test again there. Don’t push to a higher environment until you are ready.

Push to Live

You might have other environments between Dev and Live so you need to follow the same steps as pushing to Dev (i.e. once you push you will need to go to Kudu and change the web.config version, debug and custom error mode). Pushing to Live is the same approach but of course your live site is going to incur some downtime. If you’ve practiced with a Staging site, you’ll know how much downtime to expect; in theory it could be as low as a couple of minutes but of course if something goes wrong it could be longer.

… And Hooray! You are live on v8 :)

Before you go, there’s a few things you’ll want to do:

  • log back into kudu on your live site and in your web.config turn off debug and change custom errors back to RemoteOnly
  • be sure to run “echo > deploy”
  • in kudu delete the temp file folder: App_Data/Temp
  • Rebuild your indexes via the back office dashboard
  • Rebuild your published caches via the back office dashboard

What if something goes wrong?

I mentioned above that you can revert to a working copy, but how? Well, this happened to me since I don’t follow my own instructions and I forgot to get rid of the data types with Obsolete property editors on Live, which means all of my environments were not totally synced before I started since I had fixed that on Dev. When I pushed to Live and then ran the upgrader, it told me that I had data types with old Obsolete property editors… well, in that scenario there’s nothing I could do about it since I can’t log in to the back office and change anything. So I had to revert the Live site to the commit before the merge from Dev –> Live. Luckily all database changes with the upgrader are done in a transaction so your live data isn’t going to be changed unless the upgrader successfully completes.

To rollback, I logged into Kudu and on the home page there is a link to “Source control info” where you can get the git endpoint for your Live environment. Then I cloned that down locally, reverted the merge, committed and pushed back up to the live site. Now the live site was just back to its previous v7 state and I could make the necessary changes. Once that was done, I reverted my revert commit locally and pushed back to Live, and went through the upgrade process again.

Next steps?

Now your site is live on v8 but there’s probably more to do for your solution. If you are like me, you will have a Visual Studio solution with a web application to power your website. I then run this locally and publish to my local file system – which just happens to be the location of my cloned git repo for my Umbraco Cloud Dev site – then I push those changes to Cloud. So now I needed to get my VS web application to produce the same binary output as Cloud. That took a little bit to figure out since Umbraco Cloud includes some extra DLLs/packages that are not included in the vanilla Umbraco Cms package, namely this one: “Serilog.Sinks.MSSqlServer - Version 5.1.3-dev-00232”, so you’ll probably need to include that as a package reference in your site too.

That’s about as far as I’ve got myself, best of luck!

Articulate 4.0.0 released for Umbraco version 8

May 3, 2019 02:37

It’s finally out in the wild! Articulate 4.0.0 is a pretty huge release so here’s the rundown…

Installation

As a developer, my recommendation is to install packages with Nuget

PM > Install-Package Articulate -Version 4.0.0

If you install from Nuget you will not automatically get the Articulate data structures installed because Nuget can’t talk to your live website/database. So once you’ve installed the package and run your site, head over to the “Settings” section and you’ll see an “Articulate” dashboard there; click on the “Articulate Data Installer” tile and you’ll get all the data structures and a demo blog installed.

Alternatively you can install it directly from the Umbraco back office by searching for “Articulate” in the Packages section, or you can download the zip from https://our.umbraco.com/packages/starter-kits/articulate/ and install that in the Umbraco back office. If you install this way all of the data structures will be automatically installed.

Upgrading

I have no official documentation or way of doing this right now 😉. I’ve written up some instructions on the GitHub release here but essentially it’s going to require you to do some investigations and manual updates yourselves. There are very few schema changes and only a small number of model changes so it shouldn’t be too painful. Good luck!

(note: I have yet to give it a try myself)

Support for Umbraco 8

I think it will come as no surprise that Articulate 4.0.0 is not compatible with any Umbraco v7 version. Articulate 4.0.0 requires a minimum of Umbraco 8.0.2. Moving forward I will only release more Articulate 3.x versions to support v7 based on community pull requests; my future efforts will be solely focused on 4.x and above for Umbraco 8+.

Theme, Features + Bug fixes

There are several nice bug fixes in this release including a few PRs sent in by the community – THANK YOU! 🤗

As for features, this is really all about updating the Themes. Previously Articulate shipped with 6 themes and all of them had a vast range of different features which I never really liked, so I spent some time enhancing all of the ones I wanted to keep and made them look a bit prettier too. I’ve removed my own “Shazwazza” theme since it was way outdated compared to my own site here, plus I don’t really want other people to have the exact same site as me ;) But since that was the most feature-rich theme I had to upgrade the other ones. I also removed the old ugly Edictum theme… pretty sure nobody used that one anyway.

Here’s the theme breakdown (it’s documented too)

image

I’ve also updated the default installation data to contain more than one blog post and an author profile so folks can see a better representation of the blog features on install. And I updated the default images and styling so it has a theme (which is Coffee ☕) and is less quirky (no more bloody rabbit or horse face photos 😛)

Here’s the breakdown of what they look like now…

VAPOR

This is the default theme installed; it is a very clean & simple theme, originally created by Seth Lilly.

theme-vapor

Material

This is based off of Google's Material Design Lite and their blog template.

theme-material

Phantom

Original theme for Ghost can be found here: https://github.com/Bartinger/phantom/. A nice simple responsive theme.

theme-phantom

Mini

The original author's site can be found here: http://www.thyu.org/www/ but unfortunately their demo site for the Ghost theme is down. The theme's repository is here https://github.com/thyu/minighost.

theme-mini

 

Hope you enjoy the updates!

Need to remove an auto-routed controller in Umbraco?

April 11, 2019 05:06

Umbraco will auto-route some controllers automatically. These controllers are any MVC SurfaceControllers or WebApi UmbracoApiController types discovered during startup. There might be some cases where you just don’t want these controllers to be routed at all, maybe a package installs a controller that you’d rather not have routable or maybe you want to control if your own plugin controllers are auto-routed based on some configuration.

The good news is that this is quite easy by just removing these routes during startup. There are various ways you could do this, but below I’ve shown one way to interrogate the routes that have been created and remove the ones you don’t want:

Version 8


//This is required to ensure this composer runs after
//Umbraco's WebFinalComposer which is the component
//that creates all of the routes during startup    
[ComposeAfter(typeof(WebFinalComposer))]
public class MyComposer : ComponentComposer<MyComponent>{ }

//The component that runs after WebFinalComponent
//during startup to modify the routes
public class MyComponent : IComponent
{
    public void Initialize()
    {
        //list the routes you want removed, in this example
        //this will remove the built in Umbraco UmbRegisterController
        //and the TagsController from being routed.
        var removeRoutes = new[]
        {
            "/surface/umbregister",
            "/api/tags"
        };

        foreach (var route in RouteTable.Routes.OfType<Route>().ToList())
        {
            if (removeRoutes.Any(r => route.Url.InvariantContains(r)))
                RouteTable.Routes.Remove(route);
        }
    }

    public void Terminate() { }
}

Version 7

public class MyStartupHandler : ApplicationEventHandler
{
    protected override void ApplicationStarted(
        UmbracoApplicationBase umbracoApplication,
        ApplicationContext applicationContext)
    {

        //list the routes you want removed, in this example
        //this will remove the built in Umbraco UmbRegisterController
        //and the TagsController from being routed.
        var removeRoutes = new[]
        {
            "/surface/umbregister",
            "/api/tags"
        };

        foreach(var route in RouteTable.Routes.OfType<Route>().ToList())
        {
            if (removeRoutes.Any(r => route.Url.InvariantContains(r)))
                RouteTable.Routes.Remove(route);
        }
    }
}

Umbraco Down Under Festival 2019

March 4, 2019 05:00

I had the pleasure of attending and speaking at this year’s Umbraco Down Under Festival which was fantastic! Thanks to everyone at KØBEN digital for putting on such a nice event, as well as to all of the sponsors Zero Seven, Tea Commerce and Luminary for helping make it all happen. And what great timing to have an Umbraco festival just after Umbraco v8 is launched! Big thanks to Niels Hartvig for coming all the way from Denmark, it certainly means a lot to us Australians (yes, I am one too even with this Canadian accent!).

Hackathon

We had quite a few people at the Hackathon this year (18!) and we were able to close 3 issues and merge 6 Pull Requests along with finding and submitting 4 other issues, all for Umbraco v8, great work! Looking forward to seeing the Australian community submit even more PRs for v8 and hope to see you all at the Australian Umbraco meetups :)

image

Slide Deck

My talk this year was on Umbraco Packages in v8, though much of it was really about transitioning to v8 in general.

Here is the rendered PDF version of my slides, of course it doesn’t come with all of the nice transitions but it’s better than nothing. My slides were done with the brilliant GitPitch service which I absolutely love. Not only does it make presenting code so much nicer/easier, it just makes sense to me as a developer since I can just write my slides in Markdown and style them with CSS. Plus having your slide deck in Git makes building new slides out of old slides quite nice since all your history is there!

I tried to break down the talk into 3 sections: Migrating, Building and Packaging.

Migrating

“Migrating” was a bit of a walk through between some fundamental things that have changed between v7 and v8 that not only package developers will need to be aware of but anyone making the transition from v7 to v8.

Building

“Building” showcased some new features for packages and v8, though I didn’t talk about one of the major v8 features: Content Apps, because Robert Foster was already doing a talk all about them in the morning. Instead I focused on how Dashboards work in v8 and a couple currently undisclosed v8 features: Full Screen Sections (sans c#) and Package Options.

Packaging

“Packaging” may have been a bit rushed but I thought I was going to go overtime :P I talked about the new packager UI in v8 and that it is certainly possible to build packages for CI/CD integration with PowerShell scripts to build an Umbraco package from a command line. I’d like to make this far more straightforward than it is now, which is something I’ll work on this year. You can find this PowerShell script here and a newer example in Articulate here. Lastly I mentioned that there is a disconnect between the Umbraco package format and the Nuget package format with regards to installing Umbraco data and that it would be nice to make both of these work seamlessly as one thing… and this is certainly possible. I created a PR a very long time ago to address this called Package Migrations (even though it says Morten made it … he actually just submitted the PR ;) ). I need to write a blog post about what this is and how it is intended to work so we can hopefully get some traction on this this year. The very brief overview is that package updates would work very similarly to Umbraco updates: if an update is detected that requires a migration, the installer will execute to guide the user through the process and to provide UI feedback if anything fails along the way. This way package developers can properly leverage the Migrations system built into Umbraco and Umbraco data will happily be installed on startup even if you install a package from Nuget.

The main barrier currently is that Umbraco Cloud will need to natively support it, otherwise people will get the installer screen on every environment when they push a package update upstream, which is not great; Umbraco Cloud should instead run the migration on the upstream environment in the background just like it does with Umbraco updates.

Lastly I talked about how Articulate currently manages this situation between the Umbraco package format and the Nuget package format.

UDUF 2020

Looks like UDUF is moving to Sydney next year, so we’ll see you all there!

Configuring Azure Active Directory login with Umbraco Members

February 18, 2019 02:09

This post is about configuring Azure Active Directory with Umbraco Members (not Users), meaning this is for your front-end website, not the Umbraco back office. I did write up a post about Azure AD with back office users though, so if that is what you are looking for then this is the link.

Install the Nuget packages

First thing to do is get the UmbracoIdentity package installed.

PM > Install-Package UmbracoIdentity

(This will also install the UmbracoIdentity.Core base package)

This package installs some code snippets and updates your web.config to enable ASP.Net Identity for Umbraco members. Umbraco ships with the old and deprecated ASP.Net Membership Providers for members and not ASP.Net Identity, so this package extends the Umbraco CMS and the Umbraco members implementation to use ASP.Net Identity APIs to interact with the built-in members data store. Installing this package will remove the (deprecated) FormsAuthentication module from your web.config and it will no longer be used to authenticate members, so the typical members snippets built into Umbraco macros will not work. Instead use the supplied snippets shipped with this package.

To read more about this package see the GitHub repo here.

Next, the OpenIdConnect package needs to be installed

PM > Install-Package Microsoft.Owin.Security.OpenIdConnect

Configure Azure Active Directory

Head over to the Azure Active Directory section on the Azure portal, choose App Registrations (I’m using the Preview functionality for this) and create a New registration

image

Next fill out the app details

image

You may also need to enter other redirect URLs depending on how many different environments you have. All of these URLs can be added in the Authentication section of your app in the Azure portal.

For AAD configuration for front-end members, the redirect URLs are just your website’s root URL and it is advised to keep the trailing slash.

Next you will need to enable Id Tokens

image

Configure OpenIdConnect

The UmbracoIdentity package will have installed an OWIN startup class in ~/App_Start/UmbracoIdentityStartup.cs (or it could be in App_Code if you are using a website project). This is how ASP.Net Identity is configured for front-end members and where you can specify the configuration for different OAuth providers. There are a few things you’ll need to do:

Allow external sign in cookies

If you scroll down to the ConfigureMiddleware method, there will be a line of code to uncomment: app.UseExternalSignInCookie(DefaultAuthenticationTypes.ExternalCookie); this is required for any OAuth providers to work.

Enable OpenIdConnect OAuth for AAD

You’ll need to add this extension method class to your code, which is some boilerplate code to configure OpenIdConnect with AAD:

public static class UmbracoADAuthExtensions
{
    public static void ConfigureAzureActiveDirectoryAuth(this IAppBuilder app,
        string tenant, string clientId, string postLoginRedirectUri, Guid issuerId,
        string caption = "Active Directory")
    {
        var authority = string.Format(
            System.Globalization.CultureInfo.InvariantCulture,
            "https://login.windows.net/{0}",
            tenant);

        var adOptions = new OpenIdConnectAuthenticationOptions
        {
            ClientId = clientId,
            Authority = authority,
            RedirectUri = postLoginRedirectUri
        };

        adOptions.Caption = caption;
        //Need to set the auth type as the issuer path
        adOptions.AuthenticationType = string.Format(
            System.Globalization.CultureInfo.InvariantCulture,
            "https://sts.windows.net/{0}/",
            issuerId);
        app.UseOpenIdConnectAuthentication(adOptions);
    }
}

Next you’ll need to call this code, add the following line underneath the app.UseExternalSignInCookie method call:

app.ConfigureAzureActiveDirectoryAuth(
    ConfigurationManager.AppSettings["azureAd:tenantId"],
    ConfigurationManager.AppSettings["azureAd:clientId"],
    //The value of this will need to change depending on your current environment
    postLoginRedirectUri: ConfigurationManager.AppSettings["azureAd:redirectUrl"],
    //This is the same as the TenantId
    issuerId: new Guid(ConfigurationManager.AppSettings["azureAd:tenantId"]));

Then you’ll need to add a few appSettings to your web.config (based on your AAD info):

<add key="azureAd:tenantId" value="YOUR-TENANT-ID-GUID" />
<add key="azureAd:clientId" value="YOUR-CLIENT-ID-GUID" />
<add key="azureAd:redirectUrl" value="http://my-test-website/" />

Configure your Umbraco data

The UmbracoIdentity repository has the installation documentation and you must follow these 2 instructions, and they are very simple:

  1. You need to update your member type with the securityStamp property
  2. Create the Account document type

Once that is done you will have a Member account management page which is based off of the installed views and snippets of the UmbracoIdentity package. This account page will look like this:

image

As you can see the button text under “Use another service to log in” is the login provider name which is a bit ugly. The good news is that this is easy to change since this is just a partial view that was installed with the UmbracoIdentity package. You can edit the file: ~/Views/UmbracoIdentityAccount/ExternalLoginsList.cshtml; the code to render that button text uses @p.AuthenticationType (the provider name) but we can easily change this to @p.Caption which is actually the same caption text used in the extension method we created. So the whole button code can look like this instead:


<button type="submit" class="btn btn-default"
        id="@p.AuthenticationType"
        name="provider"
        value="@p.AuthenticationType"
        title="Log in using your @p.Caption account">
    @p.Caption
</button>

This is a bit nicer, now the button looks like:

image

The purpose of all of these snippets and views installed with UmbracoIdentity is for you to customize how the whole flow looks and works so you’ll most likely end up customizing a number of views found in this folder to suit your needs.

That’s it!

Once that’s all configured, if you click on the Active Directory button to log in as a member, you’ll be brought to the standard AAD permission screen:

image

Once you accept you’ll be redirect back to your Account page:

image

Any customization is then up to you. You can modify how the flow works, whether or not you accept auto-linking accounts (like in the above example), or whether you require a member to exist locally before being able to link an OAuth account, etc… All of the views and controller code in UmbracoIdentity is there for you to manipulate. The main files are:

  • ~/Views/Account.cshtml
  • ~/Views/UmbracoIdentityAccount/*
  • ~/Controllers/UmbracoIdentityAccountController.cs
  • ~/App_Start/UmbracoIdentityStartup.cs


Happy coding!

Easily setup your Umbraco installation with IoC / Dependency Injection

December 19, 2017 04:00

Umbraco supports allowing you to setup and configure any IoC container type that you want to use in your application. For a while now there’s been some sparse documentation on how to achieve this which you can find here: https://our.umbraco.org/Documentation/reference/using-ioc. As the Umbraco core codebase evolves, sometimes a new non-parameterless constructor is added to a class and sometimes this can confuse an existing container that you’ve set up. For many folks, fixing these errors after upgrading is a trial and error experience until they track down the dependency that is now missing from their container and finally add it.

Simone, a very helpful Umbracian, made a comment on the issue tracker and it’s something that is just so obvious  (http://issues.umbraco.org/issue/U4-9562#comment=67-41855):

I think the point here is:  as user of a framework, I shouldn't need to wire up dependencies for internals of the framework myself. I should only bother about my own dependencies.
Maybe Umbraco should ship a small extension method for each of the main IoC container out there which wires up all the internals.
Or come with a IoC container out of the box and then everyone using umbraco have to use that one.

Yes of course this should be done!

A new community project: Our.Umbraco.IoC

I decided to get the ball rolling with this one and have set up a new Git repo here: https://github.com/Shazwazza/Our.Umbraco.IoC

Currently there are 2 different container configurations committed and working for Autofac and LightInject.

I’ve added some notes to the readme on how to contribute and get started, so I’m hoping that some folks can create some Pull Requests to add support for more containers. The project is very easy to navigate; it’s got a build script and Nuget packages set up.

Give it a go!

I’ve published some betas to Nuget:

Install-Package Our.Umbraco.IoC.Autofac
Install-Package Our.Umbraco.IoC.LightInject

You can actually install both and test each one independently by disabling either one with an appSetting:

<add key="Our.Umbraco.IoC.Autofac.Enabled" value="false" />

Or

<add key="Our.Umbraco.IoC.LightInject.Enabled" value="false" />

If this config key doesn’t exist, it will assume the value is “true”
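For illustration, that kind of ‘enabled unless explicitly disabled’ check only takes a few lines (a sketch, not necessarily the exact code inside Our.Umbraco.IoC):

// A minimal sketch: treat a missing appSetting as 'enabled', and only disable the
// container when the key is explicitly set to "false".
using System;
using System.Configuration;

public static class ContainerConfig
{
    public static bool IsEnabled(string appSettingKey)
    {
        var value = ConfigurationManager.AppSettings[appSettingKey];
        return value == null || !value.Equals("false", StringComparison.OrdinalIgnoreCase);
    }
}

// usage: ContainerConfig.IsEnabled("Our.Umbraco.IoC.Autofac.Enabled")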

Using the container

Once you’ve got your desired package installed, it will be active in your solution (unless you disable it via config). At this stage you’ll want to add your own bits to the container, so here’s how you do that:

  • Create a custom Umbraco ApplicationEventHandler
  • Override ApplicationInitialized – we do this in this phase to bind to the container event before the container is built which occurs in the ApplicationStarted phase
  • Bind to the container event
  • add any custom services you want to the container

Here’s a full working example showing various techniques, and it includes the syntax for both LightInject and Autofac. In this example we’re registering an IServerInfoService as a request-scoped object since it requires an HttpRequestBase. NOTE: the basic web objects are already registered in the containers (such as HttpContextBase, HttpRequestBase, etc…)


public class MyUmbracoStartup : ApplicationEventHandler
{
    protected override void ApplicationInitialized(UmbracoApplicationBase umbracoApplication, ApplicationContext applicationContext)
    {
        //If you are using Autofac:
        AutofacStartup.ContainerBuilding += (sender, args) =>
        {
            //add our own services
            args.Builder.RegisterControllers(typeof(TestController).Assembly);
            args.Builder.RegisterType<ServerInfoService>().As<IServerInfoService>().InstancePerRequest();
        };

        //If you are using LightInject:
        LightInjectStartup.ContainerBuilding += (sender, args) =>
        {
            //add our own services
            args.Container.RegisterControllers(typeof(TestController).Assembly);
            args.Container.Register<IServerInfoService, ServerInfoService>(new PerRequestLifeTime());
        };
    }
}

//service
public interface IServerInfoService
{
    string GetValue();
}

//implementation of the service
public class ServerInfoService : IServerInfoService
{
    private readonly HttpRequestBase _umbCtx;

    //requires a request based object so this must be scoped to a request
    public ServerInfoService(HttpRequestBase umbCtx)
    {
        _umbCtx = umbCtx;
    }

    public string GetValue()
    {
        var sb = new StringBuilder();
        sb.AppendLine("Server info!").AppendLine();
        foreach (var key in _umbCtx.ServerVariables.AllKeys)
        {
            sb.AppendLine($"{key} = {_umbCtx.ServerVariables[key]}");
        }
        return sb.ToString();
    }
}

public class TestController : SurfaceController
{
    private readonly IServerInfoService _serverInfoService;

    public TestController(IServerInfoService serverInfoService, UmbracoContext umbCtx): base(umbCtx)
    {
        _serverInfoService = serverInfoService;
    }

    //see /umbraco/surface/test/index to see the result
    public ActionResult Index()
    {
        return Content(_serverInfoService.GetValue(), "text/plain");
    }
}

Happy holidays!

Isolated WebApi attribute routing

January 17, 2016 12:39

Attribute routing in ASP.Net WebApi is great and makes routing your controllers quite a bit more elegant than writing routes manually. However, one problem I have with it is that it is either “on” or “off” at an application level. There is no way for a library developer to tell ASP.Net to create routes based on attributes for specific controllers or assemblies without forcing the consumer of that library to enable attribute routing for the whole application. In many cases this might not matter, but if you are creating a package or library that contains its own API routes, you probably don’t want to interfere with a developer’s normal application setup. There should be no reason why they need to be forced to turn on attribute routing in order for your product to work, and similarly they might not want your routes automatically enabled.

The good news is that this is possible. With a bit of code, you can route your own controllers with attribute routing and be able to turn them on or off without affecting the default application attribute routes. A full implementation of this has been created for the Umbraco RestApi project so I’ll reference that source in this post for the following code examples.

Show me the code

The key to getting this to work is: IDirectRouteProvider and IDirectRouteFactory.

The first thing we need is a custom IDirectRouteFactory which is actually a custom attribute. I’ve called this CustomRouteAttribute  but you could call it whatever you want.

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, AllowMultiple = true)]
public class CustomRouteAttribute : Attribute, IDirectRouteFactory

This custom attribute just wraps the default WebApi RouteAttribute’s IDirectRouteFactory implementation so we don’t have to re-write any code for that.

(see full implementation here)
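In condensed form, the wrapping looks roughly like this (a sketch; the linked implementation includes more constructor overloads and details):

// A condensed sketch of the wrapping idea: delegate CreateRoute to an inner
// RouteAttribute so we reuse its IDirectRouteFactory logic without rewriting it.
using System;
using System.Web.Http;
using System.Web.Http.Routing;

[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, AllowMultiple = true)]
public class CustomRouteAttribute : Attribute, IDirectRouteFactory
{
    private readonly RouteAttribute _inner;

    public CustomRouteAttribute(string template)
    {
        _inner = new RouteAttribute(template);
    }

    RouteEntry IDirectRouteFactory.CreateRoute(DirectRouteFactoryContext context)
    {
        // RouteAttribute implements IDirectRouteFactory explicitly, so cast to call it
        return ((IDirectRouteFactory)_inner).CreateRoute(context);
    }
}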

Next we’ll create a custom IDirectRouteProvider:

/// <summary>
/// This is used to lookup our CustomRouteAttribute instead of the normal RouteAttribute so that 
/// we can use the CustomRouteAttribute instead of the RouteAttribute on our controllers so the normal
/// MapHttpAttributeRoutes method doesn't try to route our controllers - since the point of this is
/// to be able to map our controller routes with attribute routing explicitly without interfering
/// with default application routes.
/// </summary>
public class CustomRouteAttributeDirectRouteProvider : DefaultDirectRouteProvider
{
    private readonly bool _inherit;

    public CustomRouteAttributeDirectRouteProvider(bool inherit = false)
    {
        _inherit = inherit;
    }

    protected override IReadOnlyList<IDirectRouteFactory> GetActionRouteFactories(HttpActionDescriptor actionDescriptor)
    {
        return actionDescriptor.GetCustomAttributes<CustomRouteAttribute>(inherit: _inherit);
    }
}

So far this is all pretty straightforward, but here’s where things start to get interesting. Because we only want to create routes for specific controllers, we need to use a custom IHttpControllerTypeResolver. However, since the HttpConfiguration instance only contains a single reference to the IHttpControllerTypeResolver we need to do some hacking. The route creation process for attribute routing happens during the HttpConfiguration initialization, so we need to create an isolated instance of HttpConfiguration, set it up with the services we want to use, initialize it to create our custom routes and assign those custom routes back to the main application’s HttpConfiguration.

Up first, we create a custom IHttpControllerTypeResolver to only resolve the controller we’re looking for:

public class SpecificControllerTypeResolver : IHttpControllerTypeResolver
{
    private readonly IEnumerable<Type> _controllerTypes;

    public SpecificControllerTypeResolver(IEnumerable<Type> controllerTypes)
    {
        if (controllerTypes == null) throw new ArgumentNullException("controllerTypes");
        _controllerTypes = controllerTypes;
    }

    public ICollection<Type> GetControllerTypes(IAssembliesResolver assembliesResolver)
    {
        return _controllerTypes.ToList();
    }
}

Before we look at initializing a separate instance of HttpConfiguration, let’s look at the code you’d use to enable all of this in your startup code:

//config = the main application HttpConfiguration instance
config.MapControllerAttributeRoutes(
    routeNamePrefix: "MyRoutes-",
    //Map these explicit controllers in the order they appear
    controllerTypes: new[]
    {                    
        typeof (MyProductController),
        typeof (MyStoreController)
    });

The above code will enable custom attribute routing for the 2 specific controllers. These controllers will be routed with attribute routing but instead of using the standard [Route] attribute, you’d use our custom [CustomRoute] attribute. The MapControllerAttributeRoutes extension method is where all of the magic happens, here’s what it does:

  • Iterates over each controller type
  • Creates an instance of HttpConfiguration
  • Sets its IHttpControllerTypeResolver instance to SpecificControllerTypeResolver for the current controller iteration (the reason an instance of HttpConfiguration is created for each controller is to ensure that the routes are created in the order in which they are specified in the above code snippet)
  • Initializes the HttpConfiguration instance to create the custom attribute routes
  • Copies these routes back to the main application’s HttpConfiguration route table
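A simplified sketch of what that extension method can look like is below (this is an approximation, not the actual implementation from the Umbraco RestApi project, which also handles route handlers and optional per-route callbacks):

// A simplified sketch: create an isolated HttpConfiguration per controller, let it
// generate the [CustomRoute] attribute routes, then copy those routes into the real
// application's route table.
using System;
using System.Collections.Generic;
using System.Web.Http;
using System.Web.Http.Dispatcher;

public static class CustomRouteExtensions
{
    public static void MapControllerAttributeRoutes(this HttpConfiguration config,
        string routeNamePrefix, IEnumerable<Type> controllerTypes)
    {
        var routeIndex = 0;
        foreach (var controllerType in controllerTypes)
        {
            // isolated config so only this controller's CustomRouteAttributes are routed
            var tempConfig = new HttpConfiguration();
            tempConfig.Services.Replace(typeof(IHttpControllerTypeResolver),
                new SpecificControllerTypeResolver(new[] { controllerType }));
            tempConfig.MapHttpAttributeRoutes(new CustomRouteAttributeDirectRouteProvider());
            tempConfig.EnsureInitialized(); // forces the attribute routes to be created now

            // copy the generated routes back to the main application's route table
            foreach (var attributeRoute in tempConfig.Routes)
            {
                config.Routes.Add(routeNamePrefix + routeIndex++, attributeRoute);
            }
        }
    }
}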

You can see the full implementation of this extension method here which includes code comments and more details on what it’s doing.  The actual implementation of this method also allows for some additional parameters and callbacks so that each of these routes could be customized if required when they are created.

 

There is obviously a bit of code involved to achieve this and there could very well be a simpler way, however this implementation does work rather well and offers quite a lot of flexibility. I’d certainly be interested to hear if other developers have figured this out and what their solutions were.

Powershell script to create an Umbraco package in Umbraco’s native file format

November 11, 2015 10:35

Since I like using PowerShell for my build scripts for various projects, I thought it would be handy to create a PowerShell script to create an Umbraco package in its native package format. I’ve added this script template to my GitHub Umbraco Scripts repository which you can see here: http://bit.ly/1kM9g9g

I’ve tried to add a bit of documentation to this script and in theory you should only need to set up the paths properly. You’ll also need to ensure that you initially create the package in the back office of Umbraco in order to generate the createdPackages.config as mentioned in the script notes.

I use this script in Articulate’s build script so at least I can also provide an example of using it which you can see here: https://github.com/Shazwazza/Articulate/blob/master/build/build.ps1