
I have created a custom module as a PowerShell class following, roughly, the instructions available at Writing a custom DSC resource with PowerShell classes. The intent is to connect to Azure File Storage and download some files. I am using Azure Automation DSC as my pull server.

Let me start by saying that, when run through the PowerShell ISE, the code works a treat. Something goes wrong when I upload it to Azure, though: I get the error Unable to find type [CloudFileDirectory]. This type specifier comes from assemblies referenced through the Azure.Storage module, which is definitely in my list of Automation assets.

At the tippy top of my psm1 file I have

Using namespace Microsoft.WindowsAzure.Storage.File

[DscResource()]
class tAzureStorageFileSync
{
    ...
    # Create the search context
    [CloudFileDirectory] GetBlobRoot()
    {
        ...
    }
    ...
}

I'm not sure whether this using statement is supported in this scenario or not, so let's call that Question 1.

To date I have tried:

  • Adding RequiredModules = @( "Azure.Storage" ) to the psd1 file
  • Adding RequiredAssemblies = @( "Microsoft.WindowsAzure.Storage.dll" ) to the psd1 file
  • Shipping the actual Microsoft.WindowsAzure.Storage.dll file in the root of the module zip that I upload (that has a terrible smell about it)
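For reference, a minimal sketch of the manifest entries from the attempts above (the module name, version, and resource name here are placeholders, not from a working setup):

# tAzureStorageFileSync.psd1 -- illustrative fragment only
@{
    RootModule           = 'tAzureStorageFileSync.psm1'
    ModuleVersion        = '1.0.0'
    DscResourcesToExport = @('tAzureStorageFileSync')
    RequiredModules      = @('Azure.Storage')
    RequiredAssemblies   = @('Microsoft.WindowsAzure.Storage.dll')
}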

When I deploy the module to Azure with New-AzureRmAutomationModule it uploads and processes just fine. The Extracting activities... step works and gives no errors.
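For context, the deployment step looks roughly like this (the resource group, account name, and zip URL are placeholders; the zip must be reachable from Azure, e.g. via a blob SAS URL):

New-AzureRmAutomationModule `
    -ResourceGroupName 'my-rg' `
    -AutomationAccountName 'my-automation' `
    -Name 'tAzureStorageFileSync' `
    -ContentLink 'https://example.blob.core.windows.net/modules/tAzureStorageFileSync.zip'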

When I compile a configuration, however, the compilation process fails with the Unable to find type error I mentioned.

I have contemplated adding an Import-Module Azure.Storage above the class declaration, but I've never seen that done anywhere else before.

Question 2 Is there a way I can compile locally using a similar process to the one used by Azure DSC so I can test changes more quickly?

Question 3 Does anyone know what is going wrong here?


3 Answers


Question 1/3: If you create classes in PowerShell that use other classes within, ensure that those classes are present BEFORE loading the script file that contains your new class.

For example, a Loader.ps1:

Import-Module Azure.Storage
. .\MyDSC-Class.ps1

PowerShell resolves all the types you refer to while interpreting the script, so every type must be loaded before that happens. You can arrange this by creating a script file that loads all the dependencies first and then dot-sources your script.


For question 2: if you register your machine as a hybrid worker, you'll be able to run the script faster and compile locally. (For more details on hybrid workers, see https://azure.microsoft.com/en-us/documentation/articles/automation-hybrid-runbook-worker/.)

If you want an easy way to register the hybrid worker, you can run this script on your local machine (https://github.com/azureautomation/runbooks/blob/master/Utility/ARM/New-OnPremiseHybridWorker.ps1). Just make sure you have WMF 5 installed on your machine beforehand.
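An invocation of that registration script might look roughly like this (all values are placeholders, and the parameter names should be checked against the script's own help, as they may change between versions):

.\New-OnPremiseHybridWorker.ps1 `
    -AutomationAccountName 'my-automation' `
    -AAResourceGroupName 'my-rg' `
    -HybridGroupName 'local-dsc-test' `
    -SubscriptionID '00000000-0000-0000-0000-000000000000'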


For authoring DSC configurations and testing locally, I would look at the Azure Automation ISE Add-On, available at https://www.powershellgallery.com/packages/AzureAutomationAuthoringToolkit/0.2.3.6. You can install it by running the below command from an Administrator PowerShell ISE window:

Install-Module AzureAutomationAuthoringToolkit -Scope CurrentUser

For loading libraries, I have also noticed that I need to call Import-Module in order to be able to call methods. I need to do some research to determine the requirement for this. You can see an example I wrote to copy files from Azure Storage using a storage key at https://github.com/azureautomation/modules/tree/master/cAzureStorage

As you probably don't want to have to deploy the storage library on all nodes, I included the storage library in the sample module above so that it will be automatically distributed to all nodes by the automation service.

Hope this helps, Eamon