MSDyn365FO

Interpreting compiler results in D365FO using PowerShell

When you build your code, the results are hard to interpret and are capped at 1000 entries per category in the Visual Studio error pane. The compiler does, however, generate output files with more valuable content within each package. We have written PowerShell code for analyzing and interpreting the compiler results of Microsoft Dynamics 365 for Finance and Operations in a more meaningful way.

The BuildModelResult.log and XML files contain the details of your errors, warnings and tasks. The following script parses these files and counts the warnings and errors, giving you a better idea of the remaining work during your implementation or upgrade:

$displayErrorsOnly = $false # set to $true to list only packages that have errors
$rootDirectory = "C:\AOSService\PackagesLocalDirectory"

$results = Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue -Force
$totalErrors = 0
$totalWarnings = 0

foreach ($result in $results)
{
    try
    {
        # The log contains summary lines like "Errors: 3" and "Warnings: 12";
        # grab them and parse the trailing number.
        $errorText = Select-String -LiteralPath $result.FullName -Pattern '^Errors:' | ForEach-Object {$_.Line}
        $errorCount = [int]$errorText.Split()[-1]
        $totalErrors += $errorCount

        $warningText = Select-String -LiteralPath $result.FullName -Pattern '^Warnings:' | ForEach-Object {$_.Line}
        $warningCount = [int]$warningText.Split()[-1]
        $totalWarnings += $warningCount

        if ($displayErrorsOnly -eq $true -and $errorCount -eq 0)
        {
            continue
        }

        Write-Host "$($result.DirectoryName)\$($result.Name) " -NoNewline
        if ($errorCount -gt 0)
        {
            Write-Host " $errorText" -NoNewline -ForegroundColor Red
        }
        if ($warningCount -gt 0)
        {
            Write-Host " $warningText" -ForegroundColor Yellow
        }
        else
        {
            Write-Host
        }
    }
    catch
    {
        Write-Host
        Write-Host "Error while processing $($result.FullName)"
    }
}

Write-Host "Total Errors: $totalErrors" -ForegroundColor Red
Write-Host "Total Warnings: $totalWarnings" -ForegroundColor Yellow

The script displays the compiler results as an overview in the following format:

compiler results

If you want to do a more detailed analysis, we also have PowerShell scripts prepared for extracting the errors and saving them in a CSV file for further processing. Opening the CSV with Excel allows you to format the entries as a table.
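
The complete scripts are linked on GitHub below; as an illustration only, here is a minimal sketch of the CSV extraction, assuming that the detailed problem lines in BuildModelResult.log start with an "Error:" prefix and that the output path is adjusted to your environment:

$rootDirectory = "C:\AOSService\PackagesLocalDirectory"
$outputCsv = "C:\Temp\CompilerErrors.csv"   # example output path

Get-ChildItem -Path $rootDirectory -Filter BuildModelResult.log -Recurse -Depth 1 -ErrorAction SilentlyContinue -Force |
    ForEach-Object {
        $package = $_.Directory.Name   # the log sits directly in the package folder
        Select-String -LiteralPath $_.FullName -Pattern '^Error:' |
            ForEach-Object { [PSCustomObject]@{ Package = $package; Error = $_.Line } }
    } |
    Export-Csv -Path $outputCsv -NoTypeInformation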

We typically copy all error texts to a new sheet, remove the duplicate entries, then count how many times each remaining error occurs to see whether there is any low-hanging fruit to fix first. The formula below does the counting: for each unique error text in Sheet1 it sums how many rows of the raw export match it:

=SUMPRODUCT(('JADOperation-Warning'!$D$2:$D$1721=Sheet1!A2)+0)

This makes it much easier to decide what to fix first and what to fix next, and to delegate the tasks.

The source code for all four scripts is available on GitHub.

Archiving SQL database backups using Azure blob storage

It is good practice to keep multiple copies of our most precious data. If you are running on-premises SQL Server databases for AX 2012 or Dynamics 365 Finance and Operations, archiving SQL database backups to an offsite location is a must. I have built automation for archiving SQL database backups using Azure Blob Storage.

Overview of the processes

Maintenance regime

Our maintenance regime looks like the following:

  • 1x weekly full backup
  • 6x daily differential backups
  • Transaction log backups every 15 minutes

The backups are captured locally on the primary SQL instance, which keeps the timestamps of the last successful backups in our AlwaysOn cluster. We then move the files to a shared network storage that is visible to both High Availability sites, in case there is an outage and we need to fail over and restore data.
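
As an illustration only, a single cycle of this regime could look like the sketch below using the SqlServer PowerShell module; the instance, database, local path and share names are placeholders rather than our actual configuration:

Import-Module SqlServer   # provides Backup-SqlDatabase

$instance  = "SQLPRIMARY"          # placeholder instance name
$database  = "AXDB"                # placeholder database name
$localPath = "D:\Backup"           # local capture folder on the primary
$sharePath = "\\backupshare\sql"   # shared network storage visible to both HA sites

# Weekly full backup
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupAction Database -BackupFile "$localPath\${database}_full.bak"

# Daily differential backup
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupAction Database -Incremental -BackupFile "$localPath\${database}_diff.bak"

# Transaction log backup, scheduled every 15 minutes
Backup-SqlDatabase -ServerInstance $instance -Database $database -BackupAction Log -BackupFile "$localPath\${database}_log.trn"

# Move the finished files to the shared network storage
Copy-Item -Path "$localPath\*" -Destination $sharePath -Force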

Because of the close geographical proximity of the two sites, we needed an additional layer of safety in case of a natural disaster.

Archiving offsite

Every night we run a PowerShell script that uses the AzCopy utility to upload our backup files to a Microsoft Azure storage account.
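
The core of such a script can be as small as the sketch below; it uses the AzCopy v10 command-line syntax, and the share path, storage account, container name and SAS token are placeholders, not our actual values:

# Placeholders - replace with your own share, storage account, container and SAS token
$source      = "\\backupshare\sql"
$destination = "https://mystorageaccount.blob.core.windows.net/sqlbackups?<SAS-token>"

# AzCopy v10: upload the whole backup share into the blob container
& "C:\Tools\azcopy.exe" copy $source $destination --recursive=true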

You pay for the network usage (IO) and the space occupied on the disks, so it is a good idea to have some sort of housekeeping. Our solution was an Azure Runbook that determines what to keep and what to delete. The current setup keeps one full backup per quarter available for a year (4x 400 GB), plus all full / differential / transaction log files for the last month (4x 400 GB + 26x 10 GB).
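
A heavily simplified sketch of the retention logic in such a runbook is shown below, using the Az.Storage cmdlets; the account, key and container names are placeholders, and the real quarterly/yearly rules need additional date logic on top of this:

# Placeholders - use your own storage account, key and container
$ctx       = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<storage-account-key>"
$container = "sqlbackups"
$cutoff    = (Get-Date).ToUniversalTime().AddDays(-31)   # keep everything from the last month

# Delete aged blobs, but never touch the full backups kept under the long-term rule
Get-AzStorageBlob -Container $container -Context $ctx |
    Where-Object { $_.LastModified.UtcDateTime -lt $cutoff -and $_.Name -notlike "*_full*" } |
    ForEach-Object { Remove-AzStorageBlob -Blob $_.Name -Container $container -Context $ctx -Force }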

This puts the total occupied size at around 4 TB, which costs about 35 GBP a month using cold storage. The price could go up if you also want to use the hot storage tier for the latest backup set, which is useful if you want to come back from the cloud at short notice.


Dynamics Zero to Hero introduction vlog

As announced last week, the Dynamics Zero to Hero channel is now up and running on YouTube. The brief introduction video is up, and more are to follow in the upcoming days and weeks:

Being nervous does show, and it does not help with my accent :)
I am sure the content is going to make up for the lack of vlogging experience in the long run; hopefully you will like it.

If you have topics you would like me to touch on, feel free to leave a comment or send an e-mail.

Dynamics 365 Dubai Summit 2020

With only 1 week left until the Microsoft Dynamics 365 Dubai Summit 2020, I would like to remind you that there are still a couple of seats available for this amazing, free event.

Dubai Summit 2020

Dubai Summit 2020 website link

This is one of the largest conferences in the region dedicated to Dynamics 365, with 5 tracks available: Finance and Operations, Customer Engagement, Retail, Unified Operations and Power Platform.

It falls in line with the Dynamics 365 Saturdays event series, which has the goal of freely sharing the latest and greatest about our ERP platform and connected technologies. You can find more details at https://www.365portal.org/.

The Dubai Summit conference is a 365 Saturday on steroids: it spans 3 days with 700 participants, and the 29th of February is dedicated to various in-depth sessions held by fellow professionals and enthusiasts from Microsoft, customers and partners. Sunday will be a Power Platform bootcamp, and Monday a CDS core exam cram.

I will be presenting 2 sessions about the journey of moving the AX 2012 R3 implementation of JJ Food Services into the cloud. They will cover the business and technical aspects and challenges of getting a large-scale enterprise system up and running within Azure.

Hope to see you all soon at the Dynamics 365 Dubai Summit on 29th of February!

Synchronize Models between workspaces in MSDyn365FO

For our upgrade project we have decided to create 4 packages/models to start with, which is enough to separate the larger areas. Once the models had been created on my VM, I had to make them available to everyone. The goal was to synchronize the models between the workspaces of the other developers.

Package / Model     Description
JAD Core            Core application framework changes, essentially what is in Platform and Foundation
JAD Operation       Business application customizations; later on we might create additional models for larger functionality
JAD Integration     For exchanging data
JAD Reporting       For reporting, data warehousing, CDS and entities

The devil lies in the details, and for some reason the model was not showing up after synchronization. It turns out that the model description sits in an XML file within the Descriptor folder, which you need to manually include in source control, as explained in the documentation. Once it is checked in, all the other developers need to do to synchronize the models between workspaces is to get the latest changes, then click Dynamics 365 > Model Management > Refresh models.
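
For reference, checking in the descriptor from the command line could look like the sketch below; the tf.exe location and the model and descriptor file names are examples based on the JAD Core model above, not the exact paths of our project:

# Example paths - adjust the tf.exe location and the model/descriptor names to your setup
$tf         = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer\tf.exe"
$descriptor = "C:\AOSService\PackagesLocalDirectory\JADCore\Descriptor\JADCore.xml"

# Add the descriptor XML to pending changes, then check it in
& $tf add $descriptor
& $tf checkin $descriptor /comment:"Add JAD Core model descriptor" /noprompt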

Synchronize models between workspaces
