by Dejan Pajk

Ten Rules for a Successful Dynamics NAV Data Upgrade

Since I joined Fenwick Software in October 2015, I’ve completed five Dynamics NAV upgrade projects. Such projects typically start with the code (application) upgrade, which is followed by the data upgrade. In this blog, I’m sharing my take on the data upgrade process. Here are the top ten rules that made my upgrade projects successful:

  1. Follow the official script – The latest data upgrade script can be retrieved from the Microsoft MSDN website (e.g. “Upgrading the Data to Dynamics NAV 2017”).
  2. Understand the steps – Following the right upgrade steps is a necessity; however, we also need a good understanding of what is happening in NAV, so that we can properly verify the result of each step and resolve errors that might occur during the upgrade. Blindly following the steps is a risk to the upgrade project.
  3. Take notes – I always start my upgrade by putting all the upgrade steps into OneNote. This allows me to track details against them (e.g. comments, statuses and durations), to add custom upgrade steps (e.g. post-upgrade setup) and to share the notebook with the project team. This might seem like overkill to some; however, there are many good reasons for keeping a detailed record. The main idea is to make the upgrade process as repeatable and predictable as possible and to eliminate any surprises during the cutover.
  4. Measure durations – Measuring the duration of each upgrade step is especially recommended when upgrading a large database. The information gained will not only help us prepare a precise cutover plan; it is also a key input for upgrade optimisation. NAV offers standard PowerShell cmdlets (e.g. Get-NAVDataUpgrade) that return the status and duration of all upgrade functions invoked during the upgrade process.
  5. Optimise – If the total upgrade time exceeds the customer’s acceptable downtime window, we have to find innovative ways to optimise the upgrade. Strategies range from replacing upgrade functions with T-SQL scripts and deleting obsolete data before the upgrade (e.g. Change Log and Archive tables) to improving server performance by adding memory or faster drives. We encountered an interesting challenge on our latest upgrade project, where the data upgrade took more than three days to complete. After measuring durations, we discovered that the bottleneck was the upgrade function that renames and updates User IDs. A simple optimisation cut the duration of this function from 68 hours to 5 hours.
  6. Perform backups – We recommend a full database backup after every major data upgrade step. Backup compression should be enabled to speed up backups and save disk space. In addition, the database recovery model can be changed to Simple to free up log space. With this setup we lose the ability to do a point-in-time recovery; the most recent restore point will therefore be the latest complete backup.
  7. Take the latest CU – Let’s say we are upgrading from NAV 2009 to NAV 2017. It is necessary to first upgrade the data to NAV 2013 or NAV 2015. My recommendation is to upgrade to NAV 2015, simply because it is the more mature version (e.g. it contains upgrade codeunits and improved schema synchronisation). In addition, we should always take the latest cumulative update (CU) available, because it potentially contains upgrade toolkit bug fixes.
  8. Use Upgrade Codeunits – Upgrade codeunits were introduced with NAV 2015. Understanding and utilising them is by far the most important data upgrade skill. Firstly, upgrade codeunits provide instructions for synchronising schema changes (e.g. migrating data into upgrade tables); secondly, they contain upgrade functions, which migrate data from the upgrade tables into their new locations and perform other data manipulations.
  9. Use Force only when required – Upgrade codeunits should provide explicit instructions for each table on how to handle data changes during schema synchronisation. These instructions are defined in TableSyncSetup functions. Synchronising the schema with the Force option should only be used in rare cases (e.g. when deleting upgrade tables).
  10. Delete obsolete tables – One of the last data upgrade steps is to delete the upgrade objects. In addition, we should delete all old, unused tables that were brought across from previous versions, to keep the upgraded database tidy.
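To make rules 8 and 9 more concrete, here is a minimal C/AL sketch of an upgrade codeunit. All object numbers, table names and field names are hypothetical, and the properties (Subtype = Upgrade on the codeunit; FunctionType = TableSyncSetup or UpgradePerCompany on the functions) are set in the object designer rather than in code:

```
// Hypothetical upgrade codeunit (Subtype = Upgrade). Object and field names
// are illustrative only; your tables will differ.

// FunctionType = TableSyncSetup: instruct schema synchronisation to copy the
// data of the changed table into an upgrade table instead of deleting it.
PROCEDURE SyncMyCustomTable(VAR TableSynchSetup : Record "Table Synch. Setup");
VAR
  DataUpgradeMgt : Codeunit "Data Upgrade Mgt.";
BEGIN
  DataUpgradeMgt.SetTableSyncSetup(
    DATABASE::"My Custom Table",
    DATABASE::"My Custom Table Upg",
    TableSynchSetup."Upgrade Method"::Copy);
END;

// FunctionType = UpgradePerCompany: migrate the saved data from the upgrade
// table into the new schema, then leave the upgrade table ready for deletion.
PROCEDURE UpgradeMyCustomTable();
VAR
  MyCustomTable : Record "My Custom Table";
  MyCustomTableUpg : Record "My Custom Table Upg";
BEGIN
  IF MyCustomTableUpg.FINDSET THEN
    REPEAT
      IF MyCustomTable.GET(MyCustomTableUpg."No.") THEN BEGIN
        MyCustomTable."New Field" := MyCustomTableUpg."Old Field";
        MyCustomTable.MODIFY;
      END;
    UNTIL MyCustomTableUpg.NEXT = 0;
END;
```

The TableSyncSetup function runs at schema synchronisation time, while the upgrade function runs later, when the data upgrade is started; keeping the two concerns separate is what makes a forced synchronisation unnecessary in most cases.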

Microsoft used to release a new version of NAV every few years, and the best practice for customers was to upgrade every second release. This has changed dramatically since Microsoft moved to annual version releases and monthly cumulative updates. To leverage new functionality, NAV customers have to re-think upgrades and shift to frequent upgrade and update cycles.

Fortunately, upgrading is no longer the time-consuming and painful process it was in the past. Microsoft has enhanced the power and flexibility of its tools to help us do upgrades more efficiently. The Fenwick Software team is highly specialised in Dynamics NAV upgrades. We follow the latest upgrade best practices and use automated tools that have saved our clients significant amounts of time and money. Get in touch if you would like more information about NAV 2017 or a free quote for your NAV upgrade project.



Using Query Object to Improve NAV Performance

The aim of this blog is to introduce the Query object and to provide an overview of different scenarios where queries can be used to improve NAV performance.

Query Object type was introduced back in NAV 2013. It gives us the opportunity to define and call T-SQL SELECT statements directly from NAV. Using queries can bring many performance benefits. Despite that, they are still rarely used.

One of the most common operations in a relational database is joining two or more tables. The join operation has traditionally been done in C/AL by record looping. Queries allow us to produce a dataset that is the result of a join operation between two or more tables.

The main advantages of queries are:

  • One SQL request – Queries allow NAV to retrieve all the data in one request.
  • Support for FlowFields – A sub-query is automatically added to the SQL statement to retrieve each FlowField in a query.
  • Totals and Group by – Queries naturally support totals and grouping of data. What has traditionally been done with temporary tables can now be replaced by simple queries. In addition, NAV Server will automatically use SIFT indexes to optimise the calculation of totals where possible.
  • Limiting the dataset – Queries enable us to select only the columns we need in the result (not the entire record), and we can limit the number of rows by specifying filters.
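As a C/AL sketch of these advantages, suppose we have a hypothetical query object "Cust. Sales Totals" that joins Customer to Cust. Ledger Entry and sums Sales (LCY) per customer; the query name and column names below are illustrative, not standard objects:

```
PROCEDURE ShowTopCustomerTotals();
VAR
  CustSalesTotals : Query "Cust. Sales Totals"; // hypothetical query object
BEGIN
  // Limit the dataset: filters are applied before the SQL statement runs.
  CustSalesTotals.SETRANGE(Posting_Date,20170101D,20171231D);
  CustSalesTotals.TOPNUMBEROFROWS(10);
  CustSalesTotals.OPEN;
  // One SQL request returns the whole grouped dataset; READ just walks it.
  WHILE CustSalesTotals.READ DO
    MESSAGE('%1: %2',CustSalesTotals.Customer_No,CustSalesTotals.Sum_Sales_LCY);
  CustSalesTotals.CLOSE;
END;
```

Compare this with the traditional pattern of looping Customer records and calling CALCFIELDS per row, which issues one SQL request per customer.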

Here are some useful examples of how queries can be used in NAV:

  • Query as a page data source – To display the result of a query on a NAV page, you have to copy the query result into a temporary table that is selected as the page’s source table. This design approach can be seen on the standard NAV Page 9126 “Lot Numbers by Bin FactBox”.
  • Query as a report data source – Reports, like pages, cannot natively use a query as a data source. As a workaround, we use an Integer table to loop through the query dataset and store the results in variables, which then feed the report dataset. Standard NAV Report 19 “VAT- VIES Declaration Tax Auth” follows this approach.
  • Query in C/AL functions – In standard NAV 2017, queries improve performance in more than 40 functions. They are used to calculate amounts (e.g. Customer Overdue Amounts on the Customer Statistics page), to detect duplicate records, and to replace nested loops (e.g. when matching bank reconciliation lines).
  • Exposing data as an OData web service – You can register and publish a query as a web service in the same way that you can register and publish pages or codeunits. After you expose a query as a web service, you can import it into other applications, such as Power BI.
  • Saving a query as an .xml or .csv file – You can use the SAVEASXML function to create an .xml file that contains the resulting dataset of a query. You can use the .xml file to integrate with external applications.
  • Creating charts – NAV charts can be based on a query instead of a table.
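For the file-export scenario above, a single call on an open query instance is enough. Assuming a query variable CustSalesTotals (a hypothetical query object) and illustrative file paths:

```
// Save the query result set as XML for integration with external applications.
CustSalesTotals.SAVEASXML('C:\Temp\CustSalesTotals.xml');
// Or as comma-separated values, e.g. for a quick look in Excel.
CustSalesTotals.SAVEASCSV('C:\Temp\CustSalesTotals.csv');
```

Both functions run the query with its current filters, so the exported file contains exactly the dataset the code would otherwise read row by row.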

A potential downside of using queries is that NAV does not cache their result sets. When you run a query, NAV always gets the data directly from SQL Server. This limitation should be carefully considered when you design solutions with queries. As a general recommendation, you should always measure the performance of your solution before and after the change. You can monitor how your application code performs with the Dynamics NAV Application Profiler tool. This aside, queries are very useful and can provide many benefits.



Introducing Power BI

Power BI is a Business Intelligence solution provided by Microsoft. It consists of a Windows desktop application called Power BI Desktop; an online Software as a Service (SaaS) offering called the Power BI service; and mobile Power BI apps available on Windows, iOS and Android devices.

Power BI is an evolution of the add-ins previously available in Excel: Power Pivot, Power Query and Power View. You can use Power BI with or without Excel which means that you are no longer dependent on the version of Microsoft Office installed at your company.

Power BI provides connectors that make it possible to easily access a whole range of data sources, including on-premises databases, Excel workbooks, and over 59 cloud services. In addition, Power BI comes with Content Packs: pre-built reports and dashboards for popular cloud services such as Google Analytics, Salesforce, MailChimp or Zendesk.

If you are a Dynamics NAV user, you can connect to the Power BI Content Pack for Dynamics NAV and start exploring your own data from an out-of-the-box dashboard showcasing important metrics on sales, monthly goals, revenues vs. expenses and more.

Power BI provides interactive visualizations with self-service business intelligence capabilities, where end users can create reports and dashboards themselves, instead of depending on technical staff.

With Power BI, you can analyse your data by asking questions in plain English; no special code or syntax is required. This feature is called natural-language queries, and with it you can ask Power BI to perform tasks in much the same way you would ask one of your colleagues, for example: “Show sales 2015 by brand”.

Another feature worth mentioning is Quick Insights. Power BI can search a dataset for interesting patterns and provide you with a list of charts to help you better understand your data. Under the hood, it applies sophisticated algorithms whose speed depends on the size and complexity of the dataset.

You can create a Power BI Group Workspace to collaborate with your colleagues on dashboards, reports, and datasets. When the reports and dashboards are ready to be distributed, you can:

  • share them via Power BI Service or Power BI mobile
  • create an Organisational Content Pack (a collection of reports) and publish the reports
  • publish a dashboard to the web (if intended for public consumption)

You can sign up for Power BI through the Power BI website. You can use Power BI for free, or you can buy a Power BI Pro licence, which includes additional data refresh and collaboration capabilities.



Business Process Model & Notation (BPMN)

Process models describe how businesses conduct their operations. They typically include graphical depictions of at least the activities, events, and control-flow logic that constitute a business process. Process models are designed using so-called process modelling languages (sometimes called notations or techniques), i.e. sets of graphical constructs and rules on how to combine them.

Business Process Model and Notation (BPMN) is the world-renowned standard for process modelling and one of the most important components of successful business-IT alignment. The primary goal of BPMN is to provide a notation that is readily understandable by all business users, from the business analysts who create the initial drafts of the processes, to the technical developers responsible for implementing the technology that will perform those processes, and finally, to the business people who will manage and monitor those processes. Thus, BPMN creates a standardized bridge for the gap between the business process design and process implementation.

BPMN provides a notation that is readily understandable by all business users.

BPMN is not owned by an individual enterprise but by an institution, the Object Management Group (OMG), which is already well established through other worldwide standards, e.g. UML. The standard is supported by many software products and widely adopted by modellers and IT vendors.

BPMN’s unique capabilities come from ways in which it differs from traditional flowcharting:

  • Formal Specification – BPMN is based on a formal specification. It has rules that govern the use of each shape and what may connect to what. Thus, a BPMN model can be validated, and many BPMN tools can do that in one click of the mouse
  • Events – An event is “something that happens” while the process is underway. BPMN lets you visualise event-trigger behaviour in the diagram itself (e.g. customer calls to change the order, a system is down, etc.)
  • Collaboration – BPMN describes how the process fits into its wider environment. Communications between the process and external entities, such as the customer or external service providers, are represented by a dashed connector called a message flow

BPMN provides businesses with the capability of understanding their internal business procedures in a graphical notation and gives organizations the ability to communicate these procedures in a standard manner. In addition, it provides a standard XML interchange format for the transfer of process and interaction models, and detailed visual information, between modelling tools.
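To illustrate that interchange format, a trivial one-task process serialises to BPMN 2.0 XML along these lines (the IDs, names and target namespace are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             id="defs1" targetNamespace="http://example.com/bpmn">
  <process id="OrderProcess" isExecutable="false">
    <startEvent id="start1" name="Order received"/>
    <task id="task1" name="Check credit"/>
    <endEvent id="end1" name="Order approved"/>
    <!-- Sequence flows carry the control-flow logic between the shapes -->
    <sequenceFlow id="flow1" sourceRef="start1" targetRef="task1"/>
    <sequenceFlow id="flow2" sourceRef="task1" targetRef="end1"/>
  </process>
</definitions>
```

Because the elements and their allowed connections are defined by the OMG schema, any conformant modelling tool can import this fragment and redraw the same diagram.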

Empowering business people to manage their own processes is critical to the evolution of Business Process Management (BPM), and BPMN provides that – not the executable code, but the precise flow logic that code would have to implement. BPMN can take your BPM initiative all the way to the execution stage of the process life cycle.



The Skills Framework for the Information Age (SFIA)

For professionals working in the fast-changing world of Information Technology (IT), the challenge is to maintain up-to-date skills and knowledge while also preparing for career roles that may not yet exist. The Skills Framework for the Information Age (SFIA) navigates this constant technological change and enables individuals to determine their current skills profile; identify possible career roles; and recognise development areas to focus on to help achieve career goals.

The SFIA Foundation released the sixth version of SFIA on 1 July 2015. SFIA provides a common reference model in a two-dimensional framework, with skills on one axis and seven levels of responsibility on the other. The complete reference guide describes the 97 skills in detail. Each skill comprises an overall definition and descriptions of the levels of responsibility at which the skill might be exercised.

Organisations can use SFIA for overall resource management. The SFIA framework can be incorporated into job descriptions, profiles of specific roles at various levels, corresponding expected mindsets, and training requirements. During recruitment, SFIA helps employers accurately describe what they need, in a language that potential employees understand. Education bodies, universities, colleges and training providers can map their offerings to SFIA, to ensure the most appropriate courses and certifications are selected for individuals, providing the knowledge they need. Professional bodies and membership organisations can map SFIA to their membership levels, certifications, professional development and mentoring programmes. SFIA is also used to identify suitable mentors, supporting knowledge sharing and coaching activities.

There are many software products on the market built on the SFIA framework. One of them is mySFIA, which is available to members of the Australian Computer Society (ACS). mySFIA supports the following:

  • Self-assessment (the tool guides users through 96 pre-defined skills and asks them to classify their proficiency in each skill).
  • Career planning (users can select predefined occupation roles, such as Business Systems Analyst; compare their current skills with the skill profile of the selected occupation; and identify gaps and potential professional development areas to focus on).
  • Accredited assessment (an individual’s self-assessed skills profile can be independently validated by the ACS).

For professionals in the Dynamics industry, Microsoft Dynamics Sure Step (Sure Step) provides an overview of the consulting and customer roles typically involved in implementation projects. Each Sure Step role includes detailed information on the knowledge and expertise needed to perform its tasks (e.g. experience in web services). SFIA, on the other hand, does not describe any technology-specific skills or qualifications.

Both frameworks, SFIA and Sure Step, have their own unique advantages. Sure Step is specific and brings value to Dynamics businesses, while SFIA is applicable across industries and provides a de facto global standard.
