Microsoft Dynamics NAV

by Dejan Pajk

Automated Testing of Dynamics NAV Enhancements

In last week’s post I discussed the Dynamics NAV test automation suite. Running Microsoft’s standard automated tests ensures that your modifications and add-on solutions do not interfere with vanilla NAV. However, all non-standard/custom business logic needs to be tested as well.

Traditionally, testing takes place in a test environment using sample customer data. During manual testing, application consultants and key users follow prebuilt test scripts to test data validation and new functionality. The goal of automation is to reduce the number of test cases that have to be run manually, not to eliminate manual testing altogether.

Creating an automated test is more time consuming and expensive than running it once manually. What to automate, when to automate, or even whether one really needs automation are decisions the testing or development team must make. A good candidate for test automation is a test case that is executed repeatedly, is business critical or high-risk, is difficult to perform manually, or is time consuming. Automating unstable features, or features that are still undergoing change, should be avoided. The choice between automated and manual testing can be made with a basic cost-benefit analysis: if building, verifying and documenting an automated test takes five times as long as a single manual run, then once the test has been run more than five times, every subsequent run is essentially free.

Test automation should be seen as a long-term investment in the quality of your systems. The good news is that we can build custom automated tests on top of the Microsoft test toolkit. There are over 60 standard libraries containing numerous generic and application-specific functions that can be reused when building custom tests.

The following best practices should be followed when designing and developing custom automated tests:

  • Tests should follow the Given-When-Then style (see the sketch after this list). This approach was developed by Dan North and Chris Matts as part of Behaviour-Driven Development (BDD).
  • Tests should not depend on the data in the test database. Data should be created on the fly using random generator functions, and hardcoded values should be avoided.
  • Tests should leave the system in the same state as when the test started. You can use the TransactionModel property in test functions and the TestIsolation property in test runner codeunits to control the transactional behaviour. A common scenario is that each test function starts a separate write transaction and all the changes are rolled back after each test codeunit.
  • Test code should test that code works under successful and failing conditions (positive and negative tests). To test failing conditions, you can use the ASSERTERROR keyword.
  • Test code should be kept separate from the code being tested. That way, you can release the tested code to a production environment without releasing the test code.
  • Tests should not require user intervention. Special handler functions should be created to handle all UI interactions.
  • Code Coverage should be monitored to track the extent to which the application code is covered by tests.
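
As a minimal sketch of several of these practices in C/AL (Given-When-Then comments, data created on the fly, automatic rollback and a negative ASSERTERROR test), consider a hypothetical test function like the one below. It assumes the standard Library - Sales and Assert codeunits from the Microsoft test toolkit; the function name and the expected error text are illustrative assumptions, not standard objects.

    [Test]
    [TransactionModel(AutoRollback)]
    PROCEDURE BlockedCustomerCannotBeUsedOnOrder();
    VAR
      LibrarySales : Codeunit "Library - Sales";
      Assert : Codeunit Assert;
      Customer : Record Customer;
      SalesHeader : Record "Sales Header";
    BEGIN
      // [GIVEN] A customer created on the fly, blocked for all transactions
      LibrarySales.CreateCustomer(Customer);
      Customer.VALIDATE(Blocked,Customer.Blocked::All);
      Customer.MODIFY(TRUE);

      // [WHEN] The blocked customer is entered on a new sales order
      SalesHeader.INIT;
      SalesHeader.VALIDATE("Document Type",SalesHeader."Document Type"::Order);
      ASSERTERROR SalesHeader.VALIDATE("Sell-to Customer No.",Customer."No.");

      // [THEN] Validation fails with the expected "blocked" error
      Assert.ExpectedError('blocked');
    END;

Because the function runs with TransactionModel(AutoRollback), neither the customer nor the order persists, so the test leaves the system in the same state as when it started.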

There are third-party GUI capture/replay tools available that track user interactions with the product and build a script from them. It would be great to have such a tool integrated into standard NAV, as this would enable users to record new test scripts without any development required.

Testing is an iterative process. Automated tests should be executed every time an enhancement is made in the application (regression testing). During development it is impractical to wait hours for the results of a full test run, which is why the Dynamics NAV Test Tool offers the option to run only the automated tests for modified or selected objects (based on test coverage maps). Running automated tests frequently ensures that new development doesn’t break existing functionality. With a combination of Microsoft’s default testing suite and tests developed specifically for your organisation, new implementations and enhancements can go live with minimal bugs.

Link to the original article: https://www.fenwicksoftware.com.au/blog/automated-testing-of-dynamics-nav-enhancements

by Dejan Pajk

Dynamics NAV Test Automation Suite

The increased complexity of systems and short product release cycles make the task of testing challenging. One of the key problems is that testing typically comes late in the project, and traditional testing is performed manually. Testing represents a major cost in every Dynamics NAV implementation project. Test automation has been proposed as one solution to reduce these costs. Test automation suites promise to increase the number of tests we run and the frequency at which we run them.

In this post I will introduce the Microsoft test toolset; next week we will consider automated testing of non-standard/custom business logic, i.e. the code that has been added to provide enhancements and modifications to standard NAV.

With the Dynamics NAV 2016 release, Microsoft made its test toolset a standard part of the product. It is an exciting addition to test codeunits, which have been around since NAV 2009. Microsoft is putting significant effort into its automated testing framework: with every version release and monthly cumulative update, new tests are introduced. Over the last five NAV cumulative updates alone, the number of automated tests has increased by more than 400.

There are unquestionable benefits in utilising automated testing. In my 10-year history with NAV implementations I’ve noticed that testing is too often seen as a “nice-to-have” instead of a “must-have” activity. Application consultants and key users share the same responsibility to test their solutions carefully. Tight deadlines, budget restrictions and busy daily schedules too often limit testing efforts to only the critical scenarios, and the rest is left to be discovered when the system is already in production.

Introducing automated testing can change this significantly. Automated tests can be run quickly and repeatedly. For example, it takes less than 3 hours to run over 18,000 tests on my laptop. If each test took just one minute to perform manually, the same suite would consume 18,000 minutes, or 300 hours: around 40 man-days of effort.

“Nothing is holding you back anymore from incorporating this tool in your product development, and even custom projects. It’s only up to you to decide to what extent you are going to do this.” (Luc van Vugt, Microsoft MVP and former Microsoft tester)

The Dynamics NAV test suite brings repeatability, reduces man-hours and improves efficiency. It is a critical enabler for continuous delivery of quality software.

Running Microsoft’s standard automated tests, however, only makes sure that custom modifications and add-on solutions do not interfere with standard NAV; in other words, it confirms that existing functionality still works as it should. All non-standard/custom business logic needs to be tested as well. Next week I’ll discuss the benefits and options for automated testing of these enhancements and modifications.

Link to the original article: https://www.fenwicksoftware.com.au/blog/dynamics-nav-test-automation-suite

by Dejan Pajk

Ten Rules for a Successful Dynamics NAV Data Upgrade

Since I joined Fenwick Software in Oct 2015, I’ve completed five Dynamics NAV upgrade projects. Such projects typically start with the code (application) upgrade, which is followed by the data upgrade. In this blog I’m sharing my take on the data upgrade process. Here are the top 10 rules that have made my upgrade projects successful:

  1. Follow the official script – The latest data upgrade script can be retrieved from the Microsoft MSDN website (e.g. Upgrading the Data to Dynamics NAV 2017).
  2. Understand the steps – Following the right upgrade steps is a necessity; however, we also need a good understanding of what is happening in NAV, so we can properly verify the result of each step and resolve errors that might occur during the upgrade. Blindly following the steps is a risk to the upgrade project.
  3. Take notes – I always start my upgrade by putting all the upgrade steps in OneNote. This allows me to track details against them (e.g. comments, statuses and durations), to add custom upgrade steps (e.g. post-upgrade setup) and to share the Notebook with the project team. This might seem overkill to some; however, there are many good reasons for keeping a detailed record. The main idea is to make the upgrade process as repeatable and predictable as possible and to eliminate any surprises that might occur during the cutover.
  4. Measure durations – Measuring the duration of each upgrade step is especially recommended when upgrading a large database. The information gained will not only help us prepare a precise cutover plan; it is also a key input for upgrade optimisation. NAV offers standard PowerShell cmdlets (e.g. Get-NAVDataUpgrade) that return the status and duration of all upgrade functions invoked during the upgrade process.
  5. Optimise – If the total upgrade time exceeds the customer’s acceptable downtime window, we have to find innovative ways to optimise the upgrade. Strategies range from replacing upgrade functions with T-SQL scripts and deleting obsolete data before the upgrade (e.g. Change Log and Archive tables) to improving server performance by adding memory or faster drives. We encountered an interesting challenge on our latest upgrade project, where the data upgrade took more than 3 days to complete. After measuring durations, we discovered that the bottleneck was the upgrade function that renames and updates User IDs. A simple optimisation cut the duration of this function from 68 hours to 5 hours.
  6. Perform backups – We recommend that a full database backup be done after every major data upgrade step. Backup compression should be enabled to increase speed and save disk space. In addition, the database recovery model can be changed to Simple to free up log space. With this setup we lose the ability to do a point-in-time recovery, so the most recent restore point will be the latest complete backup.
  7. Take the latest CU – Let’s say we are upgrading from NAV 2009 to NAV 2017. It is necessary to first upgrade the data to NAV 2013 or NAV 2015. My recommendation is to go via NAV 2015, simply because it is a more mature version (e.g. it contains upgrade codeunits and improved schema synchronisation). In addition, we should always take the latest cumulative update (CU) available, because it potentially contains upgrade toolkit bug-fixes.
  8. Use Upgrade Codeunits – Upgrade codeunits were introduced with NAV 2015. Understanding and utilising them is by far the most important data upgrade skill. Firstly, upgrade codeunits provide instructions for synchronising schema changes (e.g. to migrate data into upgrade tables); secondly, they contain upgrade functions, which migrate data from the upgrade tables into the new locations and also perform other data manipulations (see the sketch after this list).
  9. Use Force only when required – Upgrade codeunits should provide explicit instructions for each table on how to handle data changes during schema synchronisation. These instructions are defined in TableSyncSetup functions. Synchronising the schema with the Force option should only be used in rare cases (e.g. when deleting upgrade tables).
  10. Delete obsolete tables – One of the last data upgrade steps is to delete the upgrade objects. In addition, we should delete all old unused tables brought across from the previous version, to keep the upgraded database tidy and clean.
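
To illustrate rules 8 and 9, here is a minimal sketch of an upgrade codeunit (Subtype = Upgrade). The object and table numbers are illustrative assumptions. The TableSyncSetup function instructs schema synchronisation to move data into an upgrade table, and the upgrade function later migrates it to its new location:

    [TableSyncSetup]
    PROCEDURE GetTableSyncSetup(VAR TableSynchSetup : Record "Table Synch. Setup");
    VAR
      DataUpgradeMgt : Codeunit "Data Upgrade Mgt.";
    BEGIN
      // Move table 50000 into upgrade table 50100 during schema synchronisation,
      // rather than forcing the change (Force would discard the data).
      DataUpgradeMgt.SetTableSyncSetup(50000,50100,TableSynchSetup.Mode::Move);
    END;

    [UpgradePerCompany]
    PROCEDURE MigrateMyCustomData();
    VAR
      UpgRec : Record 50100;  // hypothetical upgrade table
      NewRec : Record 50000;  // hypothetical redesigned table
    BEGIN
      // Copy each row from the upgrade table into its new location, then
      // empty the upgrade table so it can be deleted later (rule 10).
      IF UpgRec.FINDSET THEN
        REPEAT
          NewRec.INIT;
          NewRec.TRANSFERFIELDS(UpgRec);
          NewRec.INSERT;
        UNTIL UpgRec.NEXT = 0;
      UpgRec.DELETEALL;
    END;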

Microsoft used to release a new version of NAV every few years, and the best practice for customers was to upgrade every second release. This has changed dramatically since Microsoft moved to annual version releases and monthly cumulative updates. To leverage new functionality, NAV customers have to re-think upgrades and shift to frequent upgrade and update cycles.

The good news is that upgrading is no longer the time-consuming and painful process it was in the past. Microsoft has enhanced the power and flexibility of the tools it provides to help us do upgrades more efficiently. The Fenwick Software team is highly specialised in Dynamics NAV upgrades. We follow the latest upgrade best practices and use automated tools, which have saved our clients significant amounts of time and money. Get in touch if you would like more information about NAV 2017 or a free quote for your NAV upgrade project.

Link to the original article: https://www.fenwicksoftware.com.au/blog/ten-rules-for-a-successful-dynamics-nav-data-upgrade

by Dejan Pajk

Using Query Object to Improve NAV Performance

The aim of this blog is to introduce the Query object and to provide an overview of the different scenarios where queries can be used to improve NAV performance.

The Query object type was introduced back in NAV 2013. It gives us the opportunity to define and call T-SQL SELECT statements directly from NAV. Using queries can bring many performance benefits; despite that, they are still rarely used.

One of the most common operations in a relational database is joining two or more tables. In C/AL, the join operation has traditionally been done by looping over records. Queries instead let us produce a dataset that is the result of a join between two or more tables.
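
As a minimal sketch, here is how such a query might be consumed from C/AL. Query "Cust. Ledger Totals" is a hypothetical custom query that groups Cust. Ledger Entry by Customer No. and totals Amount; the query name and column names are assumptions.

    VAR
      CustLedgTotals : Query "Cust. Ledger Totals";
    BEGIN
      // One SQL request: filter, open and read the joined, grouped result set.
      CustLedgTotals.SETRANGE(Posting_Date,CALCDATE('<-1Y>',WORKDATE),WORKDATE);
      CustLedgTotals.OPEN;
      WHILE CustLedgTotals.READ DO
        // Each READ returns one aggregated row: a customer and its total.
        MESSAGE('%1: %2',CustLedgTotals.Customer_No,CustLedgTotals.Sum_Amount);
      CustLedgTotals.CLOSE;
    END;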

The main advantages of queries are:

  • One SQL request – Queries allow NAV to retrieve all the data in one request.
  • Support for FlowFields – A sub-query is automatically added to the SQL statement to retrieve each FlowField in a query.
  • Totals and Group by – Queries naturally support totals and grouping of data. What has traditionally been done with temporary tables can now be replaced by simple queries. In addition, NAV Server will automatically use SIFT indexes to optimise the calculation of totals where possible.
  • Limiting the dataset – Queries enable us to select only the columns we need in the result (not the entire record), and we can limit the number of rows by specifying filters.

Here are some useful examples of how queries can be used in NAV:

  • Query as a page data source – To display the result of a query on a NAV page, you have to copy the query result into a temporary table that is selected as the page’s source table. This design approach can be seen on the standard NAV Page 9126 “Lot Numbers by Bin FactBox”.
  • Query as a report data source – Reports, like pages, cannot natively use a query as a data source. As a workaround, we must use an Integer table to loop through the query dataset and store the result in variables, which are used as the report dataset. Standard NAV Report 19 “VAT- VIES Declaration Tax Auth” is designed this way (see the sketch after this list).
  • Query in C/AL functions – In standard NAV 2017, queries improve performance in more than 40 functions. Queries are used to calculate amounts (e.g. Customer Overdue Amounts on the Customer Statistics page), to detect duplicate records, to replace nested loops (e.g. when matching bank reconciliation lines) etc.
  • Exposing data as an OData web service – You can register and publish a query as a web service in the same way that you can register and publish pages or codeunits. After you expose a query as a web service, you can import it into other applications, such as Power BI.
  • Saving a query as an .xml or .csv file – You can use the SAVEASXML function to create an .xml file that contains the resulting dataset of a query. You can use the .xml file to integrate with external applications.
  • Creating charts – NAV charts can be based on a query instead of a table.
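
Here is a minimal sketch of that Integer-loop pattern, reusing the hypothetical Cust. Ledger Totals query from the earlier example (the variable names are assumptions):

    // DataItem: Integer, with DataItemTableView = SORTING(Number)

    Integer - OnPreDataItem()
    BEGIN
      CustLedgTotals.OPEN;
    END;

    Integer - OnAfterGetRecord()
    BEGIN
      // Advance the query by one row for each Integer record and
      // stop the data item when the query runs out of rows.
      IF NOT CustLedgTotals.READ THEN
        CurrReport.BREAK;
      CustomerNo := CustLedgTotals.Customer_No;
      TotalAmount := CustLedgTotals.Sum_Amount;
    END;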

A potential downside of using queries is that NAV does not cache their result sets. When you run a query, NAV always fetches the data directly from SQL Server. This limitation should be carefully considered when you design solutions with queries. As a general recommendation, always measure the performance of your solution before and after the change; you can monitor how your application code performs with the Dynamics NAV Application Profiler tool. That aside, queries are very useful and can provide many benefits.

Link to the original article: https://www.fenwicksoftware.com.au/blog/using-query-object-to-improve-nav-performance

by Dejan Pajk

Fit/Gap Analysis

ERP systems such as Microsoft Dynamics NAV support the standard processes of an organisation. They integrate best practices that provide an effective and validated way to perform business operations. To select and successfully implement an ERP system, the capabilities of that system have to be compared with a company’s business needs. Based on this comparison, all of the fits and gaps must be identified and analysed further. This step usually forms part of ERP implementation methodologies and is called fit gap analysis.

Fit gap analysis (also called gap analysis) is an important phase of an ERP selection and implementation. It determines the extent of business process change required for a particular solution as well as software customisation and interfacing requirements. Matching the ERP’s functionality to the way an organisation does business is a vital factor for the success of an ERP implementation.

Fit gap analysis appears twice in ERP implementation methodologies: a high-level fit gap analysis is usually conducted in the pre-implementation or ERP system selection phase, and a complete, detailed fit gap analysis is performed during the system analysis phase. Fit gap analysis is usually built on a Functional Requirements Document (FRD), which provides a detailed list of the functional, non-functional, integration and interface requirements and the affected business process flows.

The main goal of fit gap analysis is to identify and document all fits and gaps, followed by an analysis of each gap, suggestions of possible solutions and closing of the gaps by selecting the most appropriate option. Possible options are:

  1. Standard ERP system capability: Standard capabilities are met by the system out-of-the-box, without requiring additional effort or configuration time.
  2. ERP system configuration: This entails choosing from among the reference processes and setting the parameters in ERP to reflect organisational features without changing the ERP source code.
  3. Adapting a company’s business processes: Best practices implemented in the software package are applied within an organisation (technology-driven approach).
  4. Adapting an ERP system through customisation: Adaptation of an ERP system is appropriate for those companies that believe their business processes are better than those implemented in an ERP system and do not want to lose their competitive advantage.
  5. Adding a new business solution and/or software vendor: A third-party vendor (independent software vendor, ISV) can provide a vertical or industry solution that supports a customer’s specific needs. For example, the enwis) waste management solution.
  6. Providing a workaround so that the business can function: Business needs will not be supported and will be documented as out of scope or planned for future releases.

Fit gap analysis has important consequences for project success. One of its output metrics is the degree of fit, an indicator of business alignment with standard ERP system functionality. It is calculated as the number of business needs that fit divided by the total number of business needs. Besides standard ERP system capabilities, the business needs categorised as fits include ERP system configurations and adaptations of a company’s business processes. In addition, each business need should be weighted by its importance (e.g. nice-to-have or critical), as sketched below.
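
In symbols, a weighted degree of fit might be expressed as follows (the notation is mine; the unweighted case simply sets every weight to 1):

    % w_i is the importance weight of business need i; F is the subset of
    % all business needs N that are classified as fits (set every w_i = 1
    % to recover the simple ratio described above).
    \text{Degree of fit} = \frac{\sum_{i \in F} w_i}{\sum_{i \in N} w_i}

For example, if 90 of 100 equally weighted business needs are classified as fits, the degree of fit is 90 per cent.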

Even though the metric is only an estimate, it provides a high-level overview and understanding of the project risk. A low degree of fit could also lead to a decision to select another ERP system or to redesign business processes to match the standard system.

Link to the original article: https://www.fenwicksoftware.com.au/blog/fitgap-analysis
