This repository contains steps that proved to make the data upgrade process faster in production. The main goal was to be able to start testing the upgraded database as soon as possible to make a go-live or no-go-live decision before starting a working week.
If starting fresh with only open balances is an option for you, by all means do that. Otherwise, I hope you find some useful ideas here.
The main idea behind the performance improvement is moving some long-running C/AL functions to T-SQL and keeping table indexes disabled until the last moment, then creating them directly in SQL while the system is already available for final pre-go-live testing (a so-called 'soft launch'). We also moved the document Dimension tables out of the upgrade toolkit's scope by moving their data to other tables and doing the final processing in SQL.
The steps and scripts are provided as is, without guarantees, and should be treated as an example. Please ensure you have a team member who is familiar with MS SQL Server and T-SQL if you choose to follow the steps. There will probably be extra steps specific to your company.
- Python 3 installed
- NAV 2009 R2 Classic client installed
- NAV 2013 Dev client installed
- NAV 2015 Dev client and service installed
- Dynamics 365 Business Central (v. 14) Dev environment, service, and clients installed
- SQL Server installed
- SQL Server Management Studio installed for running the T-SQL scripts
- Take the production NAV 2009 database offline so that no one can access it. We should be able to bring it back online and continue working in case the upgrade doesn't go as expected.
- Take the production database backup (or create volumes snapshots).
- Restore the database (from the snapshots) on the upgrade environment and attach the database.
- Ensure the log volume is twice as large as the data file(s).
- If the database is bigger than 1 TB, create an additional log file to work around the 2 TB size limit of a single log file.
- Ensure the compatibility level of the NAV 2009 database is `SQL Server 2008 (100)`.
- Ensure the recovery model is set to Simple.
- Using NAV 2009 R2 Classic client and developer license, open the company.
- Check that no objects are locked in the environment and force-unlock any that are.
- Export tables filtered with `..2000000000` by ID into a TXT file.
- Copy the entire scripts folder and, using the CLI, run the index disabler script disable_indexes.py, specifying the exported text file as `--original` (`-o`).
- Upload the result file into the database and compile the tables to drop the SQL indexes on the SQL Server side.
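The actual disable_indexes.py lives in the scripts folder; as a rough sketch of the idea (assuming the usual NAV object text export layout, and ignoring edge cases the real script handles), the transformation amounts to marking every non-clustered key with `MaintainSQLIndex=No` so the corresponding SQL index is dropped when the table is compiled:

```python
def disable_sql_indexes(text: str) -> str:
    """Toy sketch of disable_indexes.py: within each KEYS section of a
    NAV object text export, add MaintainSQLIndex=No to every key that
    is not the clustered key and does not set the property already."""
    out, in_keys = [], False
    for line in text.splitlines():
        s = line.strip()
        if s == 'KEYS':
            in_keys = True
        elif in_keys and s == '}':
            in_keys = False          # end of the KEYS block
        elif (in_keys and s.startswith('{') and s.endswith('}')
              and 'Clustered=Yes' not in s and 'MaintainSQLIndex' not in s):
            # splice the property in before the closing brace
            line = line.rstrip()[:-1].rstrip() + ' ;MaintainSQLIndex=No }'
        out.append(line)
    return '\n'.join(out)
```

Importing the rewritten file and compiling the tables is what actually drops the indexes on the SQL Server side.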
- Export tables filtered with `<50000` from the database into NAV_2009_Standard_Customized_Tables.txt (an example name; choose something clear) for a later step.
- Run upgrade6080-step1.1.sql against the database.
- Import Upgrade601800.1.MODIFIED.fob replacing the existing tables (no schema changes, code on all triggers removed).
- Export permissions:
  - Run Dataport `104001 - Import/Export Roles`
  - Run Dataport `104002 - Import/Export Permissions`
- In Object Designer, run table 2000000004 User Role and delete all records matching the filter `<>SUPER` on Role ID.
- In Object Designer, via Tools -> Security -> Windows Logins, add your user with the SUPER role.
- Run upgrade6080-step1.0.sql against the database.
- Open Form `104001 Upgrade - Old Version`, press Transfer Data, then Delete Objects.
- Open the NAV 2009 DB with the NAV 2013 Development Environment to convert the database (this takes long; with indexes enabled, it would take ages).
- Make a backup in the fastest way (copy database files).
- Open the DB with the NAV 2015 Development Environment as Administrator.
- Compile the tables in the `2000000004..2000000130` range with Synchronize Schema `Later`.
- Run upgrade6080-step1.2.sql against the database.
- Upload the development license.
- Make sure the user that runs the NAV 2015 service `DynamicsNAV80` has `db_owner` access to the DB.
- Ensure the Microsoft Dynamics NAV Server (NAV 2015) instance points to the database.
- Ensure the NAV 2015 instance's `SQLCommandTimeout` is set to 23:00:00.
- Start the NAV 2015 service.
- Run the Administration Shell as Administrator and execute `Sync-NavTenant -ServerInstance DynamicsNAV80 -Mode Sync` to initialize the database and synchronize all tables with validation.
- Import the file with all Microsoft Dynamics NAV 2015 objects from the Cronus DB with Replace All and Sync Schema `Later`.
- Run add_custom_fields_without_code.py to add the custom fields without code to the NAV 2015 tables (so that the custom and regional data is kept when the tables are synced), with the following parameters (the date format in the objects is dd/mm/yy; it might differ for your region):
  - NAV_2009_Standard_Tables.txt as `-o` (`--original`)
  - NAV_2009_Standard_Customized_Tables.txt (exported earlier) as `-m` (`--modified`)
  - NAV_2015_Standard_Tables.txt as `-t` (`--target`)
  - NAV_2015_Standard_Customized_Tables.txt as `-r` (`--result`)
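The heavy lifting of add_custom_fields_without_code.py can be pictured as a three-way set operation over the four files above. A simplified model follows; the dictionary representation is an illustration, not the script's actual parsing of the object text format:

```python
def merge_custom_fields(original, modified, target):
    """Conceptual model of add_custom_fields_without_code.py.

    Each argument maps table no. -> {field no. -> field definition}.
    Fields present in the customized 2009 export but absent from the
    standard 2009 export are treated as custom and copied into the
    2015 tables, so their data survives the schema sync."""
    result = {t: dict(fields) for t, fields in target.items()}
    for table_no, fields in modified.items():
        standard = original.get(table_no, {})
        for field_no, definition in fields.items():
            if field_no not in standard:        # custom field
                result.setdefault(table_no, {})[field_no] = definition
    return result
```

The result file then plays the role of NAV_2015_Standard_Customized_Tables.txt in the following steps.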
- Run the index disabler disable_indexes.py, specifying the result text file (NAV_2015_Standard_Customized_Tables.txt) as `--original` (`-o`).
- Import the generated file NAV_2015_Standard_Customized_Tables_Disabled_Indexes.txt.
- Export tables filtered with `50000..99000000` into CustomTables.txt and use it as `-o` to run the code-removal script remove_code.py so that the tables can be compiled.
- Import Upgrade601800.MODIFIED.fob with Sync Schema `Now - with Validation` and compile the UPG objects.
- Compile all tables with Synchronize Schema `Later`.
- Run the Administration Shell as Administrator and execute `Sync-NavTenant -ServerInstance DynamicsNAV80 -Mode Sync` to initialize the database and synchronize all tables with validation.
- Review the errors, if any.
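For orientation, the essence of remove_code.py is stripping the C/AL code out of the exported table objects. A heavily simplified sketch is below; the real script is more careful (it keeps the export importable and also clears field- and table-trigger code), so treat this only as an illustration of the approach:

```python
def strip_code_sections(text: str) -> str:
    """Simplified sketch of remove_code.py: drop the CODE section from
    each object in a NAV text export by tracking brace depth. Assumes
    no unbalanced braces inside string literals."""
    out, in_code, depth = [], False, 0
    for line in text.splitlines():
        s = line.strip()
        if not in_code and s == 'CODE':
            in_code = True           # skip the CODE keyword itself
            continue
        if in_code:
            depth += s.count('{') - s.count('}')
            if depth == 0 and '}' in s:
                in_code = False      # closed the CODE block
            continue
        out.append(line)
    return '\n'.join(out)
```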
- Run `Sync-NavTenant -ServerInstance DynamicsNAV80 -Mode Force` (takes long).
- Start the upgrade process by selecting from the menu: Tools -> Data Upgrade -> Start.
- Run the code moved to SQL, upgrade6080-step2.0.sql, against the database.
- Drop all upgrade objects (filtered with `104000..104999&<>104087&<>104088` to keep the objects needed to update the users later) with `Force`.
- Make a backup or a snapshot of the database.
- Upload the license.
- Delete all objects except tables (do NOT synchronize schema)
- Clear the Dynamics NAV Server instance and debugger breakpoint records from the old database by running the script clear-instances-and-breakpoints.sql.
- Open Dynamics NAV Development Environment for Business Central as administrator and convert the 2015 database.
- Import the latest merged and prepared Business Central application objects filtered with `..2000000000` by ID.
- Import the upgrade toolkit objects Upgrade8001300.AU.MODIFIED.fob with Synchronize Schema `Later`.
- Make sure the user that runs the BC service `BC140` has `db_owner` access to the DB.
- Ensure the Microsoft Dynamics NAV Server instance `BC140` points to the database.
- Ensure the BC instance's `SQLCommandTimeout` is set to 23:00:00.
- Start the `BC140` service and make sure it is still running after a minute.
- Export the large tables (for instance, filtered by ID with `17|21|32|45|46|110|111|112|113|114|115|121|254|271|339|379|5772|5773|5802|5823|6651|6661|7313|7318|7319|7322|7323|28161|37046301`) into Large_Tables_BC.txt, use it as `-o` to run disable_indexes.py, and import the generated file Large_Tables_BC_Disabled_Indexes.txt.
- Following the List of Tables and Indexes Enabled, enable ONLY the SIFT indexes on those tables in the Development Environment, saving with Synchronize Schema `Later`, and leave the rest of the vital indexes to be created later using the script below.
- Compile all objects in the BC DB with Synchronize Schema `Later`.
- Run the Business Central Administration Shell as Administrator.
- Execute `Sync-NavTenant -ServerInstance BC140 -Mode Sync` and review the errors, if any.
- Execute `Sync-NavTenant -ServerInstance BC140 -Mode Force`, as we should already have all the fields we need in the BC objects.
- Run the first-importance index creation script indexes-large-tables-first-importance.sql and, without waiting for completion, proceed to the next steps while it runs in the background. The vital indexes are listed in the List of Tables and Indexes Enabled.
- Then run `Start-NavDataUpgrade -ServerInstance BC140`.
- Run script stored-procedure-GetDimensionSetID.sql.
- Run script upgrade-open-document-dimensions.sql.
- Run script upgrade-journal-line-dimensions.sql.
- Run script upgrade-product-groups-and-open-docs.sql.
- Run script to migrate data migrate-data.sql.
- Restart the instances.
- Generate symbols. Using a command prompt run as Administrator, execute the following commands:

  ```
  cd "C:\Program Files (x86)\Microsoft Dynamics 365 Business Central\140\RoleTailored Client"
  finsql.exe command=generatesymbolreference, database=<BCDatabase>, serverName=<BCDatabaseSQLServerName>
  ```
BC is functional at this point and can be tested to a certain extent. No history dimensions have been processed yet, so make sure users are aware of this while they test.
- Run script stored-procedure-GetDimensionSetIDByValueIDSet.sql.
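Conceptually, these stored procedures do what the NAV 2013+ dimension set mechanism does: map each distinct combination of dimension values to a single Dimension Set ID, allocating a new ID only for combinations not seen before. A minimal model of that idea (the class and its internals are illustrative, not the stored procedures' actual implementation):

```python
class DimensionSetAllocator:
    """Conceptual model of the GetDimensionSetID stored procedures:
    deduplicate dimension combinations into Dimension Set IDs."""

    def __init__(self):
        self._sets = {}    # frozenset of (dimension, value) -> set ID
        self._next_id = 1

    def get_set_id(self, dims):
        # dims: {'DEPARTMENT': 'SALES', ...}; order must not matter
        key = frozenset(dims.items())
        if key not in self._sets:
            self._sets[key] = self._next_id
            self._next_id += 1
        return self._sets[key]
```

Every ledger or document line carrying the same combination of dimension values ends up pointing at the same Dimension Set ID, which is what makes the SQL-side processing of history feasible.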
- Run the following scripts concurrently to process Ledger Entry Dimensions. This should not interfere with the system setup but might decrease overall performance:
  - upgrade-dimensions-GLEntry.sql
  - upgrade-dimensions-ValueEntry.sql
  - upgrade-dimensions-ItemLedgerEntry.sql
  - upgrade-dimensions-CustomerLedgerEntry.sql
  - upgrade-dimensions-VendorLedgerEntry.sql
  - upgrade-dimensions-BankAccountLedgerEntry.sql
  - upgrade-dimensions-PhysInventoryLedgerEntry.sql
  - upgrade-dimensions-FALedgerEntry.sql
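One way to launch the ledger-entry dimension scripts concurrently is a small driver like the sketch below. The use of sqlcmd and the server/database names are assumptions; adapt the runner to however you execute T-SQL scripts in your environment:

```python
from concurrent.futures import ThreadPoolExecutor
import subprocess

SCRIPTS = [
    "upgrade-dimensions-GLEntry.sql",
    "upgrade-dimensions-ValueEntry.sql",
    "upgrade-dimensions-ItemLedgerEntry.sql",
    "upgrade-dimensions-CustomerLedgerEntry.sql",
    "upgrade-dimensions-VendorLedgerEntry.sql",
    "upgrade-dimensions-BankAccountLedgerEntry.sql",
    "upgrade-dimensions-PhysInventoryLedgerEntry.sql",
    "upgrade-dimensions-FALedgerEntry.sql",
]

def sqlcmd_runner(path, server="localhost", database="BC_Upgrade"):
    # hypothetical runner; -b makes sqlcmd fail on script errors
    return subprocess.run(["sqlcmd", "-S", server, "-d", database,
                           "-b", "-i", path])

def run_all_concurrently(scripts, runner):
    # one thread per script; the SQL batches run server-side anyway,
    # so the threads mostly just wait for their connection to finish
    with ThreadPoolExecutor(max_workers=len(scripts)) as pool:
        return list(pool.map(runner, scripts))
```

Passing the runner in as a parameter also makes it easy to dry-run the driver without touching the database.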
- Run the indexes-large-tables-second-importance.sql script to build the second-importance indexes for the Entry tables and leave it running until done.
Once the database is running, import the prepared Permission, Permission Set, User Group, User Group Permission Set, Profile, and Profile Metadata records.
- Run script upgrade-product-groups-historical-data.sql.
- Run script upgrade-posted-document-dimensions.sql.
- Enable the remaining indexes on the tables from the List of Tables and Indexes Enabled gradually (estimate the impact by testing table by table first) by planning and deploying BC table objects with the indexes and syncing them.
Converting a Database to Business Central - Technical Upgrade