Saturday, October 24, 2020

Dual-write connection set error: An item with the same key has already been added

If you see this error message, you have duplicate records in the cdm_company entity in your CDS environment. Check the cdm_companycode field; duplicates are normally not allowed there, but have a look and delete the duplicate records.
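
You can quickly list the company records to spot the duplicates by browsing the cdm_company data through the CDS Web API. This is only a sketch: the pluralized entity set name cdm_companies is an assumption based on the standard naming convention, so adjust it (and the API version) for your environment.

https://<CDS environment URL>/api/data/v9.0/cdm_companies?$select=cdm_companycode&$orderby=cdm_companycode

Any cdm_companycode value that appears more than once in the result is a duplicate; keep one record and delete the rest.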



Dual-write integration for Cross-Company data in D365 Finance and SCM



Dual-write does not work with cross-company data sharing policies in D365 FinOps (the product goes by so many names, but I am using this one for reference 😊).

First, a brief overview of cross-company data sharing policies to set the stage: cross-company data sharing makes your data accessible from multiple legal entities (companies in D365 FinOps). For example, if you set up a policy to share vendors across companies, then whenever you create a new vendor it is created in all legal entities that participate in the data sharing policy.

https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/sysadmin/cross-company-data-sharing

Now, what happens when you have a table under cross-company data sharing (e.g. VendTable) and want to sync vendors through dual-write?

VendTable is one of the data sources for the Vendors data entity (dual-write entity map), and by design dual-write does not work with cross-company data sharing, so you get the following error message when you try to run the entity map.

"Copying pre-existing data completed with errors.
For additional details, go to initial sync details tab."




The error message is confusing and does not reflect the actual issue behind the scenes - you will never figure out what is wrong until you raise it with the Microsoft dual-write team and share the activity Id of the job so they can investigate the telemetry (you don't have access to check this yourself 😒) and share the root cause with you.

However, you can also investigate yourself by putting a breakpoint in the method validateDataSharingEnabledForEntityTableBeforDualWriteEnable() of class SysDataSharingValidation.


/// <summary>
/// Validates that cross company data sharing is not enabled when enabling Dual Write.
/// </summary>
/// <param name = "_entityName">The name of the entity containing the table being enabled</param>
/// <param name = "_tableName">Table in entity</param>
/// <param name = "_dataAreaId">Company info</param>
[SubscribesTo(classStr(BusinessEventsRegistrationBase), staticdelegatestr(BusinessEventsRegistrationBase, onTableEnabled))]
public static void validateDataSharingEnabledForEntityTableBeforDualWriteEnable(str _entityName, str _tableName, DataAreaId _dataAreaId)
{
    SysDataSharingOrganization sysDataSharingOrganizationTable;
    SysDataSharingRuleEnabled sysDataSharingRuleEnabledTable;

    select firstonly SharedTableName from sysDataSharingRuleEnabledTable
        where sysDataSharingRuleEnabledTable.SharedTableName == _tableName
    join DataSharingPolicy, DataSharingCompany from sysDataSharingOrganizationTable
        where sysDataSharingRuleEnabledTable.DataSharingPolicy == sysDataSharingOrganizationTable.DataSharingPolicy
            && sysDataSharingOrganizationTable.DataSharingCompany == _dataAreaId;

    if (sysDataSharingRuleEnabledTable)
    {
        throw error(strFmt("@DataSharing:CrossCompanySharingError", _entityName, sysDataSharingOrganizationTable.DataSharingPolicy));
    }
}
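
If you just want to check, without attaching the debugger, whether a given table is covered by an enabled cross-company data sharing policy in the current company, a small runnable job based on the same select can do it. This is only a sketch (the class name is mine); it reuses the SysDataSharingRuleEnabled and SysDataSharingOrganization tables from the standard method above.

internal final class DaxCheckDataSharingForTable
{
    public static void main(Args _args)
    {
        SysDataSharingRuleEnabled  ruleEnabled;
        SysDataSharingOrganization sharingOrg;

        // Look for an enabled sharing rule on VendTable that applies to the current company.
        select firstonly SharedTableName, DataSharingPolicy from ruleEnabled
            where ruleEnabled.SharedTableName == tableStr(VendTable)
        join DataSharingPolicy, DataSharingCompany from sharingOrg
            where sharingOrg.DataSharingPolicy == ruleEnabled.DataSharingPolicy
                && sharingOrg.DataSharingCompany == curExt();

        if (ruleEnabled)
        {
            info(strFmt("%1 is shared by policy '%2' - dual-write cannot be enabled for entity maps that use it.",
                tableStr(VendTable), ruleEnabled.DataSharingPolicy));
        }
        else
        {
            info(strFmt("%1 is not part of an enabled data sharing policy in this company.", tableStr(VendTable)));
        }
    }
}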


Tuesday, October 20, 2020

D365FO: π™‡π˜Ύπ™Ž π™šπ™­π™₯π™€π™§π™©π™šπ™™ π™—π™–π™˜π™₯π™–π™˜ π™π™žπ™‘π™š π™˜π™€π™£π™©π™–π™žπ™£π™¨ π™˜π™€π™§π™§π™ͺπ™₯π™©π™šπ™™ 𝙙𝙖𝙩𝙖

If you experience the below error message when trying to import a bacpac file into a Tier 1 (DEV) environment, use the suggested steps to get past the error and get your database imported.

Importing to database 'AxDB_Daxture' on server '.'.
SqlPackage.exe : *** Error importing database:Could not load package from 'C:\Users\Adminb1c06345af\Downloads\daxturebackup.bacpac'.
At line:13 char:5
+     & $LatestSqlPackage $commandParameters "/p:CommandTimeout=0"
+     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (*** Error impor...backup.bacpac'.:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError

File contains corrupted data.


Based on this docs link https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/database/import-database, download the .NET Core version of SqlPackage.exe.

It comes as a .zip file that can be extracted to C:\Temp\Sqlpackage-dotnetcore.

From there, instead of using the Sqlpackage.exe under C:\Program Files (x86), use the Sqlpackage.exe in C:\Temp\Sqlpackage-dotnetcore.

The command will be:

C:\Temp\Sqlpackage-dotnetcore>SqlPackage.exe /a:import /sf:D:\Exportedbacpac\my.bacpac /tsn:localhost /tdn:<target database name> /p:CommandTimeout=1200
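
As a concrete example, if you extracted SqlPackage to C:\Temp\Sqlpackage-dotnetcore and the bacpac is still in the Downloads folder from the error above, the call would look something like this (the target database name AxDB_Daxture is only taken from the error output; use whatever new database name you prefer):

C:\Temp\Sqlpackage-dotnetcore>SqlPackage.exe /a:import /sf:C:\Users\Adminb1c06345af\Downloads\daxturebackup.bacpac /tsn:localhost /tdn:AxDB_Daxture /p:CommandTimeout=1200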


Sunday, October 11, 2020

Tips and Tricks on how to validate your dual-write integration at field-by-field level - it's easy, believe me 😊

Once you are done with the field mappings for an entity map, e.g. Vendor groups (the map we started looking at in the previous post to get the payload for dual-write integration), records get created in the following tables.

1. DualWriteProjectConfiguration

2. DualWriteProjectFieldConfiguration

Records are only created when you run the entity map and are deleted once you stop it.

The DualWriteProjectConfiguration table does not hold data for Vendor groups when the entity map is stopped.

The DualWriteProjectFieldConfiguration table does not hold data for Vendor groups when the entity map is stopped.

Run the entity map and check the data in these tables:

1. Both tables are linked to each other via the Name field.

2. For each entity map (e.g. Vendor groups) there are two records: one for insert and update, and another for delete whose name ends with _end (see the query sketch below).
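
If you want to check this from code while the map is running, a small runnable X++ job like the sketch below lists every dual-write project record and counts its field configuration rows. The class name is mine; the tables and the Name link are the ones discussed above.

internal final class DaxDualWriteProjectInspect
{
    public static void main(Args _args)
    {
        DualWriteProjectConfiguration      projectConfig;
        DualWriteProjectFieldConfiguration fieldConfig;

        // One insert/update record and one *_end (delete) record exist per running entity map.
        while select projectConfig
        {
            // Both tables are linked through the Name field.
            select count(RecId) from fieldConfig
                where fieldConfig.Name == projectConfig.Name;

            info(strFmt("Project '%1' has %2 field configuration record(s).",
                projectConfig.Name, fieldConfig.RecId));
        }
    }
}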

Important information about the DualWriteProjectConfiguration table:

1. The External URL field holds the URL of the CDS environment that is linked with FO, in this format: https://<CDS environment URL>/api/data/v9.0/msdyn_vendorgroups. This API URL can be browsed to see the available data in the CDS environment.

2. The Project partition map field holds the list of legal entities linked for dual-write; you can copy this list into Notepad++ (or anywhere) and review all legal entities from there too - this may help you in troubleshooting.

3. If the IsDebugMod field is marked, failure records are stored in the DualWriteErrorLog table (see the sketch after this list).

4. The Filter query expression field stores the Finance and Operations apps filter, if you have defined one for the entity map.
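
For example, when debug mode is on you can check how many failures have been captured with a quick job like this. It is a minimal sketch: the class name is mine and it only relies on the standard RecId field, so explore DualWriteErrorLog in the table browser for the interesting columns.

internal final class DaxDualWriteErrorCount
{
    public static void main(Args _args)
    {
        DualWriteErrorLog errorLog;

        // Count the failure records captured by dual-write while debug mode was enabled.
        select count(RecId) from errorLog;

        info(strFmt("%1 dual-write failure record(s) found in DualWriteErrorLog.", errorLog.RecId));
    }
}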

Important information about the DualWriteProjectFieldConfiguration table:

Filter records in this table based on the project name you retrieved from the DualWriteProjectConfiguration table, as shown below as an example.

1. The External lookup urls field holds information about how the different lookups from CDS tables are linked for this integration.

Tip and trick: copy the data from this field and paste it into Notepad++, format the JSON, and you can review all the data easily in a JSON viewer.

2. The Field mapping field holds information on how fields are mapped between both apps (Finance and Operations apps and Common Data Service).

Tip and trick: copy the data from this field and paste it into Notepad++, format the JSON, and you can review the complete field mapping easily in a JSON viewer.

The DualWriteProjectConfigurationEntity data entity is the triggering point to sync data across, based on the pre-defined sync direction discussed above.

In case you want to debug the outbound call for vendor groups, the stack trace goes through the DualWriteSyncOutbound class covered in the post below.

I hope this helps you understand the tables and classes behind dual-write and troubleshoot your issues at some point. If you still have any questions, please reach out to me via the comments section.


Want to see the inbound and outbound payload for dual-write integration?

Let's continue our journey from the first post, "What got changed (Technical) in D365 Finance Operations with dual-write", to understand the technical changes made for the dual-write framework.

In this post we will look at the most commonly used tables, entities, and classes involved in dual-write sync. I will only cover a few tables, classes, and data entities, but the complete list of objects can be explored from the AOT (search for the DualWrite prefix).

Business scenario: Create or update Vendor Groups in FO and sync over to CDS and vice versa.

Open the Vendor groups screen: Accounts Payable > Vendors > Vendor groups.

An out-of-the-box dual-write template is available for Vendor groups.

These fields are mapped between Finance and Operations apps and Common Data Service in the Vendor groups dual-write template.

NOTE: Data will sync between the apps only for mapped fields, and the integration will only trigger when data in one of these mapped fields is changed on either side of the integration (FO or CDS), provided the sync direction allows it (it is bi-directional for this scenario).

Requirement: You want to debug the integration (outbound or inbound)

Two important classes:

1. DualWriteSyncInbound

2. DualWriteSyncOutbound

If you want to debug data going out of FO to CDS (outbound), put a breakpoint in the method WriteEntityRecordToCDS() of class DualWriteSyncOutbound.

This is quite an extensive method, and it also builds the payload going out of FO to CDS; the payload can be watched and changed using all the debugging features of D365 Finance and Operations.

The method BuildCDSPayload(), called inside WriteEntityRecordToCDS(), creates the complete payload.

I am pasting this just for reference; it is subject to change at any time, so always refer to the updated code.

internal str BuildCDSPayload(str cdsLookupUrls, common entityRecord, str fieldMappingJson, ICDSSyncProvider syncProvider)
{
    var executionMarkerUniqueIdentifier = newGuid();
    DualWriteSyncLogger::LogExecutionMarker('DualWriteOutbound.BuildCDSPayload', true, strFmt('Execution start for record %1 with uniqueId %2', entityRecord.RecId, executionMarkerUniqueIdentifier));
    CDSPayloadGenerator payloadGenerator = syncProvider.CDSPayloadGenerator;
    responseContract.DualWriteProcessingStage = DualWriteProcessingStage::TransformingSourceData;

    FieldMappingIterator fieldMappingIterator = FieldMappingIterator::ReadJson(fieldMappingJson);
    if (fieldMappingIterator == null)
    {
        responseContract.AddRecordResponse(ExecutionStatus::Failed, strFmt("@DualWriteLabels:InvalidFieldMapping", fieldMappingJson), '');
        DualWriteSyncLogger::LogSyncError('BuildCDSPayload', '', '', strFmt('Failed to create CDS payload. Error reason %1', responseContract.GetFormattedResponseObject()), DualWriteDirection::Outbound);
    }
    while (fieldMappingIterator.MoveNext())
    {
        FieldMapping fieldMapping = fieldMappingIterator.Current();
        var valueTranforms = fieldMapping.ValueTransforms;
        var sourceValue = this.FetchSourceFieldDataFromMapping(entityRecord, fieldMapping);
        // If there is a value default value transform then the payload creation is skipped
        // The payload gets added in CDSSyncProvider
        boolean skipPayloadCreation = false;
        if (valueTranforms != null)
        {
            var transformEnum = valueTranforms.GetEnumerator();
            while (transformEnum.MoveNext())
            {
                IValueTransformDetails transform = transformEnum.Current;
                sourceValue = this.ApplyValueTransform(transform, sourceValue);

                skipPayloadCreation = (syncProvider.GetProviderType() != CDSSyncProviderType::CDSQueueSync &&
                    transform.TransformType == ValueTransformType::Default);
                if (transform.HasTransformationFailed)
                {
                    responseContract.AddFailedFieldResponse(fieldMapping.SourceField, strFmt("@DualWriteLabels:FailedTransform", sourceValue, fieldMapping.SourceField, enum2Str(transform.TransformType)), '');
                }
            }
        }
        if (!strContains(fieldMapping.DestinationField, '.') || fieldMapping.IsSystemGenerated)
        {
            if (!skipPayloadCreation)
            {
                payloadGenerator.AddAttributeValuePair(fieldMapping.DestinationField,
                        sourceValue,
                        fieldMapping.IsDestinationFieldQuoted,
                        this.FetchSourceFieldTypeCode(entityRecord, fieldMapping));
            }
        }
        syncProvider.AddSourceColumnTransformedValue(syncProvider.GetMappingKey(fieldMapping.SourceField, fieldMapping.DestinationField), sourceValue);
    }

    responseContract.DualWriteProcessingStage = DualWriteProcessingStage::ResolvingLookups;
    var cdsPayLoad = syncProvider.BuildCDSPayloadForLookups(cdsLookupUrls, fieldMappingJson);
    DualWriteSyncLogger::LogExecutionMarker('DualWriteOutbound.BuildCDSPayload', false, strFmt('Execution ends for record %1 with uniqueId %2', entityRecord.RecId, executionMarkerUniqueIdentifier));
    return cdsPayLoad;
}


To debug data coming into FO from CDS (inbound), put a breakpoint in the method WriteDataToEntity() of class DualWriteSyncInbound.

Again, this is pasted just for reference; it is subject to change at any time, so always refer to the updated code.

private ResponseContract WriteDataToEntity(str entityName, str entityFieldValuesJson, str companyContext, boolean runValidations = false, boolean isDelete = false, DualWriteTransactionId transactionId = '', str CDSSyncVersion = '', boolean isBatchCommit = true)
{
    var executionMarkerUniqueIdentifier = newGuid();
    this.InitializeInboundSync(entityName);
    return responseContract;
}

