
Electronic Reporting: Send vendor payments to external azure storage via X++

The Electronic Reporting (ER) module in Microsoft Dynamics 365 Finance and Operations lets you archive files generated by ER to a SharePoint location or to Azure Storage, as described in Archive ER destination type - Finance & Operations | Dynamics 365 | Microsoft Learn. APIs can then be used to check the message status and read the file from either location, and Logic Apps or Power Automate can call those APIs, read the files, and perform the required action. This post, however, is not about doing it via integration :) It's been a while since I wrote a full code-based post (no low code :))

To send ER-generated files directly to an Azure Blob container of your choice, below is the sample class.

using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Auth;

class DAX_ERVendPaymOutFieUploadHelper
{
    /// <summary>
    /// Handles the attachingFile event from Electronic reporting
    /// </summary>
    /// <param name = "_args">Event args for event handler</param>
    [SubscribesTo(classStr(ERDocuManagementEvents), staticDelegateStr(ERDocuManagementEvents, attachingFile))]
    public static void ERDocuManagementEvents_attachingFile(ERDocuManagementAttachingFileEventArgs _args)
    {
        ERFormatMappingRunJobTable  erFormatMappingRunJobTable;
        Common                      common = _args.getOwner();

        if (common.TableId == tableNum(ERFormatMappingRunJobTable))
        {
            erFormatMappingRunJobTable = ERFormatMappingRunJobTable::find(common.RecId);
        }

        if (!_args.isHandled() && erFormatMappingRunJobTable.Archived == NoYes::No)
        {
            DAX_ERVendPaymOutFieUploadHelper uploadHandler = DAX_ERVendPaymOutFieUploadHelper::construct();

            uploadHandler.uploadFile(_args.getStream());
        }
    }

    /// <summary>
    /// Creates an object of DAX_ERVendPaymOutFieUploadHelper class
    /// </summary>
    /// <returns>DAX_ERVendPaymOutFieUploadHelper class object</returns>

    public static DAX_ERVendPaymOutFieUploadHelper construct()
    {
        return new DAX_ERVendPaymOutFieUploadHelper();
    }

 

    /// <summary>
    /// Uploads file to custom Azure blob container specified in parameters
    /// </summary>
    /// <param name = "_fileStream">File stream to be uploaded</param>
    /// <returns>True if file uploaded successfully</returns>

    private boolean uploadFile(System.IO.Stream _fileStream)
    {
        boolean ret = true; 

        // Custom parameters table to store Azure Storage and container info
        DAX_Parameters parameters = DAX_Parameters::find();

        try
        {
            StorageCredentials credentials = new StorageCredentials(parameters.StorageAccountName, parameters.Key);

            CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, true);

            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();          

            CloudBlobContainer rootContainer = blobClient.GetContainerReference(parameters.ContainerName);          

            if (!rootContainer.Exists(null, null))
            {
                return checkFailed('Azure storage parameters are not set up correctly.');
            }

            CloudBlobDirectory directory = rootContainer.GetDirectoryReference(parameters.BankOutPaymFolder);

            CloudBlockBlob blockBlob = directory.GetBlockBlobReference('VendOutPaym.xml');

            if (_fileStream.CanSeek)
            {
                _fileStream.Seek(0, System.IO.SeekOrigin::Begin);
            }

            blockBlob.UploadFromStream(_fileStream, null, null, null);
            info('File uploaded');
        }

        catch(Exception::Error)
        {
            ret = checkFailed('Error occurred while uploading the file');
        }

        catch(Exception::CLRError)
        {
            ret = checkFailed('CLR Error occurred while uploading the file');
        }

        return ret;
    }
}
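The helper reads the storage account name, key, container, and folder from a custom parameters table, DAX_Parameters. The class above only shows the fields it consumes; the sketch below is a hedged assumption of how such a singleton parameters table's find method is typically written in X++, not the author's actual implementation.

```x++
/// <summary>
/// Sketch of the assumed DAX_Parameters table lookup. The table is assumed
/// to hold one record with these fields, as referenced by the upload helper:
///   StorageAccountName - Azure storage account name
///   Key                - storage account access key
///   ContainerName      - blob container name
///   BankOutPaymFolder  - virtual folder (blob name prefix) for payment files
/// </summary>
public static DAX_Parameters find()
{
    DAX_Parameters parameters;

    // Parameters tables conventionally hold a single record
    select firstonly parameters;

    return parameters;
}
```

Store the account key securely (for example, encrypted in the table or retrieved from Azure Key Vault) rather than in plain text.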

