
InventTrans table data model changes in AX 2012 vs AX 2009

The purpose of this post is to give an overview of the changes made to the Dynamics AX 2012 data model related to inventory transactions.

For details on how to implement code changes for the InventTrans table, the best resource is the white paper "Implementing InventTrans Refactoring for Microsoft Dynamics AX Applications" (AX 2012).

Before AX 2012, the only table used for recording all inventory transactions was InventTrans. In AX 2009, much of the data in InventTrans is redundant, and many fields are used only to represent a specific type of transaction such as sales, purchase, or transfer.
In AX 2012, the basics are still the same, i.e. every inventory transaction still gets recorded in InventTrans; the difference is that the table has now been normalized further. A new table called InventTransOrigin has been added, which holds the relationship between the originating (transaction) tables and InventTrans. The InventTransId, TransType and TransRefId fields have been removed from InventTrans and moved to the InventTransOrigin table as InventTransId, ReferenceCategory and ReferenceId respectively; InventTrans instead carries an InventTransOrigin field, a RecId reference to the new table.
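To make the impact on code concrete, here is a minimal X++ sketch contrasting the two lookup patterns, assuming the standard AX 2012 schema where InventTrans.InventTransOrigin is a RecId reference to InventTransOrigin; the sales order number used is hypothetical.

    static void findInventTransExample(Args _args)
    {
        SalesLine         salesLine;
        InventTrans       inventTrans;
        InventTransOrigin inventTransOrigin;

        // Hypothetical order number, for illustration only.
        select firstonly salesLine
            where salesLine.SalesId == 'SO-000123';

        // AX 2009 pattern - no longer compiles, InventTransId is gone from InventTrans:
        // select firstonly inventTrans
        //     where inventTrans.InventTransId == salesLine.InventTransId;

        // AX 2012 pattern - join through InventTransOrigin instead:
        select firstonly inventTrans
            exists join inventTransOrigin
                where inventTransOrigin.RecId         == inventTrans.InventTransOrigin
                   && inventTransOrigin.InventTransId == salesLine.InventTransId;

        if (inventTrans.RecId)
        {
            info(strFmt("Found inventory transaction for item %1", inventTrans.ItemId));
        }
    }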

Every transaction type now has its own relation table linking it to InventTransOrigin. For example, in the case of PurchLine the table is named InventTransOriginPurchLine, which links the purchase line (identified by its InventTransId) to the corresponding InventTransOrigin record. Transaction tables like PurchLine, SalesLine, ProdTable, etc. still contain an InventTransId field, but it is not recommended to use or refer to it; it is expected to be deprecated in the future.
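As a sketch of how these relation tables are used, the following X++ job walks from a purchase line to its inventory transactions. It assumes the standard AX 2012 field names PurchLineDataAreaId and PurchLineInventTransId on InventTransOriginPurchLine, and the order number is hypothetical.

    static void purchLineInventTransExample(Args _args)
    {
        PurchLine                  purchLine;
        InventTrans                inventTrans;
        InventTransOrigin          inventTransOrigin;
        InventTransOriginPurchLine originPurchLine;

        // Hypothetical order number, for illustration only.
        select firstonly purchLine
            where purchLine.PurchId == 'PO-000456';

        // Walk PurchLine -> InventTransOriginPurchLine -> InventTransOrigin -> InventTrans.
        while select inventTrans
            exists join inventTransOrigin
                where inventTransOrigin.RecId == inventTrans.InventTransOrigin
            exists join originPurchLine
                where originPurchLine.InventTransOrigin      == inventTransOrigin.RecId
                   && originPurchLine.PurchLineDataAreaId    == purchLine.DataAreaId
                   && originPurchLine.PurchLineInventTransId == purchLine.InventTransId
        {
            info(strFmt("Item %1, quantity %2", inventTrans.ItemId, inventTrans.Qty));
        }
    }

The relation tables also expose helper methods (e.g., InventTransOriginPurchLine::findInventTransOrigin) that wrap this lookup; see the white paper referenced above for the exact signatures.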
