
Calling/Opening AX form through X++

A sample piece of code to open an AX form through X++:

static void OpenForm_ThroughCode(Args _args)
{
    Args    args;
    FormRun formRun;

    // Build the arguments and resolve the form by its AOT name
    args = new Args();
    args.name(formStr(FormName));

    // Create, initialize and show the form
    formRun = classFactory.formRunClass(args);
    formRun.init();
    formRun.run();
    formRun.wait();    // block until the form is closed
}
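
Opening a form by name with classFactory bypasses the security configured on menu items, so in many cases the form is launched through its display menu item instead. A minimal sketch of that variant; the use of the standard ProjTable display menu item here is an assumption:

static void OpenForm_ThroughMenuItem(Args _args)
{
    Args args;

    args = new Args();
    args.record(ProjTable::find('PR00001'));

    // Run through the display menu item so menu-item security applies
    // (ProjTable is assumed to be the name of an existing display menu item)
    new MenuFunction(menuitemDisplayStr(ProjTable), MenuItemType::Display).run(args);
}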

If you want to pass a record to the form you are opening:

args = new Args();
args.record(ProjTable::find('PR00001'));
args.name(formStr(FormName));
formRun = classFactory.formRunClass(args);
formRun.init();
formRun.run();
formRun.wait();
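
Besides a record, Args can also carry a free-form string via parm(). A sketch of passing both together; the parameter value here is purely illustrative:

args = new Args();
args.name(formStr(FormName));
args.record(ProjTable::find('PR00001'));
args.parm('OpenedFromCode');    // free-form string, read on the called form with element.args().parm()

formRun = classFactory.formRunClass(args);
formRun.init();
formRun.run();
formRun.wait();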

How to retrieve the record from args in the called form's init() method:

public void init()
{
    ProjTable projTableLocal;

    super();

    // Retrieve the record passed by the caller
    projTableLocal = element.args().record();
}
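
Since args().record() returns a Common buffer, it is safer to check which table the caller actually passed before assigning it. A defensive sketch of the same init():

public void init()
{
    ProjTable projTableLocal;

    super();

    // Only assign when the caller actually passed a ProjTable record
    if (element.args() && element.args().dataset() == tableNum(ProjTable))
    {
        projTableLocal = element.args().record();
    }
}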

Comments

  1. Hello,

    I used the above code to open another form from a grid by double-clicking a record. The grid has columns such as ServiceID, CustID and CustomerName. When I double-click a row, the ServiceOrder form opens (SO No: So-10001); double-clicking another row opens a second ServiceOrder form (SO No: So-10002). Two ServiceOrder forms are then open with different SO numbers, but both display the same ServiceId and the same details. I don't know why this happens. Can you please guide me? Thanks in advance.


I would appreciate your comments!
