Data management overview - Finance & Operations | Dynamics 365 (2023)



The functionality noted in this article is currently available in both the stand-alone Dynamics 365 Human Resources and the merged Finance infrastructure. Navigation might be different than noted while we make updates. If you need to find a specific page, you can use Search.

This article describes how you can use the data management framework to manage data entities and data entity packages in finance and operations.

The data management framework consists of the following concepts:

  • Data entities - A data entity is a conceptual abstraction and encapsulation of one or more underlying tables. A data entity represents a common data concept or functionality, for example, Customers or Vendors. Data entities are intended to be easily understood by users familiar with business concepts. After data entities are created, you can reuse them through the Excel Add-in, use them to define import/export packages, or use them for integrations.
  • Data project - A project that contains configured data entities, which include mapping and default processing options.
  • Data job - A job that contains an execution instance of the data project, uploaded files, schedule (recurrence), and processing options.
  • Job history - Histories of source to staging and staging to target jobs.
  • Data package - A single compressed file that contains a data project manifest and data files. This is generated from a data job and used for import or export of multiple files with the manifest.
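To make the hierarchy concrete, the concepts above can be sketched as plain data structures. This is an illustration only; the class and field names are ours, not part of the framework, which is configured through the Data management workspace rather than through code.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataEntity:
    name: str                    # e.g. "Customers" or "Vendors"
    execution_unit: int = 1      # entities in different units can load independently
    sequence_in_level: int = 1   # order within an execution unit

@dataclass
class DataProject:
    name: str
    entities: List[DataEntity] = field(default_factory=list)  # configured entities and mapping

@dataclass
class DataJob:
    project: DataProject                  # a job is one execution instance of a project
    uploaded_files: List[str] = field(default_factory=list)
    recurrence: Optional[str] = None      # schedule, if the job recurs

project = DataProject("TaxSetup", [
    DataEntity("Sales tax codes"),
    DataEntity("Sales tax groups", sequence_in_level=2),
])
job = DataJob(project)
```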

The data management framework supports using data entities in the following core data management scenarios:

  • Data migration
  • Set up and copy configurations
  • Integration

Data entities

Data entities provide conceptual abstraction and encapsulation of the underlying table schemas that represent data concepts and functionalities. In Microsoft Dynamics AX 2012, most tables, like the Customer and Vendor tables, were normalized and split into multiple tables. This was beneficial from a database design point of view, but it made the tables difficult for implementers and ISVs to use without a thorough understanding of the physical schema. Data entities were introduced as part of data management to serve as a layer of abstraction that is easy to understand in business terms. In previous versions there were multiple ways to manage data, such as the Microsoft Excel Add-in, AIF, and DIXF; the concept of data entities combines those into one. After data entities are created, you should be able to reuse them for the Excel Add-in, import/export, or integration. The following table shows the core data management scenarios.

Data migration
  • Migrate reference, master, and document data from legacy or external systems.
Setup and copy configuration
  • Copy configuration between companies or environments.
  • Configure processes or modules using the Lifecycle Services (LCS) environment.
Integration
  • Real-time service-based integration.
  • Asynchronous integration.

Data migration

Using the data management framework, you can quickly migrate reference, master, and document data from legacy or external systems. The framework is intended to help you quickly migrate data by using the following features:

  • You can select only the entities you need to migrate.
  • If an import error occurs, you can skip the affected records, proceed with only the good data, and then fix and import the bad data later. The errors help you quickly find the bad records.
  • You can move data entities straight from one system to another, without having to go through Excel or XML files.
  • Data imports can be scheduled by using a batch job, which offers flexibility in when the import runs. For example, you can migrate customer groups, customers, vendors, and other data entities at any time.

Set up and copy configuration

You can use the data management framework to copy configurations between companies or environments, and configure processes or modules using Microsoft Dynamics Lifecycle Services (LCS).

Copying configurations is intended to make it easier to start a new implementation, even if your team doesn't deeply understand the structure of the data that must be entered, the dependencies between data, or the sequence in which data must be added to an implementation.

The data management framework allows you to:

  • Move data between two similar systems
  • Discover entities and dependencies between entities for a given business process or module
  • Maintain a reusable library of data templates and datasets
  • Use data packages to move data entities incrementally. Data entities can be sequenced inside a package, and packages can be named so that they are easily identifiable during import or export. When building data packages, data entities can be mapped to staging tables in grids or by using a visual mapping tool, and columns can also be dragged and dropped manually.
  • View data during imports, so you can compare data, and ensure that it is valid.

Working with data entities

The following sections provide quick snapshots of the different functionalities of data management using data entities. The goal is to help you strategize and make effective decisions on how to best utilize the available tools during data migration. You will also find tips and tricks on how to effectively use each area. A list of available data entities for each area, together with the suggested data sequences that show data dependencies, can also be found. Microsoft provides data packages on Lifecycle Services (LCS) as an initial guide, and the information in this article can be used as a guide for creating your own packages. The description of each data entity shows what the object contains and whether it is needed during data migration.


There are two types of sequencing that should be considered when working with data entities.

  • Sequencing data entities within a data package
  • Sequencing the order of data package imports

Sequence data entities within a data package

  1. When a user adds data entities to a data project, by default, a sequence is set for the order in which the entities will load. The first entity added to the project will be set as the first entity to load, the next entity added will be second, the next entity will be third, and so on.


    For example, if a user added two entities in this order, Sales tax codes and Sales tax groups, then Sales tax codes is assigned an entity sequence of 1.1.1, and Sales tax groups is assigned an entity sequence of 1.1.2. The sequence indicates that the second entity will not start importing until the first has finished.

  2. To view or edit a sequence, click the Entity sequence button on the Action Pane of the data project.


  3. In the Definition group entity sequence, you can see the execution units and the sequence. You can change the sequence by selecting a data entity in the list, setting a different Execution unit or Sequence in level, and then clicking Update selected. After you click Update selected, the entity moves up or down in the entity list.


The following screenshot shows the entity sequence that is set for the Sales Tax CodeGroups data package.


To successfully import sales tax codes and groups, the sales tax codes and their details must be loaded before the sales tax groups. Sales tax codes and groups are all in Execution unit = 1, and their sequences reflect the order in which they will be imported. Other related sales tax entities that do not depend on other data entities are also included in the package. For example, sales tax exempt numbers is set in its own Execution unit = 2. This data entity starts loading immediately, because it has no dependencies on other entities.
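The ordering rule described above can be modeled in a few lines: execution units start independently, while entities within a unit run in sequence order. The following is an illustrative sketch of that rule, not the framework's actual scheduler.

```python
from collections import defaultdict

# (entity name, execution unit, sequence in level) for the example above
entities = [
    ("Sales tax codes", 1, 1),
    ("Sales tax groups", 1, 2),
    ("Sales tax exempt numbers", 2, 1),  # own unit: no dependencies, starts at once
]

def load_plan(entities):
    """Group entities by execution unit and order them within each unit."""
    units = defaultdict(list)
    for name, unit, seq in entities:
        units[unit].append((seq, name))
    return {unit: [name for _, name in sorted(members)]
            for unit, members in sorted(units.items())}

plan = load_plan(entities)
# Unit 1 imports codes before groups; unit 2 can start in parallel.
```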

Sequence data package imports

To successfully load data, it's important to set the correct order for importing data packages, because of dependencies that exist within and across modules. The numbering format that has been created for the data packages in LCS is as follows:

  • First segment: Module
  • Second segment: Data type (setup, master, transaction)
  • Third segment: Sequence number

The following tables provide more information about the default numbering format.

[Table: Module numbers]

[Table: Data type numbers]

[Table: Sequence numbers]


Data package names follow the sequence number, then the module abbreviation, and then a description. The following example shows General ledger data packages.


Mapping

When working with data entities, mapping an entity to a source is automatic. The automatic mapping of fields can be overridden if needed.

View mapping

To view how an entity is mapped, locate the tile for the entity in the project, and then click View map.

Two views are provided: the mapping visualization view (the default) and the mapping details view. A red asterisk (*) identifies the required fields in an entity. These fields must be mapped before you can work with the entity. Other fields can be unmapped as needed when working with the entity.

  • To unmap a field, highlight the field in either column (Entity or Source), click Delete selection, and then click Save. After saving, close the form to return to the project.

The field mapping from source to staging can also be edited after import using the same process.

Regenerate a map

If you have extended an entity (added fields) or if the automatic mapping appears to be incorrect, the mapping of the entity can be regenerated in the Mapping form.

  1. To do this, click Generate source mapping.

    A message will display asking, "Do you want to generate the mapping from scratch?"

  2. Click Yes to regenerate the mapping.

Generate data

If you have fields in entities that you want the system to generate data for on import, instead of providing the data in the source file, you can use the auto-generated functionality in the mapping details for the entity. For example, if you want to import customers and customer address information, but the address information was not previously imported with the Global Address Book entities, you can have the entity auto-generate the party number upon import and the GAB information will be created. To access this functionality, view the map of the entity and click the Mapping details tab. Select the fields that you want to auto-generate. This will change the source field to Auto.

Turn off automatically generated number sequences

Many entities support automatic generation of identifiers based on number sequence setup. For example, when creating a product, the product number is automatically generated and the form does not allow you to edit values manually.

It is possible to enable manual assignment of number sequences for a specific entity.


After you have enabled manual assignment, you can provide manually assigned numbers instead.


Export

Export is the process of retrieving data from a system by using data entities. The export process is done through an export project. You have a lot of flexibility in how an export project is defined: you can choose not only which data entities to export but also the number of entities, the file format used (there are 14 different formats to choose from for export), and a filter on each entity to limit what is exported. After the data entities have been pulled into the project, the sequencing and mapping described earlier can be performed for each export project.

After the project is created and saved you can export the project to create a job. During the export process, you can see a graphical view of the status of the job and the record count. This view shows multiple records so you can review the status of each record prior to downloading the actual files.

After the job completes, you can choose how to download the files: each data entity can be a separate file, or the files can be combined into a package. If the job contains multiple data entities, choosing the package option speeds up the later upload process. The package is a zip file that contains a data file for each entity, as well as a package header and manifest. These additional documents are used during import to add the data files to the correct data entities and to sequence the import process.
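Because a data package is a plain zip archive, it can be inspected with any zip tool. The sketch below builds and lists a toy package with the same shape; `Manifest.xml` and `PackageHeader.xml` are the conventional document names, and the entity data file shown is hypothetical, so verify the layout against a package you have actually exported.

```python
import os
import tempfile
import zipfile

def list_package(path):
    """Return the member files of an exported data package (a plain zip)."""
    with zipfile.ZipFile(path) as pkg:
        return sorted(pkg.namelist())

# Build a toy package with the same shape, purely for demonstration.
demo = os.path.join(tempfile.gettempdir(), "demo-package.zip")
with zipfile.ZipFile(demo, "w") as pkg:
    pkg.writestr("Manifest.xml", "<Manifest/>")            # entity list and sequencing
    pkg.writestr("PackageHeader.xml", "<PackageHeader/>")  # package-level metadata
    pkg.writestr("Sales tax codes.csv", "TAXCODE,NAME\n")  # one data file per entity

members = list_package(demo)
```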


Import

Import is the process of pulling data into a system by using data entities. The import process is done through the Import tile in the Data management workspace. Data can be imported either for individual entities or for a group of logically related entities that are sequenced in the correct order. The file formats vary depending on the type of import: for an individual entity, the file can be an Excel, comma-separated, tab-separated, or text file; for a data package, it is a .zip file. In both cases, the files are created by using the export process described earlier.

Import a data package

  1. Log into the environment using a login with sufficient privileges (typically this is the Administrator role).

  2. On the dashboard, click the Data Management workspace.

  3. Click the Import tile.

  4. On the next page, do the following:

    1. Provide a name.

    2. In the Source Data Format field, select Package.


    3. Click the Upload button and choose the appropriate package file from the location for the data being imported. This will import all the files from the package.

    4. Click Save, and then click Import.

Import multiple data packages

Use one of the following methods to import multiple data packages.

  • Create a new job for each package, and then repeat steps 4(a) through 4(d) above, for each package.

  • Create one job to import multiple packages in a sequence. Repeat steps 4(a) through 4(c) above, and then repeat step 4(c) for all packages that need to be imported. After you select the packages, execute step 4(d) to import the data from the selected data packages through a single job.

After you click Import, the data will be imported through staging tables. The progress of the import can be tracked using the Refresh button in the upper-right corner of the screen.

Troubleshoot data package processing

This section provides troubleshooting information for the different stages of data package processing.

  • Status and error details of a scheduled job can be found under the Job history section in the Data management form.
  • Status and error details of previous runs for data entities can be displayed by selecting a data project and clicking Job history. In the Execution history form, select a job, and click View staging data and View execution log. The previous runs include data project runs that were executed as batch jobs or manually.

Export process troubleshooting

  • If you get an error during the export process, click View execution log and review the log text, staging log details, and Infolog for more information.
  • If you get an error during the export process with a note directing you to not skip staging, turn off the Skip staging option, and then add the entity. If you are exporting multiple data entities, you can use the Skip staging button for individual data entities.

Import process troubleshooting

When uploading data entity files:

  • If data entities do not display in Selected files and entities after you click Upload during the import process, wait a few minutes, and then check whether the OLEDB driver is still installed. If not, then reinstall the OLEDB driver. The driver is Microsoft Access Database Engine 2010 Redistributable – AccessDatabaseEngine_x64.exe.
  • If data entities display in Selected Files and Entities with a warning after you click Upload during the import process, verify and fix the mapping of individual data entities by clicking View map. Update the mapping and click Save for each data entity.

During data entity import:

  • If data entities fail (shown with a red X or yellow triangle icon on the data entity tile) after you click Import, click View staging data on each tile under the Execution summary page to review the errors. Sort and scroll through the records with Transfer status = Error to display the errors in the Message section. Download the staging table. Fix a record (or all records) directly in staging by clicking Edit, Validate all, and Copy data to target, or fix the import file (not staging file) and reimport the data.
  • If data entities fail (shown with a red x or yellow triangle icon on the data entity tile) after you click Import, and View staging data shows no data, go back to the Execution summary page. Go to View execution log, select the data entity, and review the Log text, Staging log details, and Infolog for more information. Staging log details will display Error column (field) details and Log description will describe errors in detail.
  • If data entities fail, check the import file to see whether it contains an extra line with text that reads, "This is a string that is inserted into Excel as a dummy cell to make the column to support more than 255 characters. By default, an Excel destination component will not support more than 255 characters. The default type of Excel will be set based on the first few rows". This line is added during data export. If this line exists, delete it, re-package the data entity, and try the import again.
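Stripping that extra line can be scripted instead of edited by hand. The following is a minimal sketch; the marker matches the beginning of the sentence quoted above, and the sample file content is made up.

```python
DUMMY_MARKER = "This is a string that is inserted into Excel as a dummy cell"

def strip_dummy_lines(text: str) -> str:
    """Drop the dummy-cell line that the export process may append to a file."""
    return "\n".join(line for line in text.splitlines()
                     if DUMMY_MARKER not in line)

cleaned = strip_dummy_lines(
    "TAXCODE,NAME\n"
    "VAT25,Standard rate\n"
    "This is a string that is inserted into Excel as a dummy cell ...\n"
)
```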

Features flighted in data management and enabling flighted features

The following features are enabled via flighting. Flighting is a concept that allows a feature to be ON or OFF by default.

  • DMFEnableAllCompanyExport - Enables BYOD export from all companies in the same export job (supported for BYOD only, not files). By default, this is OFF. This flight is no longer needed after Platform update 27, because the feature can be turned ON by using a parameter in the data management framework parameters.
  • DMFExportToPackageForceSync - Enables synchronous execution of the data package API export. By default, it is asynchronous.
  • EntityNamesInPascalCaseInXMLFiles - Enables behavior where entity names are in Pascal case in the XML files for entities. By default, the names are in uppercase.
  • DMFByodMissingDelete - Enables the old behavior where, under certain conditions, certain delete operations were not synced to BYOD by using change tracking.
  • DMFDisableExportFieldsMappingCache - Disables the caching logic used when building the target field mapping.
  • EnableAttachmentForPackageApi - Enables attachments functionality in the package API.
  • FailErrorOnBatchForExport - Enables fail on error at the execution unit or level for export jobs.
  • IgnorePreventUploadWhenZeroRecord - Disables the 'prevent upload when zero records' functionality.
  • DMFInsertStagingLogToContainer - By default, this is ON. It improves performance and fixes functional issues with error logs in the staging table.
  • ExportWhileDataEntityListIsBeingRefreshed - When enabled, additional validations are made on mappings when a job is scheduled while an entity refresh is in progress. By default, this is OFF.
  • DMFDisableXSLTTransformationForCompositeEntity - Disables the application of transformations on composite entities.
  • DMFDisableInputFileCheckInPackageImport - By default, additional validations ensure that an error message is shown if any entity file is missing from a data package. If required, this behavior can be turned OFF by using this flight.
  • FillEmptyXMLFileWhenExportingCompositeEntity - Prior to Platform update 15, when a composite entity that had no records was exported, the generated XML file had no schema elements. Enabling this flight changes the behavior to output an empty schema. By default, the behavior is still to not output the schema elements.
  • EnableNewNamingForPackageAPIExport - A fix was made to ensure that unique names are used for the execution ID when ExportToPackage is used for export scenarios. Previously, duplicate execution IDs were created when ExportToPackage was called in quick succession. To preserve compatibility, the fix is OFF by default. Turning this flight ON enables the new naming convention, which ensures unique execution IDs.
  • DMFDisableDoubleByteCharacterExport - A fix was made to ensure that data can be exported when the format is configured to use the code page 932 setting. If an issue is encountered in relation to double-byte exports, the fix can be turned off by disabling this flight.
  • DisablePendingRecordFromJobStatus - A fix was made to ensure that pending records are taken into consideration when the final status of an import job is evaluated. If an implementation depends on the old status evaluation logic and this change is a breaking change for it, the new logic can be disabled by using this flight.
  • DMFDisableEnumFieldDefaultValueMapping - A fix was made to ensure that default values set in advanced mapping for enum fields are saved in the data package manifest file when the data package is generated. This makes it possible to use the data package as a template for integrations when such advanced mappings are used. The fix is protected by this flight and can be disabled if the previous behavior (always setting the value to 0 in the manifest) is still needed.
  • DMFXsltEnableScript - This flight applies only to Platform update 34 and non-production environments. A fix in Platform update 34 prevented scripting in XSLT, which broke some functionality that depended on scripting. As a result, Microsoft turned this flight ON in all production environments as a preventive measure. In non-production environments, customers must add it themselves if they encounter XSLT failures related to scripting. From Platform update 35 onward, a code change reverted the Platform update 34 change, so this flight no longer applies. Even if you enabled this flight in Platform update 34, upgrading to Platform update 35 will have no negative impact.
  • DMFExecuteSSISInProc - This flight is OFF by default. It relates to a code fix that runs SQL Server Integration Services (SSIS) out of process to optimize SSIS memory utilization when DIXF jobs run. However, the change caused a regression: if the DIXF data project name contains an apostrophe ('), the job fails with an error. If you encounter this issue, removing the apostrophe from the data project name resolves the failure. If the name cannot be changed, this flight can be enabled to work around the error. Enabling this flight makes SSIS run in-process as before, which could lead to higher memory consumption when DIXF jobs run.

The following steps enable a flight in a Tier-1 environment by executing a SQL command against the environment's database. To enable flights in a production or sandbox environment, a support case must be logged with Microsoft.

  • After running the SQL statement, ensure that the following key is set in the web.config file on each of the AOS instances: add key="DataAccess.FlightingServiceCatalogID" value="12719367"

  • After making the above change, perform an IISReset on all AOS instances.


The SQL statement uses the following field values:

  • Partition - The partition ID from the environment, which can be obtained by selecting any record. Every record has a partition ID that must be copied and used here.

  • RecId - The same ID as the partition. However, if multiple flights are enabled, this can be the partition ID + n to ensure that the value is unique.

  • RecVersion - Set to 1.
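The SQL statement itself is not reproduced on this page. As a hedged reconstruction only: the table name SYSFLIGHTING and the column list below are assumptions inferred from the field descriptions above and the catalog ID 12719367, so verify them against official guidance before running anything against a real database.

```python
# Hypothetical reconstruction of the flighting INSERT statement.
# Table and column names are assumptions, not confirmed by this article.
def flighting_insert(flight_name: str, partition_id: int, rec_id: int) -> str:
    """Build the (assumed) INSERT statement that enables one flight."""
    return (
        "INSERT INTO SYSFLIGHTING "
        "(FLIGHTNAME, ENABLED, FLIGHTSERVICEID, PARTITION, RECID, RECVERSION) "
        f"VALUES ('{flight_name}', 1, 12719367, {partition_id}, {rec_id}, 1)"
    )

# Partition and RecId values here are placeholders; read them from your environment.
stmt = flighting_insert("DMFEnableAllCompanyExport", 5637144576, 5637144577)
```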

Additional resources

  • Data entities overview

