This DocBits feature gives you an alternative to model-based classification: it allows you to write regular expressions that are searched for in a document and used to classify it by document type, among other purposes.
Document Type: The Regex Manager allows you to write regular expressions that are then searched for in the document. If a match is found for the regex defined for a document type, the document is classified as that document type. For example, if you wrote a regular expression to find “Gutschrift” and DocBits found this term in a document, it would classify that document as a credit note.
Document Origin: This lets DocBits determine the country of origin of a document through regular expressions. For example, if a regular expression for a Spanish document contains the term “Factura” and DocBits finds this term in a document, it knows that the document is of Spanish origin and classifies it as such.
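As a rough illustration of the idea (this is not DocBits’ internal implementation, and the patterns and document types are examples only), a regex-based classifier could look like the following sketch:

```python
import re

# Rough illustration only - not DocBits' internal code. The patterns and
# document types stand in for what you would define in the Regex Manager.
CLASSIFICATION_PATTERNS = {
    "credit_note": re.compile(r"\bGutschrift\b", re.IGNORECASE),  # German credit note
    "invoice_es": re.compile(r"\bFactura\b", re.IGNORECASE),      # Spanish invoice
}

def classify(document_text):
    """Return the first document type whose pattern matches the document text."""
    for doc_type, pattern in CLASSIFICATION_PATTERNS.items():
        if pattern.search(document_text):
            return doc_type
    return None

print(classify("GUTSCHRIFT Nr. 4711"))  # -> "credit_note"
```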
To find this feature in DocBits, from your Dashboard, navigate to Settings → Global Settings → Document Types. Within each of the created document types, there is a “Regex” option.
By clicking on “Regex” you will be taken to this menu, which displays the existing regex models that have been created as well as an “ADD” button for you to create new regex models.
DocBits excels in adapting document layouts according to their geographical origins while standardizing elements like currency formats based on user browser settings. Let’s explore how you can leverage the Layout Builder to customize layouts for different origins, such as the U.S. and Germany.
Currency and Format Standardization: Regardless of the original document’s currency or format, DocBits converts these elements into a standardized ISO format on the server, in line with the user’s browser settings.
Geographical Layout Customization: The system allows customization of document layouts based on their geographical origin. This means you can define specific fields and formats for documents from different countries.
U.S. Layout: For a U.S. invoice, you might include fields for city tax, aligning with the common tax structure in the U.S.
Germany Layout: In contrast, a German invoice layout may omit the city tax field, as it’s not a standard charge in Germany.
Select Origin Layout: In the Layout Builder, choose the base layout corresponding to the document’s origin.
Customize Fields: Adapt the layout by adding or removing fields. For instance, include ‘City Tax’ for a U.S. layout.
Apply and Test: Once customized, apply the layout to your documents and test to ensure accuracy.
Understand Regional Differences: Familiarize yourself with the tax and format nuances of different regions.
Consistent Updates: Regularly update your layouts to reflect any changes in regional regulations.
User Feedback: Utilize feedback from users in different regions to refine layouts further.
First of all, ensure that the Layout Builder feature is activated. This can be done by navigating to Settings → Document Processing → Module → Document Type and ensuring that the Layout Builder slider is set to active, as shown below.
After this is done, you can access the Layout Builder via Settings → Document Types. Once on this page, you can select from the various document types you have created and either select “Edit Layout” as shown below
or if you have sub-document types within a created document type you can select “Document Sub Types” and select “Edit Layout” for the sub document type layout you wish to edit as shown below.
After following the previous steps you will reach a page like the one shown below.
In order to upload a document to the Layout Builder, simply navigate to the right side of the screen
Click on the “Upload Documents” button or drag and drop your desired document into the provided area
Groups can be created by selecting the following icon.
Groups allow you to create different sections on a layout, which makes it easier to separate different sets of data or information and keeps the layout easy to follow. You can give each group a title so that users know what information they will find in that group.
These are a set of default fields that can be dragged and dropped into the layout builder and are available to you to create your desired layout. These include:
Text – This is a text box which creates a field in the layout that can have text entered into it once on the validation screen.
Label – This is a field that can be used to create uneditable text, for example sub-headings or any other fixed text on the validation screen.
Checkbox – This creates a boolean type field which can be checked or unchecked.
Multi Checkbox – This functions the same way as the “Checkbox” but can be used when the user knows they will be adding multiple checkboxes in one section.
Horizontal Separator – This creates a horizontal line on the layout that can be used to split up sections within a group on the layout.
Table of Checkboxes – This lets the user create a table of checkboxes consisting of custom x- and y-axis values, e.g.:
Button – This creates a clickable button on the validation screen within the layout that can be set to one of three functions: Export, Export mit Sonderwunsch (export with a special request) or Reject.
Extracted Tables – This allows you to place an area on the document layout that illustrates the table that gets extracted from the document. For more information, click here.
Invoice Buttons – This element lets you drag and drop a set of buttons that are optimized for invoices. On the validation screen, when you select the invoice type (either cost or purchase), the PO Matching or Auto Accounting will disappear accordingly.
QR Code Fields – This element allows you to drag and drop a block that will display all the extracted information from a document when a QR code is present.
Users are able to create their own custom groups and fields for a document type. This can be done when originally creating a document type, but also by selecting “Fields” on the Document Types page in Settings.
In order to create the above space on the layout, a “Label” from the Form Elements must be used in a special way. The reason for this is that the Layout Builder operates on a 100-spaces-per-line system, where 1 space represents 1 percent of a line; fields can therefore take up at most 100 spaces per line, as shown below.
This means that the user must build the layout line by line according to this rule. For example, let’s say you would like to add the fields “Name” and “Date” on the same line, but would like the “Name” field to be larger. This can be done by dragging and dropping the “Text” field from the Field Elements drop-down and naming the fields “Name” and “Date” as shown.
The problem is that both fields now have the same size of 33 (the default size of all dragged-and-dropped fields), but you would like the “Name” field to be larger than the “Date” field and both fields to take up the entire line of the layout. Following the 100 percent rule, you can set the “Name” and “Date” fields to any combination that adds up to 100. How you split the line depends on how large you would like each field; for the purpose of this example we will set the “Name” field to 70 and the “Date” field to 30. The result is:
This same rule applies to all fields in the Layout Builder.
Now that this rule has been explained, creating blank spaces will make more sense. As previously mentioned, in order to create a blank space you have to use a “Label” from the Form Elements.
For example, let’s say that you would like to create a blank space between these two fields.
Step one is to drag and drop a “Label” between these two fields. Once added, click on the “Label” field you just added; on the left you will be presented with the properties of the field. Now, in the same way you would create or change the name of a field as shown previously, remove any name from the “Label” property, like so:
The result of doing this will then be:
There is now a gap between the two fields. This gap can be extended or shortened according to the 100 percent rule discussed earlier, and with these functions you can create any desired layout.
A guide to importing master data from INFOR LN.
A guide to importing master data from INFOR M3.
Functioning LN to DocBits dataflow
Correctly configured DocBits environment
In Infor, open the ION Desk application. In the left tab, go to Connect → Connection Points
This is where you will create the two connection points needed to import your data from LN that is required for Auto Accounting.
Click on “+ADD” to create a new connection point and select the API option as shown below.
You will need to configure two separate API connection points, namely:
ChartOfAccounts
FinalFlexDimensions
The connection tab for your ChartOfAccounts connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
You will need to add two BODs in this section for this connection point: Sync.ChartOfAccounts and Sync.CodeDefinition. To add these BODs, do the following:
Sync.ChartOfAccounts
Click on the PLUS (+) icon
Select “Send to API”
Search for the Sync.ChartOfAccounts BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Under Product, select the API Endpoint you created in ION API for the environment you are working with. Search for the following API call, select it and press OK.
Next, switch to the Request Body tab.
This is where you will configure the field mapping for this BOD; your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.ChartOfAccounts BOD. Click on the PLUS icon to add the next and final BOD.
Sync.CodeDefinition (TotalFlexDimensions)
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab.
This is where you will configure the field mapping for this BOD; your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the TotalFlexDimensions master data table.
The connection tab for your FinalFlexDimensions connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
You will need to add one BOD in this section for this connection point: Sync.CodeDefinition. To add this BOD, do the following:
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab.
This is where you will configure the field mapping for this BOD; your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the FinalFlexDimensions master data table.
You will need to configure two separate data flows for Auto Accounting:
ChartOfAccounts
FinalFlexDimensions
An overview of this data flow looks as shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the LN company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following documents need to be added to the data flow:
Sync.ChartOfAccounts
Sync.CodeDefinition
This is where you add the ChartOfAccounts API connection point which you created earlier; the configuration should look similar to this:
An overview of this data flow looks as shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the LN company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following document needs to be added to the data flow:
Sync.CodeDefinition
DocBits (FlexDimensions)
This is where you add the FinalFlexDimensions API connection point which you created earlier; the configuration should look similar to this:
Once all the above is completed, you will need to navigate to Infor LN and trigger the BODs in order for the various master data you need for Auto Accounting to arrive in DocBits.
From the above menu, in the left menu tab, select Common → BOD-Messaging → Publish BODs → Publish Financial Master Data. From the following menu you will find the FlexDimensions and ChartOfAccounts BODs to publish.
Select the following BODs to publish by simply checking each box; no other changes need to be made, as we want to publish all of these BODs so that the master data is complete in DocBits.
Once both of the above BODs are selected, navigate to the Options tab
Once on the Options tab, select the following options and press PROCESS to publish the BODs.
Once this is done you should see the three separate master data tables in your DocBits environment(s) under Master Data Lookup:
chartofaccounts
totalflexdimensions
finalflexdimensions
Functioning M3 to DocBits dataflow
Correctly configured DocBits environment
In Infor, open the ION Desk application. In the left tab, go to Connect → Connection Points
This is where you will create the connection point needed to import your data from M3 that is required for Auto Accounting.
Click on “+ADD” to create a new connection point and select the API option as shown below.
You will need to configure the API connection point called:
ChartOfAccounts
The connection tab for your ChartOfAccounts connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
You will need to add two BODs in this section for this connection point: Sync.ChartOfAccounts and Sync.CodeDefinition. To add these BODs, do the following:
Click on the PLUS (+) icon
Select “Send to API”
Search for the Sync.ChartOfAccounts BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Under Product, select the API Endpoint you created in ION API for the environment you are working with. Search for the following API call, select it and press OK.
Next, switch to the Request Body tab.
This is where you will configure the field mapping for this BOD; your configuration should look like the following. The field mappings are available here.
Once you have completed the above steps, you will have successfully configured the Sync.ChartOfAccounts BOD. Click on the PLUS icon to add the next and final BOD.
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, and import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab.
This is where you will configure the field mapping for this BOD; your configuration should look like the following. The field mappings are available here.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the M3FlexDimensions master data table.
You will need to configure the following data flow for Auto Accounting:
ChartOfAccounts
An overview of this data flow looks as shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the M3 company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following documents need to be added to the data flow:
Sync.ChartOfAccounts
Sync.CodeDefinition
For the second route of the dataflow (according to the routing in the data flow), we apply a filter with the following configuration.
This is where you add the ChartOfAccounts API connection point which you created earlier; the configuration should look similar to this:
Once all the above is completed, you will need to navigate to Infor M3 and trigger the BODs in order for the various master data you need for Auto Accounting to arrive in DocBits.
Start by pressing Command + R to open the prompt menu, then type “evs006” and press Enter.
The following menu will be displayed to you
To add the various BODs you will need to enter the BOD nouns and Table names for each BOD individually.
The BODs you need to add include:
ChartOfAccounts
CodeDefinition
CodeDefinitionAccountingDimension
To add a new BOD, after entering the BOD Noun and Table Name, press the PLUS icon indicated below.
The BOD nouns and Table names are as follows.
ChartOfAccounts
BOD Noun: ChartOfAccounts
Table Name: FCHACC
CodeDefinition
BOD Noun: CodeDefinitionAccountingDimension
Table Name: FCHACC
After adding each BOD, right click on the BOD you added, select Related and then Run.
You will be taken to this screen.
Change BOD Verb to “sync” and press NEXT.
Once you press NEXT, you will get a notification indicating that the BOD publishing process has begun.
In order to import the m3costingelement table into DocBits, you need to do the following.
From the M3 homepage, press Command + R and search for “PPS280”.
Select any of the lines displayed to you. On the next menu, select TOOLS and “Export to Excel”
Select “Export all Rows” and then press EXPORT.
Once downloaded, you will need to alter the Excel file before converting it into a CSV file.
Open the Excel file; it will look similar to what is shown below.
From this Excel sheet you only need the first two columns; alter the sheet so that the end result looks as follows.
Once this is done, save the file as a CSV.
Once you have your CSV file, go to the following webpage. This depends on which environment you are using:
Prod: http://api.docbits.com/
Sandbox: http://sandbox.api.docbits.com/
Stage: http://stage.api.docbits.com/
Demo: http://demo.api.docbits.com/
Dev: http://dev.api.docbits.com/
Here you will manually upload the CostingElement table via an API. Click on the Authorise button.
Here you will need to insert the API Key from your DocBits environment. This is located in Settings under Integration.
Search for the API called master_data_lookup/import_data and fill in the required information. Once complete, click EXECUTE to trigger the API.
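If you prefer to trigger the same import from a script instead of the Swagger page, a minimal sketch with Python’s requests library could look like the following. The parameter names (“table_name”, “file”) and the “X-API-key” header are assumptions; treat the Swagger page of your environment as the source of truth for the exact request schema.

```python
import requests

# Hedged sketch of calling master_data_lookup/import_data outside the Swagger UI.
# Parameter names and the auth header are assumptions - check the Swagger page.
url = "https://api.docbits.com/master_data_lookup/import_data"  # use your environment's base URL
headers = {"X-API-key": "<your DocBits API key>"}

with open("m3_costing_elements.csv", "rb") as csv_file:
    response = requests.post(
        url,
        headers=headers,
        data={"table_name": "m3costingelement"},  # assumed parameter name
        files={"file": csv_file},                 # assumed parameter name
    )
response.raise_for_status()
print(response.status_code)
```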
If done correctly, the M3CostingElement table should now be in your DocBits environment. Auto Accounting for M3 has now been configured for your environment.
If a customer requires a new document type or additional fields to be added to an existing document type layout, this section will go through all the information required to do so.
In DocBits you will find the SETTINGS menu in the upper bar on the DASHBOARD.
If you are logged in to DocBits as an admin, you will find all fields of a document that can be extracted under the respective document type.
Open the menu for Document Types.
In the following overview you will find all standard document types available for you
Activate/Extraction Type
To the right of each document type, you will see Activate and Extraction Type sliders.
Activate: This document type is active in your DocBits environment.
Extraction Type: This slider allows you to enable or disable a set of predefined rules for the document type when it is processed by DocBits. By selecting the gear icon to the right of the slider, the following menu will appear.
To see which fields can be extracted, for example from an invoice, click on FIELDS for this document type.
Field Settings
Here you will find all the fields that can be extracted
You can also CREATE FIELDS like freight, postage or any field with an amount you want to extract from your invoices.
For each field you can check the boxes if they are:
REQUIRED: Here you can define if the field must contain a value to continue.
READ ONLY: Here you can define if a field can only be displayed but not edited.
HIDDEN: Here you can define whether a field should be hidden or displayed in the extraction view.
FORCE VALIDATION: Here you can define whether a field must always be validated manually, even if it has been read 100% by DocBits.
OCR and MATCH SCORE: Setting as described below, per field.
FORMULA: Creation of a formula per field.
Once all the settings have been made, confirm them with the SAVE SETTINGS button at the bottom of the page; otherwise the settings will not be applied.
Recognition Settings
OCR
Here you can set the sensitivity of the OCR (Optical Character Recognition) function for all fields at once. This value determines the sensitivity with which a field is marked in red if it could not be extracted with 100% certainty (OCR related!).
Match Score
This is where you can set the sensitivity of the MATCH SCORE function for all fields at once. This value determines when a field is marked in red if DocBits has not extracted the field with 100% probability. In this case the field needs to be validated manually.
The RESTORE DEFAULTS button will reset both values to “50”.
Profile
Here you can define the profile that shall be used: either Default or ZUGFeRD. The ZUGFeRD profile contains predefined fields that are mandatory for this type of invoice. If you do not explicitly use ZUGFeRD, please select “Default”.
Format: JSON
Purpose: This step involves defining the structure of the EDI data. It includes specifying segments such as SAC, N1, and PO1, and details the fields contained within each segment. For segments that contain nested structures, loops are defined to properly organize the data hierarchy.
Format: XSLT
Purpose: This step involves transforming the structured JSON data into a structured XML format, specifically tailoring the output to meet the requirements for further processing or integration. This transformation helps in extracting precise information like acknowledgement types, order details, and conditional elements based on specific values.
Format: XSLT (outputting HTML)
Purpose: Converts the XML data from Step 2 into an HTML format for previewing the transformed data in a readable and visually appealing format. The HTML layout includes styles for presentation and structures data like purchase orders, supplier details, and order terms for easy viewing.
Format: JSON
Purpose: Specifies JSON paths for extracting key values from the XML data produced in Step 2. These paths are used to retrieve specific data points such as purchase orders and currency, which are crucial for downstream processing and integration into other systems.
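As a small, purely illustrative sketch of this extraction step (the document structure and paths below are placeholders; the real JSON paths depend on the structure produced by your Step 2 transformation):

```python
from jsonpath_ng import parse  # pip install jsonpath-ng

# Illustrative only: the parsed document and the paths are placeholders.
order_document = {
    "OrderHeader": {"PurchaseOrderNumber": "PO-1001", "Currency": "EUR"},
}

purchase_orders = [m.value for m in parse("$.OrderHeader.PurchaseOrderNumber").find(order_document)]
currencies = [m.value for m in parse("$.OrderHeader.Currency").find(order_document)]
print(purchase_orders, currencies)  # ['PO-1001'] ['EUR']
```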
This updated sequence ensures a thorough process, transforming raw EDI data into structured, actionable information using JSON for data structuring, XSLT for transformation and HTML preview, followed by JSON paths for data extraction and integration.
You will need to create the DocBits API connection point in order to create the data flow later.
First, in InforOS, navigate to ION Desk → Connect → Connection Points
Once here, you will need to create a new connection point.
Select API
Give the connection point a name and description that describes its nature and its environment. Under the Connection tab, import the service account you created for the environment you are working with.
Next, switch to the Documents tab. You will need to add the following BODs to the connection point.
Ack-SupplierInvoice
This BOD is used to signal in DocBits that an error has occurred within Infor. The configuration for this BOD should look similar to the following (the API Call Name changes per BOD).
Sync.PurchaseOrder
The configuration for this BOD should look similar to the following
Sync.ReceiveDelivery
The configuration for this BOD should look similar to the following
Once these BODs are configured, you can save the connection point by pressing the icon located to the right of the back button.
The data flow will look similar to the following
(The reason for multiple DocBits APIs is that each connection represents a different environment; depending on the number of environments you have, your data flow could differ slightly.)
For the purpose of this explanation we will use the example of having four separate environments.
The start of the data flow consists of your LN application
Here you will add an application and select the DocBits API(s) you created earlier
The configuration should look as follows
Once all the above is completed, you will need to navigate to Infor LN and trigger the BODs in order for the various master data you need for Suppliers and Purchase Orders to arrive in DocBits.
From the above menu, in the left menu tab, select Common → BOD-Messaging → Publish BODs → Publish Order Management Transactional Data
Select the PurchaseOrder tab and check the box.
From the LN homepage, in the left menu tab, navigate to Common → BOD-Messaging → Publish BODs → Publish Logistics Master Data
Select the PartyMaster tab and check the Supplier → Buy-from or SupplierPartyMaster box.
Once all the correct BODs have been checked for publication, select the Options tab.
The following options should be selected.
Once this is complete, press the PROCESS button and the BODs will be triggered. A message will appear on screen to notify you that the BODs have been triggered.
If done successfully, the Supplier and Purchase Order tables should now be available under Settings → Master Data Lookup.
Open ION Desk → Connect → Connection Points
You will need four connection points for this data flow: three API connection points for the different tax code categories (full, reduced and free) and an Application connection point representing your LN company.
In order to create new connection points, select the “+ADD” button.
Select “API” at the bottom of the list of options
You will be taken to the following page
This is where you will enter all the details of the TaxCode connection point. For each of the three connection points you will be creating, do the following:
Enter a Name: TaxCodeFull, TaxCodeReduced, TaxCodeFree
Description: This can be the same as the Name or similar
Import a service account you created.
Switch to the “Documents” tab and select the PLUS icon to add the BOD we need, like below
Search for the BOD
Search for the BOD called “Sync.LnTaxCode”, click on it and press “OK” to add the BOD.
Move on to the ION API section. Under API Call Name you can use the name of the BOD, Sync.LnTaxCode
Press the “SELECT” button
Select the API you configured for the environment you are working with and search for the following API. Once you have selected it, press “OK”.
Next, switch to the Request Body tab.
This is where there will be a slight change for each connection point: the field mappings you assign to each tax code differ slightly.
In the field_mappings row, under value, is where you will put the specific field mappings for the specific tax code connection point you are creating (full, reduced or free). These mappings are available at https://docbits.com/doc/field-mappings/.
The end result should look the same or similar to the image above. Once this is done, click the SAVE option located here.
Navigate to ION Desk → Connect → Data Flows
Click on “+ADD” and select document flow
Create the following data flow by dragging and dropping the components from the top menu.
Here is where you will select your LN company, the final result should look similar to the following
This is where you will add the Sync.LnTaxCode BOD from earlier, the result looks as follows
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
This is where you select the API connection points you created earlier, this is done by selecting the API under the “Select ION API Connector” drop down menu.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
Once all the above is done, SAVE and ACTIVATE the data flow by pressing the following buttons
Open LN in Infor
Navigate to Common → BOD Messaging → Publish BODs → Publish Financial Master Data
Select MORE and click on LnTaxCode
Tick the checkbox to select the LnTaxCode BODs
Navigate back to the OPTIONS tab, your configuration should look as follows
When you would like to publish the BODs, select PROCESS.
The end result should give you a similar table in your DocBits environment.
You will need to create the DocBits API connection point in order to create the data flow later.
In InforOS, navigate to ION Desk → Connect → Connection Points
Once here, you will need to create a new connection point.
Select API
Give the connection point a name and description that describes its nature and its environment. Under the Connection tab, import the service account you created for the environment you are working with.
Next, switch to the Documents tab. You will need to add the following BODs to the connection point. Not all of them are necessary for the supplier and purchase order master data, but they will be useful when other features such as Auto Accounting need to be implemented.
For now we will only focus on the necessary BODs, these being: Sync.RemitToPartyMasterData, Sync.SupplierPartyMaster and Sync.PurchaseOrder.
Sync.RemitToPartyMasterData and Sync.SupplierPartyMaster
The configuration for these two BODs should look similar to the following (API Call Name changing for each)
Sync.PurchaseOrder
The configuration for this BOD should look similar to the following
Once these BODs are configured, you can save the connection point by pressing the icon located to the right of the back button.
The data flow will look similar to the following
(The reason for multiple DocBits APIs is that each connection represents a different environment; depending on the number of environments you have, your data flow could differ slightly.)
For the purpose of this explanation we will use the example of having four separate environments.
The start of the data flow consists of your M3 application
Configuration of the filter looks as follows
(The accounting entity ID of course being unique to your organization)
Here you will add an application and select the DocBits API(s) you created earlier
The configuration should look as follows
Navigate to the Infor M3 application
Once at the main menu, press Command + R to open the command prompt search box. Then type evs006 and search.
Once on this page, you will need to add the SupplierPartyMaster, RemitToPartyMaster and PurchaseOrder to the list.
BOD noun: SupplierPartyMaster
Table: CIDMAS
BOD noun: RemitToPartyMaster
Table: CIDMAS
BOD noun: PurchaseOrder
Table: MPHEAD
For each case you will need to press the plus icon to add them to the list.
After you have added each of the BODs, right click on the BOD noun of the BOD and select Related → Run
You will be taken to the following menu, where you will need to change BOD verb to Sync and then press NEXT to trigger the BODs.
Once you trigger the BODs, you will get a notification confirming this.
If done successfully, the Supplier and Purchase Order tables should now be available under Settings → Master Data Lookup.
A guide to exporting documents in DocBits.
A guide to training new documents with DocBits.
The M3 export mapping file is divided into 5 sections, and each section is further divided into 2 subsections:
Header
Header Static Fields
Header Fields
Tax Lines
Tax Line Static Fields
Tax Line Fields
Receipt Lines
Receipt Line Static Fields
Receipt Line Fields
Order Charge Lines (Additional Amounts)
Order Charge Static Fields
Order Charge Fields
Cost Lines
Cost Line Static Fields
Cost Line Fields
Adding New Field:
First, add the M3 API field name to the relevant section’s fields list property (e.g. StaticFields, HeaderFields, InvoiceTaxFields).
Then define the static value or document field name for the API field, using the appropriate prefix for the section.
Example 1: To define a static value of AAA for the M3 API field DIVI, first add DIVI to the StaticFields property. Then add the line SF_DIVI = AAA, as SF_ is the prefix for static fields.
Example 2: To map the header field IVDT (invoice date) to the DocBits field invoice_date, first add IVDT to the HeaderFields property. Then add the line HF_IVDT = invoice_date, as HF_ is the prefix for header fields.
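Putting the two examples together, a header section of the mapping file might look like the sketch below. SUNO is used purely as a placeholder M3 field, and the exact list syntax should be copied from your existing mapping file rather than taken from this sketch.

```
StaticFields = DIVI
HeaderFields = IVDT, SUNO
SF_DIVI = AAA
HF_IVDT = invoice_date
HF_SUNO = supplier_id
```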
Removing Field:
Just remove the field from the section’s fields list property and remove the line defining the value for the field.
Available M3 fields can be checked by opening the appropriate screen in M3.
Similarly, you can get the field names for lines.
Fields List Property: StaticFields
Section Fields Prefix: SF_
Available Fields: You can map any M3 api field with any static value
Fields List Property: HeaderFields
Section Fields Prefix: HF_
Available Fields: You can map any DocBits field to any M3 api field
Fields List Property: InvoiceTaxStaticFields
Section Fields Prefix: IT_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceTaxFields
M3 Fields Prefix: ITF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: line_number, tax_amount, tax_rate, net_amount, gross_amount, tax_code_full, tax_code, tax_country
Fields List Property: InvoiceReceiptStaticFields
Section Fields Prefix: IR_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceReceiptFields
M3 Fields Prefix: IRF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: packing_slip, purchase_order, line_number, line_sequence, delivery_number, delivery_line, amount, quantity, total_net_amount
Fields List Property: OrderChargeStaticFields
Section Fields Prefix: OC_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: OrderChargeFields
M3 Fields Prefix: OCF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: ledger_account, dimension_2-7, amount, quantity, quantity2, position
Fields List Property: InvoiceCostStaticFields
Section Fields Prefix: IC_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceCostFields
M3 Fields Prefix: ICF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: ledger_account, dimension_1-12, amount, quantity, quantity2, position
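Read together with the prefixes above, a tax-line mapping could plausibly look like the short sketch below. This is a hedged interpretation rather than a verified configuration, and VTAM and TAXC are placeholder M3 field names; copy the exact syntax from your existing mapping file.

```
InvoiceTaxFields = VTAM, TAXC
ITF_VTAM = TF_tax_amount
ITF_TAXC = TF_tax_code
```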
Once at the home screen, click on the burger menu and select ION API
After opening ION API, click on Available APIs in the left menu
Click on “+ADD” block
Then “+ Create New”
The information you insert should look like this
FYI: The description mentions multiple environments because this endpoint will be used for several environments; the icon and its color always remain the same.
Next, select the + at the bottom of the screen
This Target Endpoint URL can be found at doc2api.cloudintegration.eu
The information underneath this field should look as follows.
Once you have filled in this information, to the right of these fields there is a “Target Endpoint Security” field with a drop down. Select API Key from this drop down.
A table will then appear underneath this drop-down; fill in the following information. The key value is specific to the customer and environment and can be found within DocBits.
From the Dashboard → Settings → Integration → API Key
Copy and paste this into the Key Value field in InforOS
Once this is complete, press the following icon to save the configuration
You are not yet completely done with the configuration.
Go back into the API you just configured and enter the details like below
Go to the Documentation tab at the bottom and click on the +
Enter the following details:
Name = DocBits-”environment”
Type = Swagger
URL = go to doc2api.cloudintegration.eu; once on this page, open the following link
Copy the URL and use it for the URL field in InforOS
Save it once you have entered the information for all the fields. There should be a loading icon for a while but the end result should look like this
The same process would be used to create the endpoints for other environments.
FYI: If in the future you are struggling to find these endpoints, in ION API go to API Metadata and click on this icon to refresh the API metadata.
From the DocBits Dashboard of the required customer, go to Settings → Export. To add a new document type for export, do as follows:
Click on the “+ New” button
Select “Infor IDM + ION BOD”
You will then be taken to this menu where you need to give the new exportable document type a Title, select the document type from the dropdown and add all the necessary Mapping Files (ION, IDM and BOD).
Download a BOD Mapping File and open it in your applicable file editor of choice to edit it. For this walkthrough, VSCode is used.
Change the company to the correct one (SFV_AccountingEntityID) and edit location ID if needed.
Check the document code by going to the field settings of the document type you are trying to export (it can be found in the URL of the field settings of the document type in DocBits, as shown below).
Lastly, edit the SFV_LogicalID, which can be found in INFOR ION Desk → Connect → Connection Points: select the DocBits_Export (or similar) connection point, and within that page you will find the Logical ID you need.
If this Connection Point does not yet exist, you need to create one.
First, go to ION Desk → Connect → Connection Points and click on the “+ Add” button.
Then select the “IMS via API Gateway” option
You will be taken to the above screen where you must now fill in the necessary information, the name should be something like “DocBits_export” or similar.
For “ION API Client ID” you enter the same Client ID you obtained earlier for the ION Mapping File.
Then select the Document tab of the Connection Point creation menu and add the following documents by pressing the “+” sign; this will only become useful later.
Once you save this Connection Point you will obtain the Logical ID as shown below
Then insert this Logical ID into the appropriate section of the BOD Mapping File and save the file.
Drag and drop the file into your export configuration in DocBits. This is available at Settings → Export.
The export module is located on the Dashboard under Settings → Document Processing → Export.
To add a new export configuration, select “+ New”
Select the method you would like to use for your export configuration.
Once you have selected the method you would like to use, you will need to upload the various information and files required for that method of exporting.
Once you have one or many export configurations in your DocBits, you have the option to activate or deactivate configurations depending on your needs.
The configuration below is activated, indicated by the green dot to the left of the configuration name.
To deactivate the export configuration, select the options button to the right of the configuration as shown below.
You are given three options
Deactivate: The export configuration will no longer be functional (indicated by a red dot next to the configuration name).
Edit: Make changes to the details of the configuration.
Delete: Delete the configuration.
Once obtained, open the file in your applicable file editor of choice. For this walkthrough, VSCode will be used.
Check that the document type code is the same as in DocBits (as with the BOD Mapping File, it should match the name of the doc type in the URL of the field settings), and also check the name of the document type as it should appear in Document Manager (IDM) in Infor.
FYI: It states that the name of the document type in IDM is M3_SupplierInvoice, this is due to this being an example from an M3 instance. This can change depending on if you use LN or M3, as well as your specific IDM configuration.
Check the company ID and the Entity ID (SF_MDS_EntityType); this value should be the same as in the BOD Mapping File.
Ensure that each mapping follows the IndexFieldFromDocBits=IDMAttributeID pattern (check that the DocBits field on the left matches the field settings and the IDM attribute on the right matches Document Type → Attributes), as illustrated below.
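As a concrete illustration of the IndexFieldFromDocBits=IDMAttributeID convention, the lines below use purely hypothetical names; take the real ones from your field settings in DocBits and from Document Type → Attributes in IDM.

```
invoice_id=InvoiceNumber
purchase_order=OrderNumber
```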
Go to Document Manager and select the name of the current document type you are trying to export, for example, Supplier Invoice.
Click the icon above, then click Administration → Document Type and find the document type you need in the list.
As shown below, you will then see the doc type name as it is in INFOR
Make sure this is how the name is shown in the IDM Mapping File
Once the file is prepared, upload it to your export configuration in DocBits. This is available at Settings → Export.
You have created:
An ION API Endpoint
An ION API File
A BOD Mapping File
An IDM Mapping File
Before you setup the dataflow, you need to import the mapping files into InforOS.
In ION Desk, go to Connect and open Mappings.
Click on the Import icon
From here you need to select the various mapping files you will need which include: SyncCaptDoc_SyncSuppInv, SyncSupplierInvoice_LoadSupplierInvoice, and LoadSupplierInvoice_ProcessSupplierInvoice.
Once you have imported all the mapping files, make sure to approve each of them by clicking the tick icon within each of their squares on the Mapping dashboard.
The next step is to set up the Data Flow in ION Desk. Navigate to the ION Desk application and select Data Flow → + ADD → Document Flow, as shown below.
You will then see this page; this is where you will build the flow of information from DocBits to M3.
An M3 data flow will look similar to what is shown below (there are 3 DocBits applications and APIs due to it being used for 3 separate environments).
All parts of the chain are dragged and dropped from the top section
In the chain, DocBits and M3 are both Applications, whereas in between them there are mappings that convert the data into a form that can be understood by the next section of the chain and “map” the information so that it goes to where it is needed or meant to.
Give it an appropriate name such as “DocBits”, then select the plus sign, search for the connection point you created earlier (such as DocBits_Export or similar) and click on it.
To create this connection point, go to ION Desk → Connect → Connection Points
Click “+ Add”
Select “IMS via API Gateway” and fill in the following information
The ION API Client ID is in the ION API File you created in How to Create an ION API File; it can be found under the “ci” value.
Switch to the document tab, and add the Sync.CaptureDocument BOD to the DocBits connection point like below.
Then save the connection point by pressing the disk icon in the upper-left corner.
Navigate back to the Dataflow section of ION Desk to access your dataflow. Your DocBits application should look similar to what is shown below.
The first mapping node should look as follows
The second mapping node should look as follows
The third mapping node should look as follows
There should already be an M3 or similarly named connection point created in INFOR, so, just like the DocBits Application, you select it by clicking the “+” sign; it should look as follows
You will first need to create this API as a connection point which is done by:
ION Desk → Connect → Connection Points
Click “+ Add” and select API
The information you fill in should look like the following
Switch to the document tab and add the following configuration
Acknowledge.SupplierInvoice
The configuration for this BOD is as follows
Make sure you have selected the “Send to API” option in the above menu.
Lastly ensure that you alter the request body as well to look like this.
Repeat this process for the rest of the BODs, each configuration is shown below.
Sync.PurchaseOrder
The configuration for this BOD is as follows
Sync.RemitToPartyMaster
The configuration for this BOD is as follows
Sync.SupplierPartyMaster
The configuration for this BOD is as follows
The following configurations should look as follows:
The last icon should be empty as it is not carrying any document or information.
Once you have added all necessary nodes to the data flow, press this button to activate the data flow (after saving the data flow by pressing the hard drive icon).
If you only require your document to be exported to IDM or Document Management in INFOR, the configuration is similar to that of the export to IDM + LN/M3 but does not require any BOD Mapping file as there is no export to LN/M3 required.
Select the following option for exporting.
You will be required to upload an ION API file as well as an IDM Mapping file.
How to obtain these has been discussed earlier in this documentation (see ION API file and IDM Mapping file).
To define tables and columns on a document, import a document, open it and go to the table extraction view as already described (via “Line Items”).
You will end up in the following screen, where you can activate the Training Mode:
Via the “Edit” button, table selection mode will be activated and you will be able to edit the document shown on the left side:
In general, you are now able to use the autodetect tables functionality and the system will automatically define the tables on the document:
If the system is not able to find the tables automatically you can manually define the tables on the document like shown in the video below:
Once the tables are defined you can manually define the columns via the following button:
Define and sort the columns via drag and drop on the document. Repeat this for the tables on the other pages as well. As an alternative if the tables are equal on each page you can use the following button to use the same column for all tables defined in the document. This functionality copies the columns that you have defined to all other pages as well:
If the document is very confusing, it can be zoomed in and out using the “Zoom in” and “Zoom out” functionality. This is helpful, for example, when a lot of information is close together and this makes it difficult to define the columns:
If you defined a column by mistake it can be selected by clicking it and deleted via the following button:
Note: The button is only visible if the column is selected.
If you are done defining all tables and all columns on your document use the “Save” button to activate your changes and to extract the data from the document:
The data of all defined areas will be extracted and shown on the right side:
In the next section you will learn how to adjust the data that has been extracted.
You have created:
An ION API Endpoint
An ION API File
A BOD Mapping File
An IDM Mapping File
Before you set up the data flow, you need to import the mapping files into InforOS
In ION Desk, go to Connect and open Mappings.
Click on the Import icon
From here you need to select the various mapping files you will need which include: SyncCaptDoc_SyncSuppInv, SyncSupplierInvoice_LoadSupplierInvoice, and LoadSupplierInvoice_ProcessSupplierInvoice.
Once you have imported all the mapping files, make sure to approve each of them by clicking the tick icon within each of their squares on the Mapping dashboard.
The next step is to set up the Data Flow in ION Desk. Navigate to the ION Desk application and select Data Flow → + ADD → Document Flow, as shown below.
You will then see this page; this is where you will build the flow of information from DocBits to LN.
An LN data flow will look similar to what is shown below (there are multiple paths due to each individual path being meant for a specific document type, for this explanation we will focus on the invoice data flow).
All parts of the chain are dragged and dropped from the top section
In the chain, DocBits and LN are both Applications, whereas in between them there are mappings that convert the data into a form that can be understood by the next section of the data flow and “map” the information so that it goes to where it is needed or meant to.
Give it an appropriate name such as “DocBits”, then select the plus sign, search for the connection point you created earlier (such as DocBits_Export or similar) and click on it.
To create this connection point, go to ION Desk → Connect → Connection Points
Click “+ Add”
Select “IMS via API Gateway” and fill in the following information
The ION API Client ID is in the ION API File you created at How to Create an ION API File under the “ci” value.
Switch to the document tab, and add the Sync.CaptureDocument BOD to the DocBits connection point like below.
Then save the connection point by pressing the disk icon in the upper-left corner.
Navigate back to the Dataflow section of ION Desk to access your dataflow. Your DocBits application should look similar to what is shown below.
The first mapping node should look as follows
The second mapping node should look as follows
There should already be an LN or similarly named connection point (for the appropriate LN company) created in INFOR, so, just like the DocBits Application, you select it by clicking the “+” sign; it should look as follows
The following configurations should look as follows:
The last icon should be empty as it is not carrying any document or information.
Once you have added all necessary nodes to the data flow, press this button to activate the data flow
This is created in INFOR ION API → Authorized Apps and an app like below should be shown
If not, then you need to create a new Authorized App. This can be done by clicking the plus sign.
Once you have entered the Name (DocBits_*Environment*) and Type (case specific) of the new Authorized App, you will be taken to a page where the Client ID and Secret have been generated automatically.
The information you fill in should be similar to what is shown above; it is important to enable the “Issue Refresh Tokens” slider at the bottom of the page.
Click “Download Credentials” to download the ION Mapping File.
Once you have downloaded the ION API file from Infor, you can upload it by going to Settings → Document Processing → Export like below
In the table extraction view, you will find the menu item Settings in the upper action bar (make sure that the training mode is activated). If you click on the gear icon, a window will open in which you will find the Advanced Settings.
The following functionalities are available in the general settings:
Here you can define the number of lines of a table header. For example, the table header line can be two lines:
Accordingly, the value in “Header row count” is set to two
Why is this needed? It might be that DocBits does not recognize the second line in the table header as part of the header line. In this case, it incorrectly inserts it into the table as an extracted value. This can be easily prevented with this function.
Example before
Example after
In this example, the item description in the table spans several rows, but you only need the first one. To extract only this and include it in the Description column, select Move Extra Rows to Trash.
After naming the columns and mapping them to position, you get the following result
The functionalities below are available in the advanced settings:
Enter the minimum number of rows in your grouped column here.
In this table you see six rows, of which only three are relevant for you. In the first two columns there are two criteria that have to be extracted separately. These will be your mapped columns; all the other ones have to be trained as custom columns. This is how it works, step by step:
Select the two header rows as well as two minimum grouped rows as these should be grouped to one row.
Also select the Move extra rows to Trash option to be able to train all the other columns as custom columns.
Name the first column Position and group on that one.
After naming all the columns and training the values, this is your result:
If you want to combine all the rows above the grouped attribute, check the box here.
In this example, the table starts with a row that is above all other information but also needs to be extracted along with the information below it. It could be that DocBits (DOC²) extracts this row as an additional row and the grouping of the information, e.g. by position, does not work properly.
After grouping on net amount, checking the box, selecting the Move extra rows to Trash option
After naming all the columns, this is your result.
When using our table extraction tool, you can choose between training mode and correction mode. Here’s what you need to know about each mode:
In training mode, mapped columns are read-only, and you cannot manually change the text. Additionally, the delete row button is not visible.
In correction mode, you can delete and add rows, and you can manually change the text of mapped columns.
Here’s an example of a table in correction mode, with the first row deleted and a new row added:
If you need to manually map columns to rows (fields) on an extracted document, you can do so easily in DocBits. Follow our step-by-step guide below to learn how to add a new column to a table in DocBits.
To get started, import your document into DocBits and open it. Then, navigate to the table extraction view by clicking on the “LINE ITEMS” button.
To add a new column to your table, you’ll need to activate training mode. Click on the “TRAINING MODE” button to do so.
Once you’re in training mode, you can create a new column by clicking on the “ADD COLUMN” button. In the window that appears, you can specify a name for the column, set whether it’s optional or mandatory, and choose the column type (STRING, AMOUNT, or DATE).
After you’ve created your new column, you can use it for manual mappings by following our guide on Manual Row Selection.
A guide on testing DocBits features once they are configured.
Functional Documentation
This module contains functions for manipulating document data and performing various operations related to document fields.
Description: Sets the value of a field in the document data.
Parameters:
document_data (dict): The document data containing field information.
field_name (str): The name of the field to set.
value: The value to set for the field.
Example:
set_field_value(document_data, "name", "John Doe")
Description: Sets the value of a date field in the document data.
Parameters:
document_data (dict): The document data containing field information.
field_name (str): The name of the date field to set.
value: The date value to set in ISO format (e.g., "2020-12-31").
Example:
set_date_value(document_data, "date_of_birth", "1990-05-15")
Description: Sets a custom attribute of a field in the document data.
Parameters:
document_data (dict): The document data containing field information.
field_name (str): The name of the field whose attribute is set.
attribute_name (str): The name of the attribute to set.
value: The value to set for the attribute.
Example:
set_field_attribute(document_data, "address", "is_verified", True)
update_document_status_with_doc_id(doc_id, user, org_id, status, message, doc_classification_class)
Description: Updates the status of a document with the specified ID.
Parameters:
doc_id (str): The ID of the document to update.
user: The user performing the update (either a user ID or a UserAuthentication object).
org_id: The ID of the organization to which the document belongs.
status (str): The new status of the document.
message: Optional message associated with the status update.
doc_classification_class: Optional document classification class.
Example:
update_document_status_with_doc_id("123456", user, org_id, "approved", "Document approved")
is_supplier_valid(user, filter_data_json, sub_org_id)
Description: Checks if a supplier is valid based on the provided criteria.
Parameters:
user (UserAuthentication): The authenticated user.
filter_data_json: Filter criteria for validating the supplier.
sub_org_id: Optional sub-organization ID for filtering.
Example:
is_supplier_valid(user, {"name": "Supplier Inc."})
set_field_as_invalid(document_data, field_name, message, code=None)
Description: Marks a field as invalid in the document data.
Parameters:
document_data (dict): The document data containing field information.
field_name (str): The name of the field to mark as invalid.
message (str): The validation message for the field.
code (optional): Error code for the validation (default is None).
Example:
set_field_as_invalid(document_data, "email", "Invalid email format", "EMAIL_FORMAT_INVALID")
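Taken together, these helpers can be combined in a short validation script. The sketch below is illustrative only: the field names (total_amount, total_amount_checked) and the error codes are assumptions, and it relies on the helpers documented above (plus get_field_value, described in the scripting guide) being available in the script runtime.

```python
# Illustrative sketch only: field names and error codes are assumptions;
# the helper functions are the ones documented above.
total = get_field_value(document_data, "total_amount", default="0")

try:
    total_value = float(total)
except (TypeError, ValueError):
    set_field_as_invalid(document_data, "total_amount",
                         "Total amount is not a number", "TOTAL_NOT_NUMERIC")
else:
    if total_value < 0:
        set_field_as_invalid(document_data, "total_amount",
                             "Total amount must not be negative", "TOTAL_NEGATIVE")
    else:
        set_field_value(document_data, "total_amount_checked", "true")
```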
This guide will show you how to make HTTP requests to your DocBits organization via Postman. It is easy to use and very useful for organization administrators.
First, download Postman to your system and sign in/register.
Now follow this step-by-step guide to learn how HTTP requests work in Postman.
Authorization in Postman
Before you can create your HTTP requests, you need to enter your API key from DocBits to authorize them.
Click on the `Authorization` tab and choose `API Key` as authorization type.
Fill in the values: enter “X-API-key” in the `Key` field and your API key (found in the DocBits Settings menu under Integration) in the `Value` field. Under `Add to`, select `Header`.
It should look like this:
Available at https://api.polydocs.io
Click on Authorize in the upper right corner
Enter your API Key and confirm by clicking `Authorize`
Click on Workspaces and create a new workspace (you can name it whatever you want).
You have to select the visibility, which determines who can access this workspace.
After making your selection and clicking `Create Workspace` select Collections on the left side of the application and create a new collection for your HTTP requests by clicking `+`.
In this collection, you can add multiple HTTP requests. To do this, click on the three dots of the collection and select `Add request`.
The GET method is very useful for getting information about users, sub-organizations, processed documents, and much more.
Choose the GET method in your HTTP request.
Authorize yourself as described above.
Open https://api.polydocs.io and append the path of the function to the Polydocs URL. For example:
Now paste this link in the text box next to the GET method in Postman.
Click `Send` and you should receive all the information about every user in your organization.
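If you want to call the API outside Postman, the same request can be reproduced in a few lines of Python using the requests library. This is a minimal sketch; the /users path is only an assumption standing in for the endpoint path you use in the step above.

```python
import requests

BASE_URL = "https://api.polydocs.io"
# API key from the DocBits Settings menu under Integration, sent as the X-API-key header
headers = {"X-API-key": "<your-api-key>"}

# The /users path is illustrative; use the endpoint path from the step above.
response = requests.get(f"{BASE_URL}/users", headers=headers)
response.raise_for_status()
print(response.json())
```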
The POST method is usually used to create users or organizations, for example. This method inserts information into the database.
Create User
Select the POST Method.
Authorize yourself as described above
Open https://api.polydocs.io and append the path of the function to the Polydocs URL. In this case:
Now paste this link into the text box next to the POST method in Postman.
Select the `Body` tab in your HTTP request and enter the keys and the values for each credential that has a red asterisk next to its name.
When you’re done, it should look like this:
If you want to create an admin account, set the `is_admin` value to true.
Finally, click `Send` and you can see all the credentials you set in the response below. This means the user has been created.
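The equivalent request from a script might look like the sketch below. The endpoint path and most body keys are assumptions used for illustration; only is_admin is taken from the note above, so replace the keys with the required (red asterisk) fields shown in Postman.

```python
import requests

BASE_URL = "https://api.polydocs.io"
headers = {"X-API-key": "<your-api-key>"}

# Hypothetical path and body keys; use the required (red asterisk) fields shown in Postman.
payload = {
    "email": "jane.doe@example.com",
    "first_name": "Jane",
    "last_name": "Doe",
    "is_admin": False,  # set to True to create an admin account
}

response = requests.post(f"{BASE_URL}/user", headers=headers, data=payload)
print(response.status_code, response.json())
```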
You can also use the POST method to upload a document to DocBits.
Select the POST Method.
Authorize yourself as described above.
Open https://api.polydocs.io and append the path of the function to the Polydocs URL. In this case:
Now paste this link into the text box next to the POST method in Postman.
Select the `Body` tab in your HTTP request and choose `form-data`
Enter “file” into the `KEY` field, where you will find the hidden File dropdown. Select `File`, then move to the `VALUE` field, where you can select your file by clicking `Select Files`.
When you click `Send`, you should see “success”: true in the response.
It should look like this:
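Outside Postman, the same multipart upload can be reproduced with a short script. This is a sketch under assumptions: the upload path is illustrative, while the file form-data key comes from the step above.

```python
import requests

BASE_URL = "https://api.polydocs.io"
headers = {"X-API-key": "<your-api-key>"}

# "file" is the form-data key described above; the upload path itself is illustrative.
with open("invoice.pdf", "rb") as document:
    response = requests.post(
        f"{BASE_URL}/document/upload",
        headers=headers,
        files={"file": document},
    )

print(response.json())  # expect "success": true on a successful upload
```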
The DELETE method is used to delete, for example, users, organizations and so on.
Select the DELETE Method
Authorize yourself as described above.
Open https://api.polydocs.io and append the path of the function to the Polydocs URL. For example:
Now paste this link in the text box next to the DELETE method in Postman.
Replace the {user_id} at the end of the URL with the actual user ID you want to delete. (You can get the user_id using the GET method).
If you included the user_id in the URL, you don’t need to add a body key and value for it.
When you click `Send`, you should see “success”: true in the response.
It should look like this:
The PUT method is mainly used to update user or organization data. It is very easy to understand and use.
Select the PUT Method.
Authorize yourself as described above.
Open https://api.polydocs.io and append the path of the function to the Polydocs URL. For example:
Now paste this link in the text box next to the PUT method in Postman.
Replace the {user_id} at the end of the URL with the actual user ID you want to update. (You can get the user_id using the GET method).
In the body, enter “email” as the key and the new email address as the value.
Then just press `Send` and you should see “success” in the response.
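As a scripted alternative, the same update could look like the sketch below; the path prefix is an assumption, while the email body key comes from the step above.

```python
import requests

BASE_URL = "https://api.polydocs.io"
headers = {"X-API-key": "<your-api-key>"}

user_id = "<user-id-from-GET>"  # look this up with the GET method first
# "email" is the body key described above; the path prefix is illustrative.
response = requests.put(
    f"{BASE_URL}/user/{user_id}",
    headers=headers,
    data={"email": "new.address@example.com"},
)
print(response.json())  # expect "success" in the response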
Required Information | Description |
---|---|
The URL starts with https://mingle-portal.eu1.inforcloudsuite.com/<TENANT_NAME>, followed by your personal extension.
Choose the option Cloud Identities and use your login details.
After logging in, you will have access to the Infor Cloud. In this case we land on this page, but via the burger menu you can access all applications.
On the right-hand side of the menu bar, you will find the user menu, where you can access the user management.
Then, in the menu on the left-hand side, select Security Administration and then Service Provider.
You will see this window with the Service Providers.
Now click on the “+” sign and add our DocBits as Service Provider.
Log in on URL https://app.docbits.com/ with the login details you received from us.
Go to SETTINGS (on top bar) and select INTEGRATION, under SSO Service Provider Settings you will find all the information you need for the following steps.
Download the certificate
Fill in the Service Provider details using the SSO Service Provider Settings in DocBits.
When you have filled out everything, remember to save it with the disk icon above Application Type.
Enter the service provider DocBits again.
Click on View the Identity Provider Information underneath.
The file looks like this: ServiceProviderSAMLMetadata_10_20_2021.xml
Import the SAML METADATA in the SSO Settings.
Go to IDENTITY SERVICE PROVIDER SETTINGS, which is located under INTEGRATIONS in SETTINGS. Enter your Tenant ID (e.g. FELLOWPRO_DEV); underneath that line you will see the Upload file field and the IMPORT button, where you need to upload the previously exported SAML METADATA file.
Click on IMPORT and then choose the METADATA file that you have already downloaded from the SSO SERVICE PROVIDER SETTINGS
Click on CONFIGURE
Go to Admin settings
Click on ADD APPLICATION in the top right corner
Fill out all fields as in the following image, but with your own SSO URL. Don’t forget to choose an icon and click on SAVE.
Final Step
Log out of DocBits.
Go back to the burger menu in Infor and select the icon you just created.
And you will be taken to the Dashboard of DocBits.
Once you have uploaded your document and it is ready for validation, you will enter the validation screen.
You will notice that some fields are already extracted thanks to DocBits’ swarm intelligence, while other fields remain empty. This is because this document has not been correctly trained yet, meaning DocBits is unsure where to extract the information for those empty fields.
However, with your help, you can train DocBits to know where to search for this data on that specific type of document. Do keep in mind that some fields may be empty due to the required information not being present on the document you uploaded, but if the information is present on your document then it can be trained by doing the following.
Click on the field you would like to train and either double click on the information on the document or, using your cursor, create a block around the information to populate the field.
Repeat this until you have populated all the fields you require; your document is then ready to be exported. Once exported, DocBits will remember this document type in the future, meaning you will not need to manually train these fields again, as DocBits learns where to extract this data.
If you’re using DocBits for document/table extraction, it’s important to save and delete rules properly to ensure your data is extracted accurately. This guide will walk you through the process.
Once you’ve trained a document in DocBits and defined all the rules for table extraction, you can save those rules for future imports. This means that the tables in your documents will be extracted automatically, without any manual training required.
To save your rules, simply click the “SAVE RULES” button. You’ll see a rule indicator that shows how many rules have been defined and saved for that document.
If you need to delete rules for a document, you can do so using the “DELETE RULES” button. This may be necessary if rules have been saved incorrectly or if you need to retrain a document.
However, it’s important to note that deleting rules will affect all documents with the same or similar format. This means that you’ll need to retrain those documents from scratch. So, be sure to use this feature with caution.
On some documents, the text of a row is not written under a single column only. It may instead span several columns, as in the example below.
In the screenshot you can see that the table and columns have already been defined. Looking closely at the highlighted information (PRAEF), you will recognize that the text spans the columns “Bezeichnung”, “Menge”, “ME” and “Preis in EUR”.
In that case, the system cannot automatically determine which column the information belongs to.
To solve this issue DocBits offers a possibility to manually select and map information on a document to any column.
First of all, make sure training mode is activated.
In addition, you need to activate the row edit mode.
Please note that the manual mapping of text to a column is only possible for extractable columns (blue color).
The violet ones cannot be mapped manually, as the mapping has already been done via the columns defined on the document.
Mapping columns is a crucial step in accurate table extraction using DocBits. Once you’ve extracted data from a document, you can map the extracted columns with the columns given by DocBits. This ensures that the extracted data is correctly placed in the corresponding columns.
To map columns, simply select the header of a column in the extraction view. A dropdown menu will appear, allowing you to select the column you want to map. If you’ve mapped a column incorrectly, you can easily remap it by clicking the dropdown menu again.
Once a table has been extracted via DocBits and the columns have been mapped, the obtained data can be grouped to get a structured result set of all extracted data.
Documents, from order confirmations to invoices, can vary enormously in complexity from company to company. For example, table information may be spread across multiple rows in some columns while occupying only one row in others.
As an example, you can see the German invoice below, where the information in column “Bezeichnung” extends over several lines (positions).
At this point, another advantage of DocBits comes into play: in the first step, it extracts the data one to one. The result looks like this:
But now the data can be grouped based on a specific column. In this case it can be grouped by the “Position” column, as shown in the following video. This in turn groups the rows of the “Description” column into one row, so that in the end you get a structured overall picture of the export and the data can be processed further.
The result of grouping looks like this:
Welcome to the Docbits scripting guide! Here you will learn how to use scripts to automate and enhance document processing within Docbits. Scripts allow custom field manipulation, data transformation, and the implementation of logic across various document types.
Scripts in Docbits are written in Python. They interact with document fields and metadata to perform a wide range of operations, from simple data formatting to complex logic.
Core Functions
get_field_value(fields_dict, field_name, default=None): Retrieves the value of a specified field.
set_field_value(fields_dict, field_name, value): Sets the value of a specified field.
create_new_field(field_name, value): Creates a new field with a specified name and value.
format_decimal_to_locale(value, locale): Formats a decimal value according to a specified locale.
Below are several examples demonstrating common scripting tasks.
Example 1: Currency Mapping for Invoices
Standardize currency symbols or text to ISO currency codes.
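A minimal sketch of such a mapping could look like this; the currency field name and the symbol list are assumptions, and only the helper functions listed above are used.

```python
# Minimal sketch: map currency symbols or text to ISO 4217 codes.
# The "currency" field name and the mapping entries are illustrative assumptions.
CURRENCY_MAP = {
    "$": "USD",
    "US$": "USD",
    "€": "EUR",
    "EURO": "EUR",
    "£": "GBP",
}

raw_currency = get_field_value(fields_dict, "currency", default="")
normalized = CURRENCY_MAP.get(raw_currency.strip().upper(), raw_currency)
set_field_value(fields_dict, "currency", normalized)
```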
The "Calculating Total Charges" script automates the process of summing up various charges and additional amounts within invoice documents. This guide walks you through the script setup, logic, and application to ensure accurate total charge calculations across your documents.
This script aims to provide a dynamic way to calculate the total charges on an invoice by adding up different charge types, such as base charges, freight (Fracht), and packaging (Verpackung). It then updates the invoice's total charges field with the calculated sum, ensuring accurate billing information.
The script retrieves values from specified fields, converts them to floats, sums them up, and then updates the total_charges field with the result. If the total_charges field doesn't exist, the script creates this field and sets its value accordingly.
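A sketch of this logic is shown below. The individual charge field names are assumptions; the total_charges field and the helper functions are taken from the documentation above, and the sketch assumes get_field_value returns None for a missing field.

```python
# Sketch of the total charges calculation; the individual charge field names are illustrative.
charge_fields = ["base_charge", "freight", "packaging"]

total = 0.0
for field_name in charge_fields:
    raw_value = get_field_value(fields_dict, field_name, default="0")
    try:
        total += float(raw_value)
    except (TypeError, ValueError):
        pass  # skip fields that are empty or not numeric

# Assumes get_field_value returns None when the field does not exist yet.
if get_field_value(fields_dict, "total_charges") is None:
    create_new_field("total_charges", total)
else:
    set_field_value(fields_dict, "total_charges", total)
```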
This section of the script processes an invoice table to remove any lines where both the quantity and the total amount are zero or not provided.
Check for INVOICE_TABLE: It starts by checking if the INVOICE_TABLE key exists in the tables_dict dictionary.
Iterate Over Rows: For each row in the table, the script initializes flags and variables to determine if the TOTAL_AMOUNT and QUANTITY columns exist and to capture their values.
Check Column Names: As it iterates through each column in a row, it looks for columns named TOTAL_AMOUNT and QUANTITY. If TOTAL_AMOUNT is found, it captures the value; if this value is non-zero, it converts it to a float. Similarly for QUANTITY, capturing and converting the value if it is non-zero.
Mark Row for Deletion: After checking both columns in a row, if both the total amount and quantity are effectively zero (either by being zero or not existing), the row is marked for deletion by setting row['is_deleted'] to True.
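The sketch below mirrors this walkthrough. The exact shape of tables_dict (a list of rows, each with a columns list of column_name/value entries) is an assumption; the INVOICE_TABLE, TOTAL_AMOUNT and QUANTITY names and the is_deleted flag come from the description above.

```python
# Sketch of the row-cleanup step; the exact row/column structure is an assumption.
if "INVOICE_TABLE" in tables_dict:
    for row in tables_dict["INVOICE_TABLE"]:
        total_amount = 0.0
        quantity = 0.0
        for column in row["columns"]:
            value = column.get("value")
            if column.get("column_name") == "TOTAL_AMOUNT" and value:
                total_amount = float(value)
            elif column.get("column_name") == "QUANTITY" and value:
                quantity = float(value)
        # Both effectively zero (zero or missing): mark the row for deletion.
        if total_amount == 0.0 and quantity == 0.0:
            row["is_deleted"] = True
```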
This section calculates the total amount from all lines in an invoice and compares it to the invoice’s reported total to validate their consistency.
Initialize Line Total: Starts by setting a variable lines_total to 0.0 to accumulate the total amount from all lines.
Sum Line Amounts: Iterates over each row in the INVOICE_TABLE, extracting the TOTAL_AMOUNT from each and adding it to lines_total.
Retrieve and Convert Invoice Total: Fetches the total invoice amount using a helper function get_field_value and converts it to a float.
Compare Totals: Finally, it checks if the absolute difference between the calculated line total (lines_total) and the reported invoice total (total_amount) exceeds a threshold of 0.05. If so, it marks the invoice total field as invalid using another helper function set_field_as_invalid, citing a mismatch.
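A corresponding sketch is shown below; the table structure and the invoice total field name (total_amount) are assumptions, while the 0.05 threshold and the helper functions come from the description.

```python
# Sketch of the line-total validation; the table and field structure are assumptions,
# while the 0.05 threshold comes from the description above.
lines_total = 0.0
for row in tables_dict.get("INVOICE_TABLE", []):
    for column in row["columns"]:
        if column.get("column_name") == "TOTAL_AMOUNT" and column.get("value"):
            lines_total += float(column["value"])

total_amount = float(get_field_value(fields_dict, "total_amount", default="0"))

if abs(lines_total - total_amount) > 0.05:
    set_field_as_invalid(
        fields_dict,
        "total_amount",
        "Sum of line amounts does not match the invoice total",
        "TOTAL_MISMATCH",
    )
```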
The script effectively ensures data integrity by:
Removing data rows that do not contribute to the invoice’s financial total due to lacking quantities or amounts.
Validating the consistency between the sum of individual line amounts and the overall invoice total, highlighting discrepancies for further action.
This automation helps maintain accurate financial records and can be crucial for systems like ERP that require precise data for accounting and financial reporting.
This document details the "Generating Extended Invoice Numbers" script, which automates the creation of extended invoice numbers in Docbits. Extended invoice numbers combine multiple document identifiers, such as the invoice ID and the purchase order number, into a single, comprehensive identifier. This script enhances document traceability and simplifies record-keeping.
The purpose of this script is to streamline the process of generating extended invoice numbers by automatically concatenating the invoice ID and purchase order number, thereby providing a unified and unique identifier for each invoice document.
The script checks for the presence of invoice ID and purchase order number fields within the document, concatenates their values if both are present (separated by a hyphen), and updates or creates a new field to store the combined value.
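A sketch of this concatenation could look as follows; the field names invoice_id, purchase_order and extended_invoice_number are assumptions used for illustration.

```python
# Sketch of the extended invoice number generation; the field names are illustrative.
invoice_id = get_field_value(fields_dict, "invoice_id")
purchase_order = get_field_value(fields_dict, "purchase_order")

if invoice_id and purchase_order:
    extended_number = f"{invoice_id}-{purchase_order}"
    # Assumes get_field_value returns None when the field does not exist yet.
    if get_field_value(fields_dict, "extended_invoice_number") is None:
        create_new_field("extended_invoice_number", extended_number)
    else:
        set_field_value(fields_dict, "extended_invoice_number", extended_number)
```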
This document provides a detailed guide on the "Calculating Total Charges" script within the Docbits platform. The script is designed to automatically calculate the total amount charged on an invoice by summing up various individual charges. This automation enhances accuracy and efficiency in document processing.
The aim of this script is to streamline the calculation process for total charges on invoices. By automatically adding up specified charges, such as base charges, taxes, and additional fees, the script ensures that the total charges reflected on each invoice are accurate and comprehensive.
This guide focuses on automating the creation of extended invoice numbers in Docbits, a crucial feature for improving invoice management and tracking. The "Generating Extended Invoice Numbers" script concatenates various document identifiers, such as invoice ID and purchase order number, to create a comprehensive and unique identifier for each invoice.
The primary goal of this script is to automate the generation of extended invoice numbers, facilitating easier tracking and management of invoices by combining multiple identifiers into a single, unique number.
This document outlines the "Formatting Export Certificate Reference Numbers" script, aimed at standardizing reference numbers across export certificates in Docbits. Proper formatting ensures that reference numbers comply with external systems or regulatory requirements.
The script's primary goal is to format reference numbers on export certificates, ensuring they meet a predefined length requirement by padding them with leading zeros. This consistency aids in maintaining a standardized format for all export documents processed through Docbits.
The script identifies the reference_number field in an export certificate, checks its length, and, if necessary, pads the number with leading zeros to ensure it meets the minimum length requirement.
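The padding step could be sketched as follows; the minimum length of 10 is an assumption, while the reference_number field comes from the description above.

```python
# Sketch of the reference number formatting; the minimum length of 10 is an assumption.
MIN_LENGTH = 10

reference_number = get_field_value(fields_dict, "reference_number", default="")
if reference_number and len(reference_number) < MIN_LENGTH:
    # zfill pads the string with leading zeros up to MIN_LENGTH characters.
    set_field_value(fields_dict, "reference_number", reference_number.zfill(MIN_LENGTH))
```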
Field | Value |
---|---|
Application Type | DEFAULT_SAML |
Display Name | DocBits |
Entity ID | See Entity ID under SSO SERVICE SETTINGS |
SSO Endpoint | Copy the SSO URL from SSO SERVICE SETTINGS and paste it into the SSO Endpoint field. |
Signing Certificate | Upload the appropriate .cer file you downloaded in step 3c) from SSO SERVICE SETTINGS |
Name ID Format and Mapping | email address |
Login Details to Cloud | Credentials are mandatory for accessing the Infor Cloud environment. The user should have the roles "Infor-SystemAdministrator" and "UserAdmin". |
Config Admin Details (DocBits) | You should have received an email from FellowPro AG with the login details for the DocBits SSO Settings page. You will need a login and password. |
Certificate | You can download the certificate in DocBits under SSO Service Provider Settings |
When it comes to testing your PO Matching configuration, you will need to create a Purchase order in LN/M3 in order to check whether INFOR is synced with DocBits.
LN: https://docs.infor.com/ln/10.4/en-us/lnolh/docs/ln_10.4_procpoug__en-us.pdf
M3: https://docs.infor.com/m3udi/16.x/en-us/m3beud/default.html?helpcontent=ois610.html
Once you have created your purchase order, go to Settings → Master Data Lookup and search for the purchase order number of the PO you just created; it should now appear in your purchase order master data in DocBits.
You should see your unique PO number here, which means that DocBits and INFOR are correctly synced.
Now upload your invoice that matches the quantity and unit prices of the purchase order you created. Validate the document and select PO Matching on the validation screen.
The PO and invoice line items should match automatically. Then simply select the export option and check whether the document is exported without any errors. If you do encounter an export error, create a ticket for the DocBits support team to assist you. If you are unsure how to create a ticket within DocBits, please consult our DocBits Overview documentation for assistance.
From the Dashboard, select Settings.
Document Processing → Module
Then activate the Supplier Portal by turning on the slider.
Once the Supplier Portal feature has been activated, a new Settings area becomes available. If you scroll to the bottom of the Settings list, you will see the following.
In this section, you have access to the following.
This is where you can manage all suppliers by grouping them, making it easier to manage them all. Users can also be assigned to the various supplier groups you create in this area.
To make managing all suppliers easier, in this section you can create groups and assign suppliers to these groups based on their geographical location, supplier type, etc. The choice is yours!
To create a new supplier group, click the +New button in the top right corner of your screen.
Then simply give this group a name that accurately describes the suppliers that will be assigned to it.
All the supplier groups you have created will be shown here, as this is where you can assign DocBits users to the supplier groups you create.
To assign a new user to a supplier group, click the +New button and select the user you would like to assign.
Select the user you would like to assign to the supplier group from the dropdown list and then click Add to assign that user.
This is where you can upload your policy and privacy statement for the suppliers you will invite through the supplier portal. To upload a document, simply click Upload Document in the top right corner of your screen.
In this section, you can upload various email templates, as these form the structure of the emails a supplier receives when you invite them to join the supplier portal. The required templates include: invitation, approval, registration completion, and rejection, as these are all the possible emails a supplier can receive. To upload a template, click +New.
You can customize the layout of the registration form that the suppliers you invite will see when they register for the supplier portal. This means you can add or remove fields depending on what is required from the suppliers you invite. The supplier layout builder works exactly like the document type layout builder; for more information on this, click here.
Below, you can see that it is possible to configure multiple layouts depending on your needs.
Within the layout, you can add dropdown lists, which can be created using the List of Values feature.
Only values that exist in INFOR can be used in these lists for the export to work. In the example shown above, the selectable values for the “Supplier Group” field all exist within INFOR. The same principle applies to all lists, whether for payment terms, currencies, etc. To ensure there are no problems during export, first consult the values stored in INFOR before configuring these lists.
This is where you can upload your company logo or an alternative, which will be displayed in the header of all emails sent as well as on the DocBits login screen once suppliers begin the registration process. If left blank, the default DocBits logo will be displayed.
Supplier Portal Invitation Additional Fields: This option allows you to add additional fields to the invitation you send to suppliers; a list of fields is made available to you, as shown below.
Upload your ION API as well as your IDM mapping file here, and your M3 mapping file will be generated on the right.
Once you have configured auto accounting for your DocBits environment, it is very important that all functionalities are tested to ensure a smooth hand-over process.
From the Dashboard, navigate to Settings → Document Processing → Module.
Under the Purchase Order / Auto Accounting tab, you will find the slider to enable the feature as well as a drop-down box to select LN or M3.
After you have uploaded your document and entered the validation screen, select the following icon to enter auto accounting.
You will then be taken to this screen. This will only occur if the table has been properly trained to distinguish correctly between the various line items.
From here, you have two choices:
Split the invoice using the total amount
Split the invoice via each individual line item
Make this selection by clicking on the option you prefer.
In order for the Auto Accounting feature to function, certain data and information must be configured. To assist in this process or to let you know what configuration you are missing, we have created a “Validate Setup” button which is located as shown below.
Once pressed, DocBits will run a check of your environment to see if everything is correctly configured.
A menu like the following will appear.
All items with a green check mark in front of them are configured and working. The image above should only be used as an example and should not be followed item by item for your environment, since each user has their own set of accounts, dimensions, and dimension options as categorized above.
Splitting Amounts
This is done by clicking on the following icon next to the total amount or specific line item
Once you select the splitting icon, a new menu will appear underneath the selected item
Accounts
The first block of the new menu gives you access to a dropdown list where you can select, from your preconfigured accounts/departments, the account/department you would like to split the amount between. DocBits groups similar account types together, making it easier to find certain types of accounts or accounts related to one another.
Amounts
The second block allows you to manually type in the respective amounts into which the parent amount will be split.
Percentage Split
The third and final block allows you to manually enter the percentage split you would like to apply to the parent amount. The percentage is automatically set to 50% when you enter the splitting menu, but it can be changed to suit your preferences. If you enter a percentage value in this block, the amount is automatically calculated in the second block.
Keep in mind that the new amount is only calculated once you press the Enter key on your keyboard and that the other percentage(s) are not altered by this change; this can result in amounts that do not add up to the parent amount. For this reason, we have created an “Unsettled amount” counter in this menu, which is discussed later.
Other Features
Add Row
A plus icon next to the splitting icon; this can be used in situations where an amount needs to be split between more than two accounts/departments.
Delete Row
A trash can icon that is used to delete unwanted or incorrect rows
Unsettled Amount
An “Unsettled amount” indicator at the bottom in case of any unaccounted value or amount outstanding from the parent amount.
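For example, if the parent amount is 100.00 and you split it into one row of 60.00 and another of 25.00, the unsettled amount indicator shows 15.00 until the rows add up to the full parent amount.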
Once you have validated and exported a document, sign in to INFOR. From the Home page of LN, navigate to Financials → Process Payables → Process Payables Workbench.
Press the following icon to view the individual line items of the invoice.
Here you can confirm the line items with what you exported using auto accounting.
Once you have validated and exported a document, sign in to INFOR and type in the APS450 command.
You will be taken to the following page where you will see the document you have just exported.
Once you find your document, right-click on the document and select “Related” and then the “Lines” option.
You will be taken to the menu that displays all the line items from the invoice you exported; check that all the line items are there and are correct. Lastly, ensure that all these line items are classified as Type 8, which can be seen by looking at the following column in the table.