The Layout Manager enables administrators to visually configure and modify the layout of document types by setting properties for the various data fields and groups within a document. This interface helps ensure that extraction models and manual data-entry points align precisely with the structure of the document scanned or uploaded to DocBits.
Groups and fields:
Groups: Organizational units within a document type that categorize related fields (e.g., Invoice Details, Payment Details). They can be expanded or collapsed and arranged to reflect the logical grouping in the actual document.
Fields: Individual data points within each group (e.g., Invoice Number, Payment Terms). Each field can be customized for how data is captured, displayed, and processed.
Properties panel:
This panel displays the properties of the selected field or group and allows detailed configuration, such as:
Label: The field's visible label in the user interface.
Field name: The technical identifier used within the system.
Element width in percent: Determines the field's width relative to the document layout.
Tab index: Controls the order in which fields are reached with the TAB key.
Execute script on change: Whether a script should run when the field's value changes.
Display label on the left: Whether the label appears to the left of the field or above it.
Is text area: Determines whether the field should be a text area that accommodates larger amounts of text.
Select model type: Choose which model type handles extraction for this field.
Field length: The maximum length of data accepted in this field.
Forbidden keywords: Keywords that are not allowed in the field.
Template preview:
Shows a real-time preview of how the document will look based on the current layout configuration. This helps ensure the layout matches the actual document structure and is essential for testing and refining the document-processing configuration.
Document subtypes are essentially specialized versions of the main document types. For example, under the main document type "Invoice" there may be subtypes such as "Standard Invoice", "Pro-forma Invoice", and "Credit Invoice", each with slightly different data requirements or processing rules.
Subtype list:
Each row represents a subtype of a main document type.
Shows the subtype's name and a set of actions that can be performed on it.
Actions:
Fields: Configure which data fields are included in the subtype and how they are managed.
Edit Layout: Modify the visual layout that determines how information is displayed and entered for this subtype.
Scripts: Attach or edit scripts that perform specific operations when documents of this subtype are processed.
Copy: Duplicate an existing subtype configuration to use as the basis for a new one.
Delete: Remove a subtype that is no longer needed.
Adding new subtypes:
The "+ New" button lets administrators create new subtypes, defining unique properties and rules as needed.
Document types in DocBits provide a flexible and powerful way to handle a variety of documents within a single system, ensuring that each type and subtype is processed accurately and efficiently according to its unique specifications.
The Table Columns interface in DocBits is used to define the columns that appear in the data tables for each document type. Each column can be configured to hold specific data types, such as strings or numeric values, and can be relevant for sorting, filtering, and reporting functions in DocBits.
Column configuration:
Column name: The column's identifier in the database.
Title: The human-readable column title that appears in the interface.
Column type: Specifies the column's data type (e.g., STRING, AMOUNT), which determines what kind of data can be stored in the column.
Table name: Indicates which table the column belongs to, linking it to a specific document type such as INVOICE_TABLE.
Actions:
Edit: Modify the settings of an existing column.
Delete: Remove a column from the table, which is useful if the data is no longer required or if the document type's data structure changes.
Adding new columns and tables:
Add new table column: Opens a dialog where you can define a new column, including its name, whether it is required, its data type, and the table it belongs to.
Create new table: Lets you create a new table by defining a unique name that will be used to store data related to a specific set of document types.
This section is crucial for maintaining the structural integrity and usability of data in the DocBits system, ensuring that data extracted from documents is stored in a well-organized and accessible way.
Field Settings provides a user interface where administrators can manage the properties and behavior of the individual data fields associated with a document type. Each field can be customized to optimize the accuracy and efficiency of data capture and validation.
Field configuration:
Field names: Displays the names of the fields, typically corresponding to data elements in the document, such as "Invoice Number" or "Order Date".
Required: Administrators can mark fields as required, ensuring that data must be entered or captured before document processing can be completed.
Read-only: Fields can be set to read-only to prevent modification after data capture or during specific stages of document processing.
Hidden: Fields can be hidden in the user interface, which is useful for sensitive information or to simplify user workflows.
Advanced settings:
Force validation: Ensures that data entered into the field passes specific validation rules before it is accepted.
OCR (Optical Character Recognition): This toggle can be enabled to allow OCR processing for a particular field, which is useful for automatically extracting data from scanned or digital documents.
Match score: Administrators can define a match score, the threshold used to determine the confidence level of data recognition or matching, which affects how data validation and quality checks are performed.
Action buttons:
Create new field: Allows new fields to be added to the document type.
Edit icons: Each field has an edit icon that lets administrators further configure field-specific settings, such as the data type, default values, or associated business logic.
Save settings: Commits the changes made to the field configurations.
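To make the match-score idea concrete, here is a minimal sketch of how a similarity threshold can gate validation of an extracted value. The 0.8 threshold and the string-similarity measure (Python's standard-library `difflib`) are illustrative assumptions, not the actual DocBits scoring implementation.

```python
from difflib import SequenceMatcher

# Illustrative threshold, corresponding to a configured match score.
MATCH_SCORE_THRESHOLD = 0.8

def match_score(extracted: str, reference: str) -> float:
    """Similarity ratio (0..1) between an extracted and a reference value."""
    return SequenceMatcher(None, extracted.lower(), reference.lower()).ratio()

def passes_validation(extracted: str, reference: str) -> bool:
    """A field passes only if its match score reaches the threshold."""
    return match_score(extracted, reference) >= MATCH_SCORE_THRESHOLD

accepted = passes_validation("Acme GmbH", "ACME GmbH")  # case differences tolerated
```

Raising the threshold tightens quality checks (fewer false matches, more documents flagged for manual review); lowering it does the opposite.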
Model Training enables administrators to oversee and manage the training of the machine-learning models specific to each document type. By providing a structured interface for importing sample data, training models, and testing their performance, DocBits ensures that its data-extraction capabilities improve continuously.
Metrics overview:
Sample: The number of sample documents used for training.
Exported: The number of documents successfully exported after processing.
Company total: The total number of company-specific documents processed.
Overall total: The total number of documents processed across all categories.
Training and testing options:
Import: Lets administrators import new training datasets, typically structured document samples that the system should learn to recognize.
Train model: Starts the training process, using the imported data to improve the system's recognition and extraction capabilities.
Classification test: Allows the model to be tested to evaluate its performance in classifying and extracting data from new or unseen documents.
Action buttons:
Create field: Add new data fields that the model should recognize and extract.
Actions: This dropdown menu may include options such as viewing details, editing configurations, or deleting training data.
The Document Types section lists all document types recognized and processed by DocBits. Administrators can manage various aspects such as the layout, field definitions, extraction rules, and more for each kind of document. This customizability is crucial for ensuring accurate data handling and compliance with organizational standards.
Document type list:
Each row represents a document type, such as Invoice, Credit Note, Delivery Note, etc.
Document types can be standard or custom, as indicated by labels such as "Standard".
Edit Layout: This option lets administrators modify the document's layout settings, including how the document looks and where the data fields are located.
Document subtypes: If any document types have subcategories, this option lets administrators configure settings specific to each subtype.
Table columns: Customize which data columns should appear when the document type is displayed in lists or reports.
Fields: Manage the data fields associated with the document type, including adding new fields or modifying existing ones.
Model training: Configure and train the model used to recognize and extract data from documents. This may include setting parameters for machine-learning models that improve over time as more data becomes available.
Regular expressions (Regex): Configure regular expressions used to extract data from documents based on patterns. This is particularly useful for extracting structured data.
Scripts: Write or modify scripts that run custom processing rules or workflows for documents of this type.
EDI (Electronic Data Interchange): Configure settings related to exchanging documents in standardized electronic formats.
Scripts in DocBits are written in Python, the scripting language supported by the system. They run during document processing to apply complex business logic or to ensure data integrity and accuracy before further processing or storage.
Script management:
Name: Each script is given a unique name for identification.
Document type: Associates the script with a specific document type, determining which documents the script is applied to.
Trigger: Specifies when the script runs (e.g., on document upload, before data export, after data validation).
Active/Inactive status: Lets administrators activate or deactivate scripts without deleting them, providing flexibility for testing and deployment.
Script editor:
Provides an interface where scripts can be written and edited. The editor typically supports syntax highlighting, error highlighting, and other features that aid script development.
Example script: Scripts can include operations such as iterating over invoice lines to verify totals or removing entries that do not meet certain criteria.
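As a concrete illustration of such a script, here is a minimal sketch that iterates over invoice lines, drops entries that fail a criterion, and verifies totals. The document structure (a dict with a `lines` table and a `net_total` field) is an assumption for the example; the actual object model exposed to DocBits scripts may differ.

```python
# Minimal sketch of an invoice-line validation script.
# Assumption: the document is available as a plain dict with header fields
# and a "lines" table; the real DocBits script API may differ.

def validate_invoice_lines(document: dict, tolerance: float = 0.01) -> dict:
    """Drop zero-quantity lines and check that line totals add up."""
    kept_lines = []
    for line in document.get("lines", []):
        # Remove entries that do not meet the criterion (zero quantity).
        if line.get("quantity", 0) <= 0:
            continue
        # Verify each line total against quantity * unit price.
        expected = round(line["quantity"] * line["unit_price"], 2)
        line["line_total_ok"] = abs(line["total"] - expected) <= tolerance
        kept_lines.append(line)
    document["lines"] = kept_lines
    # Verify the document total against the sum of the kept lines.
    lines_sum = round(sum(l["total"] for l in kept_lines), 2)
    document["totals_match"] = abs(lines_sum - document["net_total"]) <= tolerance
    return document

doc = {
    "net_total": 150.00,
    "lines": [
        {"quantity": 2, "unit_price": 50.0, "total": 100.0},
        {"quantity": 1, "unit_price": 50.0, "total": 50.0},
        {"quantity": 0, "unit_price": 10.0, "total": 0.0},  # dropped
    ],
}
result = validate_invoice_lines(doc)
```

A script like this would typically be attached to a trigger such as "after data validation" so that flagged documents can be routed for review before export.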
In DocBits, the Regex settings allow administrators to define custom patterns that the system uses to find and extract data from documents. This feature is particularly useful in situations where data must be extracted from unstructured text, or where the data follows a predictable format that can be captured with regex patterns.
Regex management:
Add: Lets you create a new regex pattern for a specific document type.
Save changes: Saves modifications to existing regex configurations.
Pattern: Here you define the regex pattern that matches the specific data format required.
Origin: This is the document's origin; for example, you can define a different regex for Germany.
In DocBits, the EDI settings provide tools for defining and managing the structure and format of EDI messages that correspond to different document types, such as invoices or purchase orders. The settings make it possible to tailor EDI messages to the standards and requirements of different trading partners and industries.
EDI configuration elements:
Structure descriptor: Defines the basic structure of the EDI document, including segment order, mandatory fields, and the qualifiers required for the EDI document to be valid.
Transformation: Specifies the transformations applied to convert document data into a formatted EDI message. This typically involves mapping document fields to EDI segments and elements.
Preview: Lets administrators see what the EDI message will look like after transformation, helping to ensure accuracy before transmission.
Extraction paths: Shows the paths used to extract values from the document, which are then used to populate the EDI message.
XSLT editor:
Used to edit and validate the XSLT (eXtensible Stylesheet Language Transformations) used in the transformation process. XSLT is a powerful language designed to transform XML documents into other XML documents or other formats such as HTML, text, or other XML structures.
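As a flavor of what such a transformation looks like, here is a minimal XSLT sketch that maps a hypothetical `<invoice>` XML document to a simplified EDI-style structure. All element names here are illustrative assumptions, not DocBits' actual schema.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative XSLT: maps a hypothetical <invoice> document to a
     simplified EDI-style XML message. Element names are examples only. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/invoice">
    <EDIMessage>
      <!-- Document number taken from the invoice header -->
      <DocumentNumber>
        <xsl:value-of select="header/invoiceNumber"/>
      </DocumentNumber>
      <!-- One line item per invoice line -->
      <xsl:for-each select="lines/line">
        <LineItem>
          <Quantity><xsl:value-of select="quantity"/></Quantity>
          <Amount><xsl:value-of select="total"/></Amount>
        </LineItem>
      </xsl:for-each>
    </EDIMessage>
  </xsl:template>
</xsl:stylesheet>
```

The XSLT editor is where mappings of this kind are written and validated, and the Preview then shows the transformed message before transmission.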
Navigate to SETTINGS → Dashboard
Customize your filter
Navigate to the Filters section
To create a Custom Filter, select CUSTOM in the “Status filter style”
Underneath that, you can select the various statuses a document can be in to create your custom filter. Once you press APPLY, this custom filter will be active on the Dashboard.
Dashboard
Select ADVANCED SETTINGS
Custom Filters can then be accessed by selecting the MORE SETTINGS option.
The "Document Expiry" setting in the "DOCUMENT PROCESSING" section lets you configure schedules for automatically deleting documents from the system. This feature is particularly useful for managing data-retention policies and ensuring compliance with legal or organizational standards for document storage.
Here is what each setting does:
Delete document after: This dropdown lets you set the time frame after which a processed document is automatically deleted from the system. You can choose from predefined options such as 48 hours, 1 week, 2 weeks, or 4 weeks. This helps manage storage space and maintain document life-cycle policies.
Delete completed document after: Similar to the previous setting, but aimed specifically at completed or fully processed documents. It offers the same time-frame options for deletion, ensuring that completed documents are not kept longer than necessary.
Both settings ensure that documents are not stored in the system indefinitely, helping to avoid unnecessary storage consumption and keeping the document workflow tidy. These settings are important for organizations that must comply with specific regulatory requirements for document retention.
The "Import" settings in the document-processing system let you configure how documents are imported into the system from various sources, including settings for both FTP (File Transfer Protocol) and email. Here is a detailed description of these settings:
Document settings:
Limit to pages: Lets you restrict processing to a specific number of pages from each document.
Payment terms in days: Specifies the default payment terms (in days) that can be applied to documents.
Date pattern: Sets the pattern for recognizing and formatting dates in imported documents.
FTP import:
Type: Specifies the FTP protocol type to use (e.g., SFTP).
Server address: The address of the server from which documents are fetched.
Username and port: The credentials and port number for accessing the FTP server.
Default directory: Specifies the directory on the FTP server from which files are imported.
The option for adding a new entry includes fields such as:
Password: For authentication.
File-name matching patterns: Determine which files are imported based on their names.
Sub-organizations: Select which sub-organization the import settings apply to.
Email import:
Email address: Configure an email account to which documents can be sent for import.
Username and password: Credentials for accessing the email account.
Protocol and encryption: Specify the protocol (IMAP, POP3) and the encryption type.
Merge attached documents: An option to combine all attached documents into a single document during import.
It also lets you specify sub-organizations for more precise control over email routing within the organizational structure.
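To show what a date pattern does in practice, here is a minimal sketch that parses a date written in one configured pattern and normalizes it to ISO format. The pattern syntax shown is Python's `strptime` notation, used here purely as an illustration of the concept.

```python
from datetime import datetime

# Sketch of how a configured date pattern drives recognition and
# normalization of dates in imported documents. Pattern strings are
# illustrative; the actual DocBits pattern syntax may differ.
def normalize_date(raw: str, pattern: str) -> str:
    """Parse a date using the configured pattern and emit ISO format."""
    return datetime.strptime(raw, pattern).date().isoformat()

# A German-style invoice date, with the pattern day.month.year:
iso = normalize_date("31.01.2024", "%d.%m.%Y")
```

A single pattern per source avoids ambiguity such as 01/02/2024 meaning January 2 in one region and February 1 in another.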
The "Classification and Extraction" settings in the document-processing system let you configure how documents are automatically processed after they are received:
Document splitting: This option lets you choose whether to split documents based on specific criteria or keep them as a single document. This can be useful when processing documents that contain several distinct sections but are submitted as one file.
Amount formatting: Here you can enable options such as rounding totals on purchase orders. This ensures that extracted data conforms to expected financial formats and rules, reducing errors in financial processing.
Table extraction: This feature enables automated extraction of data from tables within documents. You can specify whether to extract all tables, only those relevant for costing purposes, or to automatically identify and extract tax codes. This significantly improves the accuracy and efficiency of extracting data from structured formats within documents.
Classification rule configuration: This option lets you define specific patterns and criteria that help the system automatically classify and categorize documents during processing. These can be based on text patterns, document types, or other metadata associated with the documents.
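Pattern-based classification rules can be sketched as an ordered list of (type, pattern) pairs, with the first matching rule winning. The rule set below is an illustrative example; in DocBits the rules are configured in the settings UI rather than in code.

```python
import re

# Sketch of pattern-based classification rules. Rule order matters:
# the more specific "credit note" rule is checked before "invoice".
RULES = [
    ("CREDIT_NOTE", re.compile(r"credit\s+note", re.IGNORECASE)),
    ("INVOICE", re.compile(r"\binvoice\b", re.IGNORECASE)),
    ("DELIVERY_NOTE", re.compile(r"delivery\s+note", re.IGNORECASE)),
]

def classify(text: str, default: str = "UNKNOWN") -> str:
    """Return the document type of the first rule whose pattern matches."""
    for doc_type, pattern in RULES:
        if pattern.search(text):
            return doc_type
    return default

doc_type = classify("CREDIT NOTE no. 2024-17 for invoice 42")
```

Ordering more specific rules first prevents a credit note (which usually also mentions an invoice) from being misclassified as an invoice.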
"Master Data Lookup" in the document-processing settings enables a comprehensive approach to managing and validating document data by synchronizing it with the Infor ERP system. Here is how it helps streamline validation and improve document processing in an ERP environment:
Centralized data management: This feature acts as a central repository where data from various sources, such as suppliers, customer addresses, tax codes, and more, can be stored and managed. It provides a single point of reference for all master data, ensuring consistency and accuracy across the organization.
Validation against ERP data: By synchronizing master data such as supplier information from Infor into DocBits, data extracted from documents can be validated automatically against the ERP data. This ensures that the information being processed (such as supplier names, addresses, and tax codes) matches the data held in the ERP system, minimizing errors and discrepancies.
Facilitates automation: A robust master-data lookup helps automate the processing of incoming documents. For example, purchase orders or invoices can be checked automatically for correct supplier data, approved if they match, or flagged for review if discrepancies are found.
Improves data integrity: Regular updates from the ERP system to the master-data lookup ensure that the data used for document processing is current. This reduces the risk of processing documents against outdated information, improving the overall integrity of business transactions.
Efficiency in document processing: With master data directly linked and continuously refreshed, document processing becomes more efficient. Documents can be classified and routed automatically based on specific criteria set in the master data, such as particular supplier terms or tax regulations for different transaction types.
See here how to import master data
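The approve-or-flag behavior can be sketched as a comparison of extracted fields against the synchronized master record. The master-data records and field names below are hypothetical; they only illustrate the validation flow.

```python
# Sketch of validating extracted supplier data against synchronized
# master data. Records and field names are illustrative examples.
MASTER_SUPPLIERS = {
    "SUP-001": {"name": "Acme GmbH", "vat_id": "DE123456789"},
    "SUP-002": {"name": "Widget AB", "vat_id": "SE556677889901"},
}

def validate_supplier(extracted: dict) -> tuple[bool, list[str]]:
    """Return (approved, issues); issues lists every field discrepancy."""
    issues: list[str] = []
    record = MASTER_SUPPLIERS.get(extracted.get("supplier_id"))
    if record is None:
        return False, ["unknown supplier id"]
    for field in ("name", "vat_id"):
        if extracted.get(field) != record[field]:
            issues.append(f"{field} mismatch: {extracted.get(field)!r}")
    # Approved if everything matches; otherwise flagged for review.
    return not issues, issues

ok, issues = validate_supplier(
    {"supplier_id": "SUP-001", "name": "Acme GmbH", "vat_id": "DE123456789"}
)
```

Documents that return an empty issue list can continue automatically, while any mismatch routes the document to a review queue.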
The "Export Settings" section in the document-processing settings manages how documents are exported after processing. This includes configuring various export methods tailored to specific needs, such as direct file transfers or integration with corporate systems like Infor. Here is a breakdown of the options and functionality in this setting:
Export method selection: You can choose how documents should be exported. Common methods include SFTP, webhooks, and direct integration with systems such as Infor IDM (Infor Document Management), Infor ION (Intelligent Open Network), and other Infor-related options. Each method supports different kinds of workflows and integration requirements.
Configuration details:
For SFTP, you typically need to specify the server URL, username, password, and the destination folder where documents will be uploaded.
For Infor integrations, you may need to configure specific mappings or provide API keys to ensure documents are processed correctly and delivered into the Infor ecosystem in line with the existing data structures and flows.
Customization: Depending on the document type (e.g., invoices, order confirmations) and sub-organization, different export settings can be configured to meet varying business rules or compliance requirements.
Integration flexibility: The interface allows several integration configurations to be active at once, enabling users to manage different document types and destinations efficiently. This modular setup ensures that different parts of the business can use tailored workflows without affecting one another.
Adding and editing configurations: Users can add new configurations or edit existing ones, specifying details such as the document type, the export method, and the credentials or settings specific to the chosen export method.
See Exporting in DocBits
Accounting automation: DocBits offers automated accounting functionality, streamlining the handling of financial transactions related to purchase orders (POs) and other documents.
M3 integration: Integration with M3, an ERP software solution, enabling seamless interaction between DocBits and M3 to improve document management and processing.
PO Dashboard: A centralized dashboard in DocBits designed specifically for managing and monitoring purchase orders, providing insights and analytics for efficient tracking and decision-making.
Shipment Order Dashboard: Similar to the PO Dashboard, this feature focuses on managing and monitoring shipment orders, facilitating smooth logistics operations.
Dashboards v2: An updated version of the dashboard interface, likely with an improved user experience and additional features for enhanced usability.
Advanced Shipment Dashboard: A specialized dashboard for advanced monitoring and management of shipping activities, offering deeper insights and functionality.
Supplier Portal: A portal within DocBits dedicated to suppliers, enabling them to interact and collaborate with the system and improving communication and efficiency across the supply chain.
Workflow builder: A tool for creating and customizing workflows in DocBits, allowing users to define specific processes and automate document handling according to their unique business requirements.
Layout builder: Lets users design and customize document layouts in DocBits, ensuring compliance with branding and usability standards.
Annotation mode: A feature that lets users annotate and mark up documents directly in the DocBits interface, facilitating collaboration and feedback.
Show report: Functionality for generating and displaying reports in DocBits, providing insight and analysis into various aspects of document processing and management.
Models and labels: Tools for defining and configuring document-recognition models and labels in DocBits, improving accuracy and efficiency in document processing.
Document script: Likely a feature for scripting and automating specific actions or processes related to document handling in DocBits.
Document scanning: The ability to scan physical documents and import them into DocBits for digital processing and management.
QR-code extraction: A feature for extracting information from QR codes embedded in documents, enabling automatic data capture and processing.
Custom master data: Lets users define and manage custom master-data fields and attributes in DocBits, tailoring the system to specific business needs.
Tasks and notifications: Functionality for managing tasks and receiving notifications in DocBits, ensuring timely action and communication regarding document-processing activities.
IDM ACL Updater: This module likely handles updating and managing access control lists (ACLs) in DocBits, ensuring appropriate permissions and security for document access.
OCR quality:
This setting lets you define the minimum OCR quality percentage required for automatic document processing. It includes an adjustable slider for setting the threshold, for example 75%, below which the system cannot guarantee successful extraction. If a document's OCR quality falls below this threshold, you can choose to have the system take specific actions, such as re-running OCR after manual confirmation.
General OCR settings:
Use electronic text if available: When enabled, this option lets the system use electronic text embedded in PDFs or other document types to improve accuracy.
Use DESKEW if available: This feature corrects the document's alignment, which helps improve OCR accuracy by straightening skewed scans.
OCR settings for tables:
Use AI data for tables if available: This setting enables the use of AI technology for better recognition and extraction of tables from scanned documents, using machine-learning models to identify and structure tabular data accurately.
OCR settings for header fields:
Use rule extraction: When enabled, this setting lets the system apply predefined rules to extract data from header fields, which can be crucial for correctly identifying document sections such as invoice numbers, dates, etc.
Use AI extraction: This option uses AI models to extract header fields intelligently, which can adapt more flexibly to changes in document layouts and styles.
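The threshold behavior described above amounts to a simple routing decision, sketched below. The 75% threshold and the action names are illustrative assumptions mirroring the description, not actual DocBits identifiers.

```python
# Sketch of the OCR quality-threshold routing described above.
# Threshold value and action names are illustrative examples.
OCR_QUALITY_THRESHOLD = 0.75  # the 75% slider setting

def route_document(ocr_quality: float) -> str:
    """Decide what happens to a document based on its OCR quality."""
    if ocr_quality >= OCR_QUALITY_THRESHOLD:
        return "process_automatically"
    # Below the threshold, successful extraction cannot be guaranteed,
    # so the document waits for manual confirmation before OCR re-runs.
    return "manual_confirmation_then_rerun_ocr"

action = route_document(0.62)
```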
DocBits Workflow Engine
In the realm of process automation, the DocBits Workflow Engine stands out for its straightforward and intuitive approach. This engine streamlines the creation and management of workflows through a clear and simple rule-based structure. The core principle behind its design is the "Easy When-And-Then" rule, which breaks down the automation process into three fundamental components: Trigger, Condition, and Action. This guide explains how each component functions within the system and how they work together to facilitate seamless automation.
The DocBits Workflow Engine operates on a basic yet powerful principle that makes workflow automation accessible to users of all skill levels. This principle is encapsulated in the "Easy When-And-Then" rule, which can be broken down as follows:
Definition: The "When" component represents the Trigger of the workflow. This is the event or condition that initiates the execution of the workflow. Triggers can be a variety of occurrences such as the arrival of a new email, a specific date and time, or the creation of a new document.
Example: "When a new customer form is submitted..."
Definition: The "And" component introduces the Condition that must be met for the action to proceed. Conditions serve as filters or criteria that refine the trigger, ensuring that the workflow is executed only under certain circumstances.
Example: "...and the customer is from the 'Enterprise' segment..."
Definition: Finally, the "Then" part specifies the Action to be taken once the trigger occurs and the condition is satisfied. Actions are the tasks or operations executed by the workflow, such as sending an email, updating a database, or creating a task in a project management tool.
Example: "...then assign the lead to the enterprise sales team and send a welcome email."
By assembling these three components—Trigger, Condition, and Action—the DocBits Workflow Engine allows users to create highly customized and efficient workflows. This modular approach not only simplifies the process of setting up automations but also offers the flexibility to create complex workflows capable of handling a wide range of tasks and processes.
The DocBits Workflow Engine's "Easy When-And-Then" rule exemplifies the engine's commitment to providing a user-friendly platform for automating processes. This straightforward rule, by breaking down automation into the essential elements of Trigger, Condition, and Action, makes it easier for users to conceptualize, create, and manage workflows. Whether you are new to workflow automation or an experienced professional, the DocBits Workflow Engine offers an efficient and accessible tool to enhance productivity and streamline operations.
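The Trigger/Condition/Action decomposition can be sketched directly as code. This is a conceptual model of the "Easy When-And-Then" rule only, not the DocBits Workflow Engine API; all names here are illustrative.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Conceptual sketch of the "Easy When-And-Then" rule: a workflow is a
# Trigger name, a Condition predicate, and an Action. Illustrative only.

@dataclass
class Workflow:
    trigger: str                          # "When": the initiating event
    condition: Callable[[dict], bool]     # "And": must hold to proceed
    action: Callable[[dict], str]         # "Then": the task to perform

    def handle(self, event: str, payload: dict) -> Optional[str]:
        """Run the action only if the trigger fires and the condition holds."""
        if event == self.trigger and self.condition(payload):
            return self.action(payload)
        return None  # trigger or condition not met

# "When a new customer form is submitted, and the customer is from the
# 'Enterprise' segment, then assign the lead to the enterprise team."
wf = Workflow(
    trigger="customer_form_submitted",
    condition=lambda p: p["segment"] == "Enterprise",
    action=lambda p: f"assigned {p['name']} to enterprise sales",
)
result = wf.handle(
    "customer_form_submitted", {"segment": "Enterprise", "name": "Acme"}
)
```

Because each component is independent, swapping the condition or action yields a new workflow without touching the rest, which is exactly the modularity the rule is designed to provide.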
1. Document Field Actions:
Invert Checkbox: This action toggles the state of a checkbox field in a document.
Set Checkbox: This sets the state of a checkbox field to either true (checked) or false (unchecked).
Set Field to Text: This action sets a specified document field to a given text value.
2. Document Actions:
Approve the Document: Marks a document as approved within the system.
Start Export: Initiates the export process for a document.
Reject the Document: Marks a document as rejected.
3. Status Actions:
Change Status: Changes the status of a document or task to a specified new status.
4. Task Actions:
Assignments and notifications:
Assign Task: Creates and assigns a task with specific details to an individual or group, including options to notify them via email.
Create a New Task: Similar to assign but focused on setting up a completely new task within the system.
5. Table Actions:
Calculate in Table: Performs calculations on table data based on specified conditions and stores the results in a designated column.
Change Entries: Updates entries in a table based on specified conditions.
6. Assignee Actions:
Assign User from Field: Assigns a user to a task or document based on user data stored in a specific field, with an option for a fallback user if the primary is unavailable.
Assign Document to User or Group: Directly assigns a document to a user or group, ensuring responsibility is designated appropriately.
7. External Interaction Actions:
Call API: Sends a request to an external API, which can be customized with specific methods, parameters, and data.
Send HTTPS Request: Similar to API calls but specifically formatted for HTTPS protocols.
8. Advanced Processing:
Run Workflow: Triggers another workflow within the system, allowing for complex process chaining.
These action cards are used to automate responses based on specific triggers identified in the earlier parts of the workflow setup. For instance:
If a document is identified as needing review, the "Approve the Document" action can be automatically triggered once it passes all specified conditions.
For data management tasks, "Set Checkbox" or "Set Field to Text" actions ensure that document fields are updated automatically, reducing manual data entry and the potential for errors.
Complex tasks like API interactions or status changes streamline interactions not only within the ERP system but also with external services and tools, enhancing integration and functionality.
The "Then..." section in your workflow system provides robust tools for defining precise actions that should occur as a result of conditions being met in the workflow. By effectively using these actions, businesses can automate routine processes, ensure data accuracy, and respond dynamically to changing information and system states. Understanding how to configure and utilize these actions is key to maximizing the efficiency and effectiveness of your ERP system's workflow capabilities.
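The Trigger/Condition/Action structure described above can be sketched in a few lines of Python. This is a minimal illustration only: `run_rule`, the dictionary document model, and the field names are assumptions for the example, not DocBits APIs.

```python
# Minimal sketch of an "Easy When-And-Then" rule: fire the actions only if
# the trigger matches and every condition holds. All names are illustrative.

def run_rule(document, trigger, conditions, actions):
    """Evaluate trigger and conditions; execute actions when both pass."""
    if trigger(document) and all(cond(document) for cond in conditions):
        for action in actions:
            action(document)
    return document

# Example: when a document is ready for review and its total matches the
# purchase order amount, the "Approve the Document" action fires.
doc = {"status": "Ready in Review", "total": 4500, "po_amount": 4500}

run_rule(
    doc,
    trigger=lambda d: d["status"] == "Ready in Review",
    conditions=[lambda d: d["total"] == d["po_amount"]],
    actions=[lambda d: d.update(status="Approved")],
)
print(doc["status"])  # Approved
```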
And cards serve as condition cards that specify criteria that must be met for the workflow to continue. They effectively act as logical "AND" operators, meaning all conditions specified in these cards must be satisfied for the subsequent action to be triggered.
From the screenshots, it's clear that these cards cover a wide range of conditions, which include:
Compare with Purchase Order:
Conditions related to validation and comparison against purchase orders, such as comparing delivery dates, unit prices, or quantity differences. These are crucial for ensuring that transactions align with agreed terms.
Document Field:
These involve conditions based on specific fields within documents, such as checkboxes being marked, comparison of field values, or ensuring a document field meets a specified tolerance. This is particularly important for data integrity and automated checks within forms or document management systems.
Document:
Conditions based on document characteristics, such as type or association with a particular sub-organization. These conditions can direct workflows based on document categorization or departmental involvement.
Logic:
Logical conditions that might involve evaluations like "Continue with a chance of X%" or executing HTTPS requests, which are vital for integrations and probabilistic decision-making within workflows.
Status:
Focusing on the status of documents or tasks, these conditions ensure that only items in certain states trigger specific workflows, crucial for status-driven process management.
Table:
These involve conditions based on table data, such as matching regex patterns or comparing values within a table. Such conditions are essential for validating and manipulating large data sets.
Assignee:
Conditions based on task or document assignees. This ensures that actions are only taken when certain users are involved, enhancing accountability and task specificity.
These "And" cards are configured within the workflow to perform checks and validations that ensure the process adheres strictly to business rules and data integrity standards. For example:
A workflow might use an 'And' card to verify that an invoice's total amount matches the purchase order before triggering payment.
Another workflow could use an 'And' card to ensure a document is reviewed by specific team members before it progresses to the next stage.
"And" cards are a fundamental component of workflow systems that require precise control over process execution based on multiple conditions. They ensure that each step of a workflow only proceeds when all necessary criteria are thoroughly met, thus automating complex decision trees within business processes.
Understanding and configuring these cards correctly is crucial for leveraging the full capabilities of your workflow management system to enhance efficiency, accuracy, and compliance within organizational processes.
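The logical-"AND" behavior of these condition cards can be sketched as a short Python function: every card must return true before the subsequent action runs. The card callables and field names below are illustrative assumptions, not DocBits internals.

```python
# Sketch of "And" card evaluation: all condition cards must pass.

def and_cards_pass(document, cards):
    # Logical AND over every condition card attached to the workflow step.
    return all(card(document) for card in cards)

invoice = {
    "total_amount": 4500,
    "po_amount": 4500,
    "reviewed_by": ["alice", "bob"],
}

cards = [
    lambda d: d["total_amount"] == d["po_amount"],  # Compare with Purchase Order
    lambda d: "alice" in d["reviewed_by"],          # Assignee condition
]

print(and_cards_pass(invoice, cards))  # True -> the "Then" action may fire
```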
The "List of Values" setting in the document processing system provides a central repository for managing predefined values that can be used across various forms and fields in the system. This setting is particularly useful for standardizing data entry and ensuring consistency across the documents processed by the system.
Segmentation by Type: Each section, such as "ISO_Currency", "Invoice_Sub_Type", "Invoice_Type", and "Test", represents a different category of predefined values. These categories are used to manage the different kinds of input data relevant to specific contexts in the system.
Values and Synonyms: Within each category you can define multiple values. For example, under "Invoice_Sub_Type" values such as "Cost Invoice" and "Purchase Invoice" are listed. In addition, you can assign synonyms to these values, increasing the flexibility and reach of data capture. For instance, "Cost Invoice" has synonyms such as "Kostenrechnung".
Use in Sub-Organizations: These values can also be made specific to particular sub-organizations in your setup, allowing document-processing workflows to be customized and localized.
Adding and Managing Values: The "Add Row" action lets you enter new values and synonyms into the lists, while "Actions" lets you edit or delete existing entries.
Purpose
This workflow card manages the execution of operations based on whether a task or document is assigned to a particular user or set of users. It employs conditional logic to either trigger or prevent specific actions, making it ideal for workflows that require user-specific handling.
Components of the Card
Operator
Description: Defines the logical condition to apply to the user assignment.
Options:
IS: Triggers the operation if the assigned user of the document or task matches any user in the specified list.
IS NOT: Triggers the operation if the assigned user of the document or task does not match any user in the specified list.
User List
Description: A list or selection of users to compare against the assigned user.
Detail: This list can include one or multiple users, allowing the card to handle both singular and multiple user conditions effectively. The selection can be made through checkboxes, a multi-select dropdown, or similar UI elements.
Functionality
User Assignment Identification: Automatically identifies the user or users assigned to a particular task or document within the ERP system.
Condition Evaluation:
Using the IS operator, the card checks if the assigned user is among those listed in the User List.
Using the IS NOT operator, the card ensures the assigned user is not among those listed.
Action Execution:
True Condition: If the user assignment meets the condition (either IS or IS NOT), relevant actions are triggered, such as notifications, task initiations, approvals, or other workflow steps.
False Condition: If the condition is not met, the document or task may pass through different routing, or alternative actions may be specified.
User Interactions
Setup and Configuration: Users configure the card by selecting an operator and specifying the relevant users from the User List. Setup should be user-friendly and intuitive to accommodate selections from potentially large user bases.
Monitoring and Reporting: The ERP system should provide functionality to monitor and report on the operations triggered by this card, offering insights into assignment accuracy and process efficiency.
Error Handling and Notifications: Users should have options to receive alerts or notifications if there are issues with the assignments, such as unassigned tasks or errors in user selection.
The "Assigned User Condition" workflow card is a critical tool for managing document and task workflows that depend on user assignments. By allowing conditions based on whether a task or document is assigned to specific users, it ensures that workflows are only triggered by appropriate user interactions, enhancing both accountability and task alignment within teams. Clearly documenting this card will help users understand its significance and integrate it effectively into their workflows, ensuring smooth and efficient operations tailored to user roles and responsibilities.
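The IS / IS NOT evaluation against a user list reduces to a membership check, as in this hedged sketch (the function name and string operators are illustrative, not the product's implementation):

```python
# Sketch of the "Assigned User Condition" card's evaluation logic.

def assigned_user_condition(assigned_user, operator, user_list):
    """IS: assigned user appears in the list; IS NOT: the user does not."""
    if operator == "IS":
        return assigned_user in user_list
    if operator == "IS NOT":
        return assigned_user not in user_list
    raise ValueError(f"Unknown operator: {operator}")

print(assigned_user_condition("alice", "IS", ["alice", "bob"]))      # True
print(assigned_user_condition("carol", "IS NOT", ["alice", "bob"]))  # True
```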
Purpose
This workflow card is designed to automate actions based on the state (checked or unchecked) of a checkbox within your ERP system. By evaluating the checkbox's condition, it facilitates the triggering of specific processes or the enforcement of certain rules within the application.
Components of the Card
Field Name
Description: Specifies the name of the checkbox field that will be evaluated.
Detail: This should match the exact field label or identifier used in the system. It determines which checkbox's state is being monitored.
Boolean
Description: Defines the condition that triggers the workflow.
Options:
True: The workflow triggers if the checkbox is checked.
False: The workflow triggers if the checkbox is unchecked.
Functionality
State Detection: The card continuously monitors the state of the specified checkbox field.
Condition Evaluation:
The system checks whether the checkbox is in the state (checked or unchecked) specified by the Boolean condition.
Action Execution:
True Condition: If the checkbox’s state matches the specified Boolean condition (either true for checked or false for unchecked), the system initiates the associated actions. These could include enabling or disabling form fields, triggering notifications, starting workflows, or updating records.
False Condition: If the checkbox’s state does not match the condition, alternative or no actions may be taken, depending on the workflow setup.
User Interactions
Setup and Configuration: Users configure the card by selecting the checkbox field from a list of available fields and setting the Boolean condition. This setup process should be intuitive, typically involving a simple dropdown menu for field selection and a toggle for the Boolean condition.
Monitoring and Reporting: Provides functionalities for users to monitor the status of this condition, possibly through a dashboard that shows real-time updates on which conditions are active or triggered.
Error Handling and Notifications: Ensures that users are notified if there are any discrepancies or errors in the condition checking process, such as system failures to read the checkbox state.
The "Checkbox Field Condition" workflow card is a fundamental tool for managing dynamic forms and documents within an ERP system, where user inputs can dictate subsequent data processes. By automating actions based on the state of a checkbox, this card enhances workflow efficiency and ensures that system behaviors align with user inputs. Clear documentation of this card will help users effectively implement it within their operations, allowing for better control over form behaviors and process automations.
Purpose: This Docbits card is designed to streamline the verification process of invoices by comparing the total calculated price from the invoice against the corresponding purchase order.
Functionality:
Combined Price of Quantity Difference: The card calculates the total price by multiplying the quantity of each item listed on the invoice by the price per unit and then subtracts this total from the amount listed on the related purchase order.
Operator Value: Users can set conditions to determine how the calculated total price difference should be compared to the purchase order amount. The following operators are available:
Equals (=): Checks if the total invoice amount is exactly the same as the purchase order amount.
Not Equal (≠): Verifies that the total invoice amount differs from the purchase order amount.
Greater Than (>): Ensures the invoice amount is greater than the purchase order amount.
Less Than (<): Confirms the invoice amount is less than the purchase order amount.
Usage: This card is particularly useful for ERP Managers and financial accountants who need to automate and error-proof the reconciliation of invoices against purchase orders, ensuring financial accuracy and preventing overpayments or underpayments.
Example Scenario:
An invoice lists a total of 100 units of a product at $50 per unit, totaling $5000. The related purchase order authorized a purchase of $4500. Using the "Greater Than" operator, the card identifies and flags the discrepancy for review.
By using the "Compare with Purchase Order" card, users can automatically ensure that payments are consistent with purchase agreements, saving time and reducing human error in financial processing.
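The example scenario can be worked through in code: sum quantity times unit price over the invoice lines, then apply the chosen operator against the purchase order amount. The function and operator map are a sketch, not the card's actual implementation.

```python
import operator

# Operator symbols from the card mapped to Python comparisons.
OPERATORS = {"=": operator.eq, "≠": operator.ne, ">": operator.gt, "<": operator.lt}

def compare_with_po(lines, po_amount, op):
    """Total the invoice (quantity * unit price per line), compare to the PO."""
    invoice_total = sum(qty * unit_price for qty, unit_price in lines)
    return invoice_total, OPERATORS[op](invoice_total, po_amount)

# The scenario from the text: 100 units at $50 against a $4500 purchase order.
total, flagged = compare_with_po([(100, 50)], 4500, ">")
print(total, flagged)  # 5000 True -> discrepancy flagged for review
```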
This card compares whether the selected fields in a document are equal, not equal, greater than, or less than one another.
The card can also check automatically whether two fields, for example the net amount and the gross amount, differ by no more than a specified tolerance. For instance, you can enter the VAT percentage as the tolerance amount and set the tolerance type to percent; the workflow then checks whether the amount is correct.
Tolerance type: percent or value
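The tolerance check can be sketched as follows; the function name and the convention of measuring the percentage against the first field are assumptions for illustration:

```python
# Sketch of a percent-or-value tolerance check between two document fields.

def within_tolerance(value_a, value_b, tolerance, tolerance_type):
    diff = abs(value_a - value_b)
    if tolerance_type == "percent":
        # Percent tolerance is taken relative to the first field's value.
        return diff <= abs(value_a) * tolerance / 100
    if tolerance_type == "value":
        return diff <= tolerance
    raise ValueError(f"Unknown tolerance type: {tolerance_type}")

# Net 100.00 vs gross 119.00 with a 19% VAT entered as the tolerance:
print(within_tolerance(100.0, 119.0, 19, "percent"))  # True  -> amount is correct
print(within_tolerance(100.0, 125.0, 19, "percent"))  # False -> flag for review
```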
Purpose: This Docbits card is designed to ensure that the confirmed delivery dates on invoices or shipping documents align with the accepted delivery dates as stipulated in the master data lookup table. It helps manage expectations and adherence to scheduled deliveries within the supply chain.
Functionality:
Confirmed Delivery Date: This component of the card captures the delivery date as confirmed on the invoice or shipping documentation.
Master Data Table Lookup: The card references a master data lookup table specified by the user (identified by the <Master Data Table> parameter). This table contains the accepted delivery dates for comparison.
Operator Value: Users can specify how the confirmed delivery date should compare to the accepted delivery date from the master data table. Available operators include:
Equals (=): Ensures that the confirmed delivery date is the same as the accepted delivery date.
Not Equal (≠): Indicates a discrepancy between the confirmed and accepted delivery dates.
Before (<): Verifies that the confirmed delivery date is earlier than the accepted delivery date.
After (>): Checks if the confirmed delivery date is later than the accepted delivery date.
Usage: This card is invaluable for ERP Managers and financial accountants who need to monitor and ensure compliance with delivery schedules. It is particularly useful in sectors where timely delivery is critical, such as manufacturing, retail, and distribution.
Example Scenario:
An invoice lists a confirmed delivery date of June 10th. The master data table, however, shows an accepted delivery date of June 15th. Setting the operator to "Before," the card confirms that the goods are scheduled for early delivery, allowing logistics planning to adjust accordingly.
By implementing the "Compare with Purchase Order: Confirmed vs. Accepted Delivery Dates" card, organizations can proactively manage their supply chain, ensuring that deliveries are planned and executed in accordance with agreed-upon timelines, thus enhancing operational efficiency and customer satisfaction.
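The date comparison from the example scenario can be sketched with standard dates; here the master data lookup is replaced by a literal date, and the operator map is an illustrative assumption:

```python
from datetime import date

# "Before" and "After" from the card mapped to date comparisons.
DATE_OPS = {
    "=": lambda a, b: a == b,
    "≠": lambda a, b: a != b,
    "<": lambda a, b: a < b,   # Before
    ">": lambda a, b: a > b,   # After
}

def check_delivery_date(confirmed, accepted, op):
    return DATE_OPS[op](confirmed, accepted)

# Scenario from the text: confirmed June 10th vs. accepted June 15th.
confirmed = date(2024, 6, 10)
accepted = date(2024, 6, 15)
print(check_delivery_date(confirmed, accepted, "<"))  # True -> early delivery
```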
This card checks whether, for example, the total amount in a document matches a specified value: equal, not equal, greater than, or less than.
Purpose
This workflow card is designed to automatically compare the values of two specified fields within a document based on a defined operator. It's used to enforce data integrity and ensure that document data conforms to business rules or conditions.
Components of the Card
Field Names
Description: Specifies the names of the two fields within the document that will be compared.
Detail: Users must input the exact names of the fields as they appear in the system. These fields can be any data type that supports comparison, such as numeric, date, or text fields.
Operator
Description: The comparison operator used to evaluate the relationship between the values of the two fields.
Options:
Equal (==): Checks if the value of the first field is equal to the value of the second field.
Not Equal (!=): Checks if the value of the first field is not equal to the value of the second field.
Greater Than (>): Checks if the value of the first field is greater than the value of the second field.
Greater Than or Equal (>=): Checks if the value of the first field is greater than or equal to the value of the second field.
Less Than (<): Checks if the value of the first field is less than the value of the second field.
Less Than or Equal (<=): Checks if the value of the first field is less than or equal to the value of the second field.
Functionality
Field Selection: Users input or select the names of the two fields to be compared. This is typically done through a form or a dropdown menu within the card setup.
Operator Selection: Users choose an operator from a list of available options that define how the fields should be compared.
Comparison Execution:
The system reads the values from the specified fields and applies the selected operator to evaluate the relationship between them.
Based on the result of the comparison (true or false), subsequent actions may be triggered. For example, if a comparison fails, the system might flag the document for review, block further processing, or notify responsible parties.
User Interactions
Setup and Configuration: Users configure the comparison by entering field names and selecting an operator. This setup should be straightforward and guided, possibly with help text or examples.
Monitoring and Reporting: The system can provide feedback on the results of comparisons, such as logging all comparisons made, their outcomes, and any actions taken in response to the comparison results.
Error Handling and Notifications: Users receive alerts if the comparison cannot be executed (e.g., if one of the fields is not found in the document or is not in a comparable format).
The "Document Field Comparison" workflow card is vital for maintaining data accuracy and consistency across documents in an ERP system. It helps automate checks that would otherwise be manual, error-prone, and time-consuming, enhancing efficiency and reliability in document processing. Documenting this card clearly in your ERP system's manual will assist users in effectively employing this feature, ensuring that data across documents remains consistent and in accordance with business rules.
Purpose
This card is designed to control workflow actions based on the current status of a document, using conditional logic to either trigger or restrict certain processes. It ensures that documents only proceed through workflows when they meet predefined status criteria.
Components of the Card
Operator
Description: Determines how the document status will be evaluated against a specified condition.
Options:
is: Triggers the associated actions if the document’s current status matches one of the specified statuses.
is not: Triggers the actions if the document’s status does not match any of the specified statuses.
Status ( List )
Description: Lists the specific statuses against which the document’s current status will be compared.
Examples: "Error", "Export Error", "Ready in Validation", "Ready in Review", "Pending Approval", "Pending Second Approval". These represent different stages or conditions a document might be in within a workflow process.
Functionality
Status Identification: Automatically identifies the current status of a document as it moves through the ERP system’s workflow.
Condition Evaluation: Applies the chosen operator (is or is not) to the document’s status in comparison to the listed statuses:
If is, it checks whether the document’s status matches any status in the list.
If is not, it checks whether the document’s status does not appear in the list.
Action Execution: Depending on the outcome of the condition evaluation:
True: Executes predefined actions or workflows if the condition is met.
False: Skips or triggers alternative workflows if the condition is not met.
Workflow Integration: Integrates seamlessly with other workflow components, ensuring that document handling is coordinated across the system.
User Interactions
Setup and Configuration: Users configure the card by selecting the operator and specifying the relevant statuses. This setup may involve simple dropdown menus or checkboxes for selecting statuses and operators.
Monitoring and Management: Users can track the card’s activity via a dashboard, which provides insights into the status conditions being monitored and the actions being taken based on those conditions.
Error Handling and Alerts: Supports setting up alerts for process failures or mismatches in expected document statuses, enabling quick responses to operational issues.
The "Document Status Condition" workflow card is vital for ensuring that documents are processed correctly according to their current status, enhancing control and efficiency within the ERP system. Clearly documenting this card in the system's manual will help users effectively implement and manage it, leveraging its functionality to maintain smooth and compliant document workflows. This card is particularly useful in managing document lifecycles and ensuring that only documents meeting specific criteria advance to subsequent stages of business processes.
Purpose
This card is designed to manage actions on documents contingent upon their type, employing simple conditional logic (is/is not) to either trigger or prevent specific workflows. This enables precise control over how different types of documents are processed within the ERP system.
Components of the Card
Operator
Description: Determines the conditional logic applied to the document types.
Options:
is: The operation will trigger if the document's type matches one of the specified types in the list.
is not: The operation will trigger if the document's type does not match any of the types listed.
Document Types List
Description: Specifies a list of document types to which the condition will apply.
Detail: This can include a variety of document types such as "Invoice", "Purchase Order", "Contract", "Employee Record", etc., based on which the condition (is/is not) will be evaluated.
Functionality
Document Identification: The system first identifies the type of each incoming or existing document based on predefined attributes or metadata.
Condition Evaluation:
If the operator is is, the card checks if the document type is in the provided list.
If the operator is is not, the card checks if the document type is not in the list.
Action Triggering: Depending on the result of the condition evaluation:
True: Initiates the associated operations or workflows if the condition is met.
False: The process is bypassed or an alternative operation is triggered if the condition is not met.
Integration and Automation: Seamlessly integrates with other system components, ensuring that document handling is automated and adheres to organizational workflows and policies.
User Interactions
Configuration: Users must specify the operator and list the document types when setting up the card. This setup may include interface elements like dropdowns or checkboxes to select document types and operators.
Monitoring and Adjustments: Users can monitor the outcomes and effectiveness of this card through logs and reports generated by the ERP system. Adjustments can be made to the list or the operator based on evolving business needs.
Error Handling and Feedback: Provides feedback mechanisms for errors encountered during operation. Users can set up alerts for when conditions fail, ensuring prompt attention to issues.
The "Document Type Condition" workflow card plays a crucial role in managing document-based operations with precision and flexibility. By using simple conditional logic, it helps ensure that documents are processed appropriately, enhancing efficiency and compliance. Documenting this card clearly will help users understand how to implement and utilize it effectively, making it a valuable part of your ERP system's documentation.
Order Confirmation vs. Purchase Order
This logic card is designed to automatically verify that the quantity, unit price, or discount detailed in an order confirmation matches the corresponding figures in the purchase order. This verification ensures consistency and accuracy between what was ordered and what the supplier confirms to deliver.
The logic compares the following fields of an order confirmation against the original purchase order:
Quantity: The quantity of items ordered matches the quantity confirmed by the supplier.
Unit Price: The price per item agreed upon matches the supplier's confirmation.
Discount: Any discounts applied are consistent between the purchase order and the order confirmation.
Equals: If the order confirmation's quantity, unit price, or discount exactly matches the purchase order, the system considers the confirmation as valid and proceeds with the next steps in the procurement process.
Not Equal: If there's a discrepancy in the quantity, unit price, or discount, the system flags the order confirmation for manual review. This ensures any mismatches are resolved before moving forward.
Accuracy and Consistency: Maintains accuracy in the procurement process, ensuring that payments and deliveries are made based on correct figures.
Efficiency: Automates the verification process, reducing the need for manual checks and speeding up order processing.
Cost Control: Helps prevent overpayments or incorrect deliveries by catching discrepancies early in the process.
Define Comparison Parameters: Set up the specific fields (quantity, unit price, discount) that the logic card will check for a match.
Automate Verification: Configure the system to automatically compare these details upon receipt of an order confirmation.
Customize Alerts: Decide on the workflow for handling discrepancies, including customization of alerts for manual review.
This logic card is vital for ensuring that the details of an order confirmation align with the original purchase order, safeguarding the integrity of the procurement cycle.
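The verification steps above can be sketched as a line-by-line field comparison; the function and the dict-based line model are illustrative assumptions:

```python
# Sketch of order-confirmation verification: return the fields that do not
# match the purchase order. An empty result means the confirmation is valid.

def verify_order_confirmation(po_line, confirmation_line):
    return [
        field
        for field in ("quantity", "unit_price", "discount")
        if po_line[field] != confirmation_line[field]
    ]

po = {"quantity": 100, "unit_price": 50.0, "discount": 0.05}
conf = {"quantity": 100, "unit_price": 52.0, "discount": 0.05}
print(verify_order_confirmation(po, conf))  # ['unit_price'] -> manual review
```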
Purpose
This workflow card facilitates operations based on the assignment of a task or document to a single, specific user. Using a direct conditional logic approach, it manages workflows that require targeted user engagement, ensuring precision in user-based task handling.
Components of the Card
Operator
Description: Specifies the logic to apply to the user assignment.
Options:
IS: Triggers the operation if the assigned user of the document or task matches the specified user.
IS NOT: Triggers the operation if the assigned user does not match the specified user.
User
Description: Allows selection of a single user against whom the assigned user will be compared.
Detail: This involves a simple dropdown or autocomplete field where one user can be selected at a time.
Functionality
User Assignment Identification: Identifies the user currently assigned to a specific task or document.
Condition Evaluation:
For the IS operator, the card checks if the assigned user is the same as the user selected.
For the IS NOT operator, it verifies that the assigned user is different from the selected user.
Action Execution:
True Condition: If the assignment meets the set condition (IS or IS NOT), it triggers predefined actions, which could include moving forward with approvals, initiating further tasks, sending notifications, or other related workflows.
False Condition: If the condition fails, the system can reroute the task, hold it for review, or trigger alternative predefined actions.
User Interactions
Setup and Configuration: Users set up the card by choosing an operator and selecting a user from the user field. This setup should be straightforward, ensuring easy user selection and configuration.
Monitoring and Reporting: Offers tools for monitoring the card’s performance, such as tracking which tasks are triggered by specific user assignments and the outcomes of these triggers.
Error Handling and Notifications: Provides mechanisms to alert users if tasks are incorrectly assigned or if operational errors occur due to assignment issues.
The "Single Assigned User Condition" workflow card is essential for precise, user-specific document and task management within an ERP system. It simplifies workflows by focusing on individual user assignments, thus ensuring that actions are only executed when appropriate, based on the user's role and responsibilities. Documenting this card clearly will assist users in understanding its application, allowing them to implement and manage it effectively within their daily operations. This documentation ensures that all potential users can easily grasp the card's purpose and integrate it seamlessly into their workflows.
Purpose
This workflow card is tailored to manage operations on documents based on a single, specified document status. By simplifying the condition to one status, the card is focused on very specific workflow triggers, making it ideal for targeted document processing activities within an ERP system.
Components of the Card
Operator
Description: Specifies the method for evaluating the document’s status against the selected condition.
Options:
is: Triggers the operation if the document's current status matches the selected status.
is not: Triggers the operation if the document's current status does not match the selected status.
Status
Description: Allows the selection of a single document status to set the condition.
Examples of Statuses: "Error", "Export Error", "Ready in Validation", "Ready in Review", "Pending Approval", "Pending Second Approval".
Detail: Users choose one status from a dropdown or a set of radio buttons. This status then serves as the criterion for the card’s operation.
Functionality
Document Status Identification: Identifies the current status of a document as it is processed through the ERP system.
Condition Evaluation:
Based on the operator selected (is or is not), the card checks whether the document's current status aligns with the chosen status criterion.
Action Execution:
True Condition: If the status matches (or does not match, based on the operator), the corresponding action is initiated. This could be routing for further processing, notification generation, or other predefined workflows.
False Condition: If the condition is not met, no action is taken, or an alternate pathway is triggered.
Integration with Other Workflows: Even though it's designed for single-status evaluation, this card can be effectively integrated into broader workflow sequences to ensure precise document handling.
User Interactions
Setup and Configuration: Users set up the card by selecting an operator and then choosing one status from the available options. This selection process is straightforward and designed to prevent confusion.
Monitoring and Reporting: Enables monitoring through system-generated reports or dashboards that track the processing of documents based on their status, helping to oversee the effectiveness of the implemented workflows.
Error Handling and Notifications: Configurable to alert users to any processing anomalies or to flag documents that do not meet the set conditions, ensuring prompt attention and resolution.
The "Single Document Status Condition" workflow card simplifies document management by focusing on individual status conditions. This specification helps in cases where precise control over document flows is necessary, especially in environments with stringent processing criteria. Documenting this version of the card clearly will ensure that users fully understand its application and can effectively integrate it within their daily operations, enhancing both compliance and efficiency in document processing.
This DocBits card compares the supplier on the invoice with the supplier on the order confirmation, ensuring that the supplier who issued the invoice is the same as the one named in the order confirmation.
Functionality:
Supplier on Invoice (Operator) Supplier on Purchase Order: This card checks whether the supplier on the invoice is the same as the supplier on the order confirmation.
Operator Value: Users set the condition to check, i.e., whether the supplier who issued the invoice is or is not the same as on the PO. Available operators include:
Equals (=): Checks whether the supplier in the invoice matches the supplier in the order confirmation.
Not Equal (≠): Triggers when the supplier who issued the invoice differs from the supplier in the order confirmation.
Usage: This card helps ensure that the entire process is handled by the same supplier and that everything fits together. If there are discrepancies, attention is drawn to them so that the invoice is not paid to an incorrect supplier who has nothing to do with the order and order confirmation.
Example Scenario:
An order is placed, the order confirmation arrives, and then the invoice is issued. The entire ordering process should be carried out with one supplier. If that is not the case, the card immediately detects the discrepancy between the suppliers, ensuring that no incorrect payments are made and that the invoice is settled only with the supplier who was involved in the entire process.
By using the “Supplier on Invoice … Supplier on Purchase Order” card, companies can automate the verification of suppliers who issue invoices and the associated order confirmations.
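A minimal sketch of the supplier comparison this card performs, under the assumption that supplier names are compared as normalized strings (the function name and normalization rules are illustrative, not the DocBits implementation):

```python
def check_supplier(invoice_supplier: str, po_supplier: str,
                   operator: str) -> bool:
    """Apply the card's operator to the two supplier names.

    Normalizes casing and surrounding whitespace so trivial
    formatting differences do not produce false mismatches.
    """
    same = invoice_supplier.strip().casefold() == po_supplier.strip().casefold()
    if operator == "=":
        return same
    if operator == "!=":
        return not same
    raise ValueError(f"Unknown operator: {operator!r}")

# Same supplier on both documents: the "=" condition holds.
assert check_supplier("Acme GmbH", "  acme gmbh ", "=")
# Different suppliers: the "!=" condition flags the discrepancy.
assert check_supplier("Acme GmbH", "Globex Corp", "!=")
```

In practice the comparison may be keyed on a supplier ID rather than the display name, which avoids ambiguity between differently spelled names for the same vendor.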
This card automatically checks whether a given search text is contained in a specified field of a document. If the entered text is not found, other options determine how to continue with the document ("Then" workflow cards).
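The check itself amounts to a substring test on the field's value. A hypothetical sketch (the `case_sensitive` flag is an assumption for illustration; the actual card's matching rules may differ):

```python
def field_contains_text(field_value: str, search_text: str,
                        case_sensitive: bool = False) -> bool:
    """Return True if search_text occurs anywhere in the field's value."""
    if not case_sensitive:
        field_value = field_value.casefold()
        search_text = search_text.casefold()
    return search_text in field_value

# Found: the workflow continues normally.
assert field_contains_text("Invoice No. INV-2024-001", "inv-2024")
# Not found: a "Then" card decides how to proceed with the document.
assert not field_contains_text("Invoice No. INV-2024-001", "credit note")
```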
Purpose: This Docbits card facilitates the detailed comparison of unit prices on invoices against those specified in the corresponding purchase orders. It enhances accuracy in financial reporting by ensuring adherence to agreed pricing.
Functionality:
Unit Price Combined with Fields: This card calculates the total amount for a specific item by combining the unit price with additional specified fields from the invoice. The combined total is then compared against the purchase order's recorded unit price for that item.
Operator Value: Users can set specific conditions for how the invoice's combined unit price should be compared to the purchase order's unit price. Available operators include:
Equals (=): Verifies that the combined invoice price matches the price on the purchase order.
Not Equal (≠): Ensures the combined invoice price does not match the price on the purchase order.
Greater Than (>): Checks if the combined invoice price exceeds the price on the purchase order.
Less Than (<): Confirms the combined invoice price is below the price on the purchase order.
Usage: This card is particularly valuable for ERP Managers and financial accountants tasked with maintaining stringent control over purchasing and payment processes. It ensures that invoiced prices conform to those agreed upon in purchase orders, thereby mitigating financial discrepancies.
Example Scenario:
An invoice presents a unit price of $50 for a product. The "field name" specified includes an additional handling fee of $5 per unit. When combined, the total per unit amounts to $55. Using the "Equals" operator with a value set to $55, the card verifies that the invoiced price aligns with the purchase order, ensuring agreement compliance.
By deploying the "Compare with Purchase Order: Unit Price Combined" card, businesses can automate the verification of pricing accuracy against purchase orders, streamlining financial operations and safeguarding against overcharges.
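The combined-price comparison from the example scenario can be sketched as follows. This is an illustrative model, not the DocBits implementation; `Decimal` is used because monetary amounts should not be compared as binary floats:

```python
import operator
from decimal import Decimal

# Map the card's operator symbols to Python comparison functions.
OPERATORS = {"=": operator.eq, "≠": operator.ne,
             ">": operator.gt, "<": operator.lt}

def compare_combined_unit_price(unit_price: Decimal, extra_charges: list,
                                op_symbol: str, po_unit_price: Decimal) -> bool:
    """Combine the invoice unit price with the additional per-unit fields,
    then compare the total against the purchase order's unit price."""
    combined = unit_price + sum(extra_charges, Decimal("0"))
    return OPERATORS[op_symbol](combined, po_unit_price)

# The scenario from the text: $50 unit price plus a $5 handling fee,
# checked with the "Equals" operator against a $55 PO price.
assert compare_combined_unit_price(Decimal("50"), [Decimal("5")],
                                   "=", Decimal("55"))
# A higher handling fee would trip the "Greater Than" check instead.
assert compare_combined_unit_price(Decimal("50"), [Decimal("5.50")],
                                   ">", Decimal("55"))
```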
Purpose of the "When" Section
The "When" section in a workflow configuration defines the conditions that trigger a specific action in the workflow. These conditions are based on specific criteria concerning document attributes or user activity in the ERP system.
How It Works
In the interface, "When" is the starting point where users can select various trigger cards. Each card specifies the conditions under which subsequent actions (defined in another section of the workflow configuration, most likely labeled "Action") will be executed.
Document Type Condition Cards
The cards shown in the screenshot are variants of "Document Type" conditions, which trigger workflows based on the type of document being processed. Here is a breakdown of each condition card:
Document Type (Operator) one of (Type): This card triggers an action when the document type matches one of the types on a specified list. The operator can include options such as "is" or "is not", allowing inclusive or exclusive conditions.
Document Type (Operator) (Type): This simpler variant triggers on a single document type condition. It typically checks whether the document type "is" or "is not" a specific type, without the option to choose from multiple types.
Selecting the Condition Type: Users start by selecting the condition type relevant to the workflow they want to automate. In this case, the focus is on document types.
Specifying the Operator: Users then decide on a logical operator, such as "is" or "is not", which forms the basis for comparing actual document types against the defined conditions.
Specifying Document Types: Depending on the card, users can select one or more document types that will trigger the workflow when documents of those types are processed.
Finalizing the Trigger: Once configured, the condition becomes the basis for triggering the actions defined in the workflow. If a document meets the set condition, the defined actions are launched automatically.
In practice, these trigger cards are key to automating processes such as approvals, notifications, or any procedure that depends on the document type. For example, if the document type "is" "Invoice" and meets the conditions set on the "When" card, the workflow can automatically route the document for payment processing.
This configuration ensures that workflows are not only efficient but also tailored to the organization's specific operational needs, reducing manual oversight and speeding up document-handling processes.
In summary, the "When" part of the workflow configuration sets the stage for automated actions based on specific, predefined conditions. It is a powerful tool for ensuring that the ERP system responds dynamically to business needs, improving both the productivity and accuracy of document management.
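Both document-type trigger cards reduce to the same membership test; the single-type variant is simply the "one of" check with a one-element list. A hypothetical sketch (names are illustrative, not the DocBits API):

```python
def document_type_trigger(doc_type: str, operator: str,
                          allowed_types: list) -> bool:
    """'Document Type (Operator) one of (Type)' card.

    The simpler single-type card is the same check with a
    one-element allowed_types list.
    """
    matches = doc_type in allowed_types
    if operator == "is":
        return matches
    if operator == "is not":
        return not matches
    raise ValueError(f"Unknown operator: {operator!r}")

# An "is" condition on ["Invoice", "Credit Note"] fires for invoices...
assert document_type_trigger("Invoice", "is", ["Invoice", "Credit Note"])
# ...while an "is not" condition excludes them from the workflow.
assert not document_type_trigger("Invoice", "is not", ["Invoice", "Credit Note"])
```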
Purpose
This workflow card is designed to perform specific operations on documents that are associated with a particular sub-organization or department within a company. It ensures that document processing adheres to the policies and requirements specific to different segments of the organization.
Components of the Card
Operator
Description: Defines the action or set of actions to be performed on the document.
Examples: This could include operators like "Review", "Approve", "Archive", "Distribute", or any other custom operation relevant to document management within the organization.
Sub-Organization
Description: Specifies the part of the organization or department for which the document operation is relevant.
Detail: This could be any designated area of the company, such as Human Resources, Finance, Marketing, etc., or smaller, specialized teams within these broader categories.
Functionality
Document Identification: The card first identifies the document(s) that need to be processed. This identification could be based on document type, source, date, or any other metadata.
Operation Execution: Based on the specified operator, the card executes the designated operation. This could be:
Review: Sending the document to the appropriate personnel or department for review.
Approve: Routing the document for necessary approvals within the sub-organization.
Archive: Moving the document to an archival system designed to store records as per organizational policies.
Distribute: Disseminating the document internally within the sub-organization or externally if required.
Compliance Checks: The card checks that all operations comply with the internal policies and legal requirements applicable to the specific sub-organization.
Feedback and Logging: Post-operation, the card provides feedback on the action taken and logs this information for audit trails and compliance tracking.
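The operation-execution and logging steps above can be modeled as a dispatch table that maps each operator to a handler and records every run for the audit trail. All names below are illustrative assumptions, not the actual DocBits interfaces:

```python
# Hypothetical handlers for two of the operators described above.
def review(doc: dict, sub_org: str) -> str:
    return f"routed {doc['id']} to {sub_org} reviewers"

def archive(doc: dict, sub_org: str) -> str:
    return f"archived {doc['id']} under {sub_org} retention policy"

HANDLERS = {"Review": review, "Archive": archive}

def run_operation(doc: dict, operator: str, sub_org: str,
                  audit_log: list) -> str:
    """Execute the operator's handler and log the action for auditing."""
    if operator not in HANDLERS:
        raise ValueError(f"Unsupported operator: {operator!r}")
    result = HANDLERS[operator](doc, sub_org)
    audit_log.append({"doc": doc["id"], "operator": operator,
                      "sub_org": sub_org, "result": result})
    return result

log = []
run_operation({"id": "DOC-1"}, "Review", "Finance", log)
assert log[0]["operator"] == "Review" and log[0]["sub_org"] == "Finance"
```

A dispatch table keeps the card extensible: adding a custom operation for a sub-organization means registering one more handler rather than editing the routing logic.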
User Interactions
Configuration: Users set up the card by specifying the operator and the sub-organization. They might also define specific rules or triggers for when the card should activate.
Monitoring: Users can monitor the card's activity via a dashboard that shows ongoing and completed operations, providing transparency into document handling processes.
Manual Override: In some cases, users might have the ability to manually intervene or alter the course of an operation, such as escalating an issue or correcting document routing errors.
The "Document Operator for Sub-Organizations" card is a crucial tool for managing documents in a structured and efficient manner, particularly in larger organizations where different departments have unique operational needs and compliance requirements. Documenting this card clearly in your ERP system's manual will help users understand its importance and implement it effectively within their workflows. If additional customization or functionality descriptions are needed, feel free to expand based on specific organizational needs and technical capabilities.
Purpose
This workflow card is specifically designed to perform predefined operations on documents categorized by type. It streamlines the handling of various document forms within an ERP system, ensuring that each type is processed according to its unique requirements and organizational policies.
Components of the Card
Operator
Description: Specifies the action to be executed on the document.
Examples: Common operations include "Validate", "Store", "Process", "Send", etc. Each operator defines a set of tasks that the system automates based on the document type.
Document Type
Description: Identifies the category of the document to which the operation will apply.
Detail: Types could include invoices, purchase orders, contracts, employee records, etc. Each type has specific rules and workflows associated with it.
Functionality
Document Classification: Automatically identifies and classifies documents as they enter the ERP system based on their metadata, content, or other identifiers.
Operation Execution: Executes the specified operation for documents of the identified type. This execution could involve:
Validate: Checking the document for completeness, correctness, and compliance with standards.
Store: Saving the document in the designated repository with proper indexing.
Process: Applying business logic to the document, such as calculating totals on an invoice or updating database records.
Send: Distributing the document to other business units or external partners based on workflow requirements.
Compliance and Security: Ensures that all operations adhere to regulatory compliance and security protocols specific to the document type.
Automation and Integration: Seamlessly integrates with other workflows in the ERP system, facilitating automated transitions between different operational stages.
User Interactions
Setup and Configuration: Users configure the card by defining both the operator and the document type. Additional parameters might be set depending on the complexity of the operation.
Monitoring and Reporting: Users can monitor the operations applied to various document types through a dashboard that provides real-time status updates, logs, and reports.
Error Handling and Overrides: Provides mechanisms for handling errors or exceptions during operation execution. Users can intervene manually if necessary, adjusting processes or rerouting documents.
The "Document Type Operation" workflow card is a vital component for managing document-based processes within an ERP system efficiently. It automates routine tasks, reduces errors, and ensures consistency across similar types of documents, thereby enhancing overall productivity and compliance. Documenting this card effectively in your system's manual will assist users in understanding its functionality and how to leverage it to optimize document management processes in their daily operations. If there are additional specific details or examples that need to be included based on your ERP system's capabilities or industry-specific needs, those should be tailored accordingly.
This workflow outlines the conditions under which an export process should be initiated. It ensures that only documents meeting all specified criteria are processed for export, enhancing data integrity and alignment with business rules.
A document within the system is evaluated for export eligibility.
Document Type Check
The document must be of a certain type (e.g., "Invoice" or "Receipt"). Specify the document type that qualifies for the export process.
Status Verification
The document's current status must meet predefined criteria (e.g., "Approved" or "Ready for Export") indicating it is ready for further processing.
Contextual Conditions
Additional checks are performed to ensure the document's details align with specific requirements. These checks might involve verifying information within order confirmations or purchase orders. Specify the particular conditions that need to be met. For example:
All items listed in the order confirmation match those in the purchase order.
The total amount in the order confirmation matches the total amount in the purchase order.
The delivery dates specified in the order confirmation align with those in the purchase order.
Initiate Export
If all the above conditions are satisfied, the system automatically starts the export process for the document.
This may involve generating an export file, sending data to an external system, or triggering a workflow in another application.
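The eligibility steps above (type check, status verification, contextual conditions) can be sketched as a chain of predicates; the export only starts when all of them pass. Field names here are assumptions for illustration:

```python
def is_export_eligible(doc: dict, required_type: str,
                       ready_statuses: set, contextual_checks: list) -> bool:
    """Steps 1-3 above: type check, status verification, and
    contextual conditions (each check is a predicate on the document)."""
    if doc["type"] != required_type:
        return False
    if doc["status"] not in ready_statuses:
        return False
    return all(check(doc) for check in contextual_checks)

# Example contextual condition: confirmation total matches the PO total.
totals_match = lambda d: d["confirmation_total"] == d["po_total"]

doc = {"type": "Invoice", "status": "Approved",
       "confirmation_total": 120.0, "po_total": 120.0}
assert is_export_eligible(doc, "Invoice",
                          {"Approved", "Ready for Export"}, [totals_match])
```

Expressing the contextual conditions as a list of predicates mirrors the document's examples (matching items, totals, and delivery dates) without hard-coding any one of them.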
AP Invoice Email: The process likely begins with an invoice received via email.
DocBits: This tool might be used for initial document management tasks such as capturing and digitizing invoices.
Finance Review: Invoices undergo a finance review where decisions are made regarding their validity and accuracy.
Initial Review:
Invoices are received and initially processed using DocBits.
They are then reviewed by the finance team: complete invoices are removed from the workflow, while the rest are pushed forward for further processing.
PO vs Non-PO Invoices:
The workflow distinguishes between PO-related and non-PO invoices.
Non-PO invoices are routed for further approval or rejection based on predefined criteria like supplier ID, quantity, unit price, and item number.
Matching and Mismatching:
Invoices are checked against goods receipts to ensure that details match (like supplier ID and quantity).
If mismatches occur, the invoice is subject to further review and possibly rejection.
Finance and Buyer Review:
For PO-related invoices, a detailed matching process is conducted involving a buyer review.
Adjustments to purchase orders or goods receipts might be required.
Final Decisions:
Invoices that pass all checks are approved and integrated into financial systems for record-keeping.
Rejected invoices trigger notifications, and a new invoice may be requested by the buyer.
Integration with Infor IDM & LN+M3:
Approved invoices are likely sent to Infor's IDM for document management and to LN for ledger posting.
This integration ensures that all financial records are up-to-date and that the workflow seamlessly feeds into the broader ERP system.
Throughout the workflow, there are various decision points where an invoice might be approved, rejected, or sent back for additional information. Notifications are sent out after delays, ensuring timely processing.
These Workflows will be included in the Standard Workflow
This title indicates that the rule is designed to manage cases where the invoice total is greater than the maximum amount an approver is authorized to handle.
When…
Document Type is Invoice: This condition ensures that the rule applies only to invoices, which is essential for directing the workflow correctly.
And…
Document Status is Pending Approval: The invoice must be in a "Pending Approval" status. This status is crucial to ensure that the rule is applied to invoices that are still being processed and have not yet been finalized.
Compare two fields: Total Amount Greater Than Approver Max Amount: This condition checks if the invoice's total amount exceeds the maximum amount an approver is allowed to handle. This comparison might also include a tolerance setting, allowing for minor variations based on predefined criteria.
Assign user from field Next Level Approver, use user User as fallback: If the invoice exceeds the specified maximum amount, it is automatically assigned to a higher-level approver, indicated by the 'Next Level Approver' field. If this field is not filled or the specified user is unavailable, a default user (likely an admin or another designated staff member) is used as a fallback to ensure the invoice is reviewed without delay.
Add Card: This option allows additional conditions or actions to be added to the rule, providing flexibility to address complex scenarios.
Save: This button saves the rule configuration to the system.
The purpose of this rule is to ensure that invoices which exceed certain financial thresholds are reviewed by approvers with the appropriate authorization levels. This helps in maintaining financial control and oversight, ensuring that expenditures are reviewed by personnel with the requisite approval limits, thereby safeguarding the organization against unauthorized or inappropriate expenditures.
This rule, like the previous one, helps automate the workflow, reducing manual effort and enhancing compliance with the organization's financial policies. It is an example of how workflow automation can be effectively used to manage complex financial processes within a company.
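The escalation logic of this rule, including the optional tolerance and the fallback user, can be sketched as follows. Field names and the multiplicative tolerance are assumptions for illustration, not the DocBits configuration format:

```python
def pick_approver(invoice: dict, fallback_user: str,
                  tolerance: float = 0.0):
    """Escalate to the next-level approver when the total exceeds the
    approver's limit (plus an optional tolerance); return None when no
    escalation is needed."""
    if invoice["type"] != "Invoice" or invoice["status"] != "Pending Approval":
        return None
    limit = invoice["approver_max_amount"] * (1 + tolerance)
    if invoice["total_amount"] > limit:
        # Fallback user guarantees the invoice is never left unassigned.
        return invoice.get("next_level_approver") or fallback_user
    return None

inv = {"type": "Invoice", "status": "Pending Approval",
       "total_amount": 5200.0, "approver_max_amount": 5000.0,
       "next_level_approver": None}
# 'Next Level Approver' field is empty, so the fallback user steps in.
assert pick_approver(inv, "admin") == "admin"
# A 5% tolerance (limit 5250) lets the original approver keep this invoice.
assert pick_approver(inv, "admin", tolerance=0.05) is None
```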
This title indicates that the rule is specifically configured for managing cost invoices and involves an export action, possibly for reporting, further processing, or integration with other systems.
When…
Document Type is Invoice: This condition ensures that the rule is triggered only for documents categorized as invoices, maintaining the workflow's specificity to invoice management.
And…
Document Field Invoice Sub Type is Equals Cost Invoice: This specifies that the rule applies only to those invoices that are explicitly marked as "Cost Invoices" in a particular field within the document. This helps in distinguishing them from other types of invoices.
Document Status is Pending Second Approval: The invoice must be in a "Pending Second Approval" status. This indicates that the invoice has already undergone an initial approval and is awaiting a second, possibly final, review.
Start Export: Once the invoice meets the specified conditions (being a cost invoice and pending second approval), the action to "Start Export" is executed. This could involve sending the invoice data to another system for financial analysis, reporting, or compliance purposes.
Workflow Efficiency: This rule helps automate the handling of cost invoices by ensuring they are processed through the necessary approval stages without manual intervention, increasing the speed and accuracy of financial operations.
Control and Compliance: By requiring a second approval, the system enforces a control mechanism that ensures cost invoices are thoroughly reviewed, enhancing financial oversight.
Integration and Reporting: The export action suggests that once invoices are fully approved, they may be integrated into other systems for further processing or analysis, which is critical for financial reporting and audits.
This kind of rule is vital for organizations that deal with various types of invoices and need to ensure that each type is handled according to specific protocols. It reduces the risk of errors and ensures compliance with internal controls and external regulations.
This title suggests that the rule or condition being set up is designed to handle invoices where the total amount is less than or equal to a specified maximum amount.
When…
Document Type is Invoice: This condition checks if the document being processed is an invoice. This is crucial for ensuring that the rule only applies to invoices and not other types of documents.
And…
Document Status is Pending Approval: This specifies that the invoice must be in a "Pending Approval" status. This status check ensures that the rule applies only to invoices awaiting approval.
Compare two fields: Total Amount Less Or Equals Approver Max Amount: This condition compares the total amount of the invoice to an approver's maximum authorized amount. If the invoice's total amount is less than or equal to this max amount, the rule continues to the next step. This likely includes a tolerance level that allows for minor deviations within specified limits.
Assign user from field Approver Name, use user User as fallback: If the conditions specified are met, the invoice is automatically assigned to an approver whose name is specified in a field. If this field is empty or unavailable, a default user (likely an admin or another designated staff member) is assigned as a fallback to handle the approval.
Add Card: This button likely allows users to add more conditions or actions to the rule, enhancing the flexibility and specificity of the workflow.
Save: Saves the configured rule to the system.
This setup is designed to streamline the approval process for invoices by automatically directing invoices to the appropriate approver based on the amount and ensuring that only those within a certain threshold are handled in this automated way. It helps in managing financial controls and speeds up the workflow by reducing manual checks for each invoice.
This title indicates that the rule is set up to manage the second approval phase for purchase invoices with an emphasis on the quantity details, ensuring that the quantities on the invoice match those on the original purchase order.
When…
Document Type is Invoice: This condition ensures that the rule is activated only for documents identified as invoices, which is crucial for directing the workflow accurately.
And…
Document Status is Pending Second Approval: This specifies that the invoice is currently pending a second approval. This stage often provides additional oversight to ensure accuracy before the transaction is finalized.
Document Field Invoice Sub Type is Equals Purchase Invoice: This condition further specifies that the rule applies only to invoices categorized specifically as "Purchase Invoices," differentiating them from other types of invoices.
Logic Quantity in order confirmation Equals purchase order: This condition checks if the quantity listed in the order confirmation matches the quantity in the purchase order. It ensures that the invoice processing only moves forward if the quantities are consistent, which is critical for inventory management and financial accuracy.
Start Export: Once the invoice meets the specified conditions (i.e., the quantities match between the order confirmation and the purchase order), the action to "Start Export" is triggered. This likely involves exporting the invoice data for further processing, possibly to another financial system or for reporting purposes.
Ensure Accuracy and Consistency: By verifying that the quantities match between the order confirmation and the purchase order, the system helps maintain inventory accuracy and prevents discrepancies that could affect financial reporting or stock management.
Streamline Financial Processing: Automating the export of data once the quantities are confirmed reduces manual handling and speeds up the financial processing cycle.
Enhance Compliance and Oversight: Requiring a second approval for quantity verification adds an extra layer of oversight, crucial for compliance with financial policies and controls.
This rule is a clear example of how workflow automation can be effectively used to ensure precise and efficient handling of financial documents within an organization, particularly in the context of purchase processes that involve large volumes of transactions requiring meticulous validation.
This title indicates that the rule pertains specifically to handling purchase invoices during a secondary approval phase, with a focus on verifying the accuracy of the quantities listed.
When…
Document Type is Invoice: This condition ensures that the rule is activated only for documents classified as invoices. This is essential for maintaining specificity and relevance in the workflow.
And…
Document Status is Pending Second Approval: This specifies that the invoice is currently pending a second approval. This stage is typically intended to provide additional oversight before finalizing the invoice.
Document Field Invoice Sub Type is Equals Purchase Invoice: This condition further refines the rule to apply exclusively to invoices identified as "Purchase Invoices." This categorization helps differentiate them from other invoice types.
Logic Quantity in order confirmation Not Equals purchase order: This critical condition checks whether the quantity stated in the order confirmation matches the quantity on the original purchase order. The action is triggered if there is a discrepancy, indicating a potential error or issue that needs resolution.
Assign user from field Buyer Name, use user User as fallback: If the rule's conditions are met (i.e., there's a discrepancy in quantities), the invoice is automatically assigned to the person listed in the 'Buyer Name' field for further review. If this field is empty or the specified person is unavailable, a default user (likely an administrator or another designated staff member) takes over to ensure timely review and resolution.
Accuracy and Compliance: The rule is vital for ensuring that the invoicing process is accurate and aligns with the terms agreed upon in the purchase order. It helps prevent financial discrepancies and potential inventory errors.
Streamlined Approvals: Automating the review process for specific discrepancies helps streamline approvals and ensures that any issues are quickly addressed by the appropriate personnel.
Enhanced Financial Oversight: Requiring a secondary approval for quantity verifications strengthens financial controls and accountability within the organization.
This setup exemplifies how workflow automation can be utilized to enhance operational efficiency and ensure financial integrity, particularly in managing complex purchase processes within a company.
This title indicates that the rule is set up to manage the second approval phase of purchase invoices with a focus on the unit price, ensuring it matches the agreed terms.
When…
Document Type is Invoice: This condition ensures that the rule is activated only for documents identified as invoices, which is crucial for directing the workflow accurately.
And…
Document Status is Pending Second Approval: This specifies that the invoice is awaiting a second approval. This stage often provides additional oversight to ensure accuracy before finalizing the transaction.
Document Field Invoice Sub Type is Equals Purchase Invoice: This condition further specifies that the rule applies only to invoices categorized specifically as "Purchase Invoices," differentiating them from other types of invoices.
Logic Unit Price in order confirmation Equals purchase order: This condition checks if the unit price listed in the order confirmation matches the unit price in the purchase order. It ensures that the invoice processing only moves forward if there is consistency in pricing, which is critical for budgeting and financial reporting.
Start Export: Once the invoice meets the specified conditions (i.e., the unit prices match between the order confirmation and the purchase order), the action to "Start Export" is triggered. This likely involves exporting the invoice data for further processing, possibly to another financial system or for reporting purposes.
Ensure Accuracy and Consistency: By verifying that the unit prices match between the order confirmation and the purchase order, the system helps maintain financial accuracy and prevents overcharging or undercharging.
Streamline Financial Processing: Automating the export of data once the prices are confirmed reduces manual handling and speeds up the financial processing cycle.
Enhance Compliance and Oversight: Requiring a second approval for price verification adds an extra layer of oversight, which is crucial for compliance with financial policies and controls.
This rule is an example of how workflow automation can be effectively utilized to ensure precise and efficient handling of financial documents within an organization, particularly in the context of large volumes of transactions that require meticulous validation.
This title indicates that the rule is set up to manage the second approval phase of a purchase invoice, with a specific focus on validating the unit price.
When…
Document Type is Invoice: This condition ensures that the rule is triggered only for documents that are identified as invoices, filtering out other document types and maintaining the relevance of the workflow.
And…
Document Status is Pending Second Approval: This specifies that the invoice is in the phase where it is awaiting a second approval. This is usually a step designed to ensure additional oversight before final processing.
Document Field Invoice Sub Type is Equals Purchase Invoice: This further narrows down the application of this rule to only those invoices that are classified as "Purchase Invoices", distinguishing them from other invoice subtypes.
Logic Unit Price in order confirmation Not Equals purchase order: This logical check is crucial as it compares the unit price listed in the order confirmation against the unit price in the original purchase order. The action is triggered if these values do not match, which could indicate a discrepancy that needs resolution.
Assign user from field Buyer Name, use user User as fallback: If the conditions specified are met (i.e., there's a mismatch in unit prices), the invoice is automatically assigned to a buyer (the name specified in the 'Buyer Name' field) for further review. If the 'Buyer Name' field is empty or unspecified, a default user (likely an administrator or another designated staff member) is assigned as a fallback to handle the approval.
Ensure Accuracy and Compliance: This rule is critical in ensuring that the invoicing process is accurate and complies with agreed terms. By triggering a review when there is a discrepancy in unit prices, the system helps prevent financial errors or potential fraud.
Streamline Approvals: Automating the assignment for review based on specific discrepancies helps streamline the approval process and ensures that issues are promptly addressed by the appropriate personnel.
Financial Oversight: Requiring a second approval, especially based on price matching, reinforces financial controls and accountability within the organization.
Workflow Documentation
To keep an overview, you can give the workflows different headings so that you can immediately know what task this workflow is about.
Create a new Workflow: Click on + ADD WORKFLOW
You can use these workflows (Test 1,2,3) to automatically assign various documents to the right employee in the company.
If an invoice or other document exceeds a certain total amount that requires prior review and approval, these documents can be immediately assigned to the correct person.
Test 1: Logic Card
When: Assignee is: Amier Haider
And: Document type is: Invoice
Then: Assign document to: Stefan Reppermund
Test 2: Logic Card
When: Assignee is: Amier Haider
And: Document type is: Delivery Note
Then: Assign document to: James Edwards
Test 3: Logic Card
When: Assignee is: Amier Haider
And: Document type is: Order Confirmation
Then: Assign document to: Anian Sollinger
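Tests 1 to 3 together form a simple routing table: documents sitting with one initial assignee are forwarded by document type. A minimal sketch, using the names from the example cards (the function and its signature are illustrative, not a DocBits API):

```python
# Routing taken from the Test 1-3 logic cards above.
ROUTING = {
    "Invoice": "Stefan Reppermund",
    "Delivery Note": "James Edwards",
    "Order Confirmation": "Anian Sollinger",
}

def reassign(assignee, doc_type):
    """Forward documents held by the initial assignee based on document type."""
    if assignee == "Amier Haider" and doc_type in ROUTING:
        return ROUTING[doc_type]
    return assignee  # no rule matched: keep the current assignee

print(reassign("Amier Haider", "Invoice"))  # -> Stefan Reppermund
```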
It is also possible, if the document is not assigned to a single person, to assign it to a specific employee from the start.
For an easier overview of what should happen to a document, you can set the status for incoming documents in this workflow. This workflow makes it possible to immediately see whether there is, for example, a pending approval.
Test 4: Logic Card
When: Document type is: Delivery Note
And: Assignee is: Amier Haider
Then: Change Status to: Pending Approval
Test 5: Logic Card
When: Document type is: Invoice
And: Assignee is: Stefan Reppermund
Then: Change Status to: Pending Second Approval
If an invoice or other document exceeds a certain total amount that requires prior review and approval, these documents can be assigned to the right person immediately.
Test 6: Logic Card
When: Assignee is: Amier Haider
And: Docfield total_amount is Greater than 500
Then: Assign document to: Asad Usman Khan
It is also possible to enter the status into the workflow, so the assigned person can immediately see what status the document has and what should happen next with it.
Test 7: Logic Card
When: Assignee is: Amier Haider
And: Docfield total_amount is Greater than 500
Then: Assign document to: Asad Usman Khan
Change Status to: Pending Approval
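Tests 6 and 7 can be summarized as one threshold rule: reassign the document and set its status when the total amount exceeds the limit. A hedged sketch, assuming the field names (assignee, total_amount, status) as illustrative placeholders:

```python
def apply_amount_rule(doc, threshold=500):
    """Reassign and flag documents whose total amount exceeds the threshold."""
    if doc.get("assignee") == "Amier Haider" and doc.get("total_amount", 0) > threshold:
        doc["assignee"] = "Asad Usman Khan"
        doc["status"] = "Pending Approval"
    return doc

doc = {"assignee": "Amier Haider", "total_amount": 750}
print(apply_amount_rule(doc)["assignee"])  # -> Asad Usman Khan
```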
For example, if certain information is missing from a document but is required for further processing, you can set up the workflow so that these documents are immediately forwarded to the buyer and a substitute (replacement).
Test 9:
The Workflow with these logic cards is designed to automatically verify that the quantity, unit price, or discount detailed in an order confirmation matches the corresponding figures in the purchase order. This verification ensures consistency and accuracy between what was ordered and what the supplier confirms to deliver.
You can give these documents a specific status or assign them to a specific employee.
Logic Card: Quantity or Unit Price or Discount Match
This logic card automatically verifies that the quantity, unit price, or discount detailed in an order confirmation matches the corresponding figures in the purchase order.
Trigger Condition
The logic is activated when any of the following conditions are met in an order confirmation relative to the original purchase order:
Quantity: The quantity of items ordered matches the quantity confirmed by the supplier.
Unit Price: The price per item agreed upon matches the supplier's confirmation.
Discount: Any discounts applied are consistent between the purchase order and the order confirmation.
Define Comparison Parameters: Set up the specific fields (quantity, unit price, discount) that the logic card will check for a match.
Automate Verification: Configure the system to automatically compare these details upon receipt of an order confirmation.
Customize Alerts: Decide on the workflow for handling discrepancies, including customization of alerts for manual review.
This logic card is vital for ensuring that the details of an order confirmation align with the original purchase order, safeguarding the integrity of the procurement cycle.
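The comparison behind this card can be sketched as a simple field-by-field check that collects discrepancies for manual review. The field names are illustrative; the actual comparison parameters are configured on the logic card itself.

```python
# Fields the card compares between order confirmation and purchase order.
FIELDS = ("quantity", "unit_price", "discount")

def find_discrepancies(order_confirmation, purchase_order):
    """Return the list of fields where confirmation and order disagree."""
    return [
        f for f in FIELDS
        if order_confirmation.get(f) != purchase_order.get(f)
    ]

po = {"quantity": 10, "unit_price": 4.20, "discount": 0.05}
oc = {"quantity": 10, "unit_price": 4.50, "discount": 0.05}
print(find_discrepancies(oc, po))  # -> ['unit_price'] : route for review
```

An empty result means everything matches and the document can proceed; any entry in the list flags the document for manual handling.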
Test 10:
If you have a different calculation for surcharges, or only have them on some items, you can use the generic table calculation cards, some of which also allow filtering with regular expressions.
Above is a calculation example for MTZ with a filter for item numbers starting with 01, 06, 9, 001 or 000.
With a manual setup it’s advised to split calculations that depend on new columns into a separate workflow. To continue with the calculation you can use the Run Workflow card.
Run Workflow
With this card you can specify the name of a workflow to run after the current workflow, once the current workflow's conditions are met and its preceding Then cards have executed. While it prioritises runnable, active workflows, it also allows you to run deactivated workflows if the document fulfills the workflow's conditions.
If you want to add all surcharges as a negative discount into the discount column, you can use the calculation card. There might already be entries in this column; you can set the column as one of the variables on the card, have the MTZ subtracted from it, and write the result back into the column. If there are empty fields (surcharges on only some items), a 0 is assumed for the calculation.
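The calculation card's behavior can be sketched as follows. The column names (discount, mtz) are illustrative placeholders; the real card operates on whatever table columns you configure as its variables.

```python
def apply_surcharge(rows):
    """Subtract the MTZ surcharge from the discount column, treating
    empty cells as 0 (surcharges exist on only some items)."""
    for row in rows:
        discount = float(row.get("discount") or 0)  # empty field -> 0
        mtz = float(row.get("mtz") or 0)            # missing surcharge -> 0
        row["discount"] = discount - mtz            # surcharge as negative discount
    return rows

rows = [
    {"item": "A", "discount": "2.00", "mtz": "0.50"},
    {"item": "B", "discount": "", "mtz": ""},       # no surcharge on this item
]
apply_surcharge(rows)
print(rows[0]["discount"], rows[1]["discount"])  # -> 1.5 0.0
```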
Notify user to authorize the order confirmation in DocBits
After calculating the surcharges you might want to notify a specific user to authorize the order confirmation. For this you can use the notification card.
Depending on settings, the user gets assigned a new task in DocBits and optionally an email to notify them of their new task.
Explore the step-by-step workflow for material ordering, goods receipt, and invoice processing with INFOR ERP and DocBits integration. Effective supplier and document management in one.
This document outlines the workflow for ordering materials from suppliers, receiving goods, and processing invoices with integration between suppliers, INFOR ERP system, and DocBits for document management.
Ordering Materials
Initiate order with the supplier.
Send the order to the supplier through INFOR.
Receiving Order Confirmation
Supplier confirms the receipt of the order.
Creation and sending of order confirmation.
Goods Receipt and Inspection
Receive goods from the supplier.
Post goods receipt in INFOR and check against the delivery bill.
Invoice Processing
Receive invoice and send to DocBits for processing.
Check and verify invoice details with order and goods receipt.
Final Steps
Archive the order and related documents in DocBits.
Update INFOR with transaction details for financial accounting.
Does the received goods match the order?
Yes: Proceed with invoice processing.
No: Manual check and update required.
Is the invoice correct according to the goods received and order details?
Yes: Complete the transaction and update financial records.
No: Further examination and corrections needed.
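The two decision points above form a small decision tree: goods receipt is checked against the order first, and only then is the invoice verified. A sketch of that flow (the outcome strings are the ones listed above):

```python
def next_step(goods_match_order, invoice_correct):
    """Decide the next processing step from the two checks above."""
    if not goods_match_order:
        return "Manual check and update required"
    if not invoice_correct:
        return "Further examination and corrections needed"
    return "Complete transaction and update financial records"

print(next_step(True, True))  # -> Complete transaction and update financial records
```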
INFOR ERP: Main system for order processing, goods receipt, and financial accounting.
DocBits: Document management for processing and archiving invoices and order confirmations.
Ensure all documents are verified and archived for record-keeping.
Discrepancies in order or invoice details must be resolved promptly to avoid delays.
If a customer requires a new document type or additional fields to be added to an existing document type layout, this section will go through all the information required to do so.
In DocBits you will find the SETTINGS menu in the upper bar on the DASHBOARD.
If you are logged in to DocBits as an admin, you will find all fields of a document that can be extracted under the respective document type.
Open the menu for Document Types.
In the following overview you will find all standard document types available to you.
Activate/Extraction Type
To the right of each document type, you will see Activate and Extraction Type sliders.
Activate: This document type is active in your DocBits environment.
Extraction Type: This slider allows you to enable or disable a set of predefined rules for the document type when it is processed by DocBits. By selecting the gear icon to the right of the slider, the following menu will appear.
To see which fields can be extracted, for example from an invoice, click on FIELDS for this document type.
Field Settings
Here you will find all the fields that can be extracted.
You can also CREATE FIELDS like freight, postage or any field with an amount you want to extract from your invoices.
For each field you can check the boxes if they are:
REQUIRED: Here you can define if the field must contain a value to continue.
READ ONLY: Here you can define if a field can only be displayed but not edited.
HIDDEN: Here you can define whether a field should be hidden or displayed in the extraction view.
FORCE VALIDATION: Here you can define whether a field must always be validated manually, even if it has been read 100% by DocBits.
OCR and MATCH SCORE: Setting as described below, per field.
FORMULA: Creation of a formula per field.
Once all settings are made, confirm them with the SAVE SETTINGS button at the bottom of the page; otherwise the settings will not be applied.
Recognition Settings
OCR
Here you can set the sensitivity of the OCR (Optical Character Recognition) function for all fields at once. This value determines the sensitivity with which a field is marked in red if it could not be extracted with 100% certainty (OCR related!).
Match Score
This is where you can set the sensitivity of the MATCH SCORE function for all fields at once. This value determines when a field is marked in red if DocBits has not extracted the field with 100% probability. In this case the field needs to be validated manually.
The button RESTORE DEFAULTS will reset both values to “50”.
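One way to picture these thresholds: a field whose OCR or match score falls below the configured value is flagged (marked red) for manual validation. This is an illustrative interpretation, not DocBits' internal implementation; the default of 50 matches the RESTORE DEFAULTS value.

```python
def fields_to_validate(scores, ocr_threshold=50, match_threshold=50):
    """scores maps field name -> (ocr_score, match_score), each 0-100.
    Returns the fields that need manual validation."""
    return [
        name for name, (ocr, match) in scores.items()
        if ocr < ocr_threshold or match < match_threshold
    ]

scores = {"invoice_number": (98, 97), "total_amount": (42, 90)}
print(fields_to_validate(scores))  # -> ['total_amount']
```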
Profile
Here you can define the profile that shall be used. Either Default or ZUGFeRD. In profile ZUGFeRD there are predefined fields that are mandatory for this type of invoice. If you do not explicitly use ZUGFeRD, please select “Default”.
Format: JSON
Purpose: This step involves defining the structure of the EDI data. It includes specifying segments such as SAC, N1, and PO1, and details the fields contained within each segment. For segments that contain nested structures, loops are defined to properly organize the data hierarchy.
Format: XSLT
Purpose: This step involves transforming the structured JSON data into a structured XML format, specifically tailoring the output to meet the requirements for further processing or integration. This transformation helps in extracting precise information like acknowledgement types, order details, and conditional elements based on specific values.
Format: XSLT (outputting HTML)
Purpose: Converts the XML data from Step 2 into an HTML format for previewing the transformed data in a readable and visually appealing format. The HTML layout includes styles for presentation and structures data like purchase orders, supplier details, and order terms for easy viewing.
Format: JSON
Purpose: Specifies JSON paths for extracting key values from the XML data produced in Step 2. These paths are used to retrieve specific data points such as purchase orders and currency, which are crucial for downstream processing and integration into other systems.
This updated sequence ensures a thorough process, transforming raw EDI data into structured, actionable information using JSON for data structuring, XSLT for transformation and HTML preview, followed by JSON paths for data extraction and integration.
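Step 4 above amounts to path-based extraction from a nested structure. A minimal sketch of that idea; the dotted-path syntax and the document shape here are hypothetical, not the actual path language used in the pipeline:

```python
def get_path(data, path):
    """Follow a dotted path like 'order.currency' through nested dicts."""
    for key in path.split("."):
        if not isinstance(data, dict) or key not in data:
            return None  # path does not exist in this document
        data = data[key]
    return data

doc = {"order": {"number": "PO-1001", "currency": "EUR"}}
print(get_path(doc, "order.currency"))  # -> EUR
print(get_path(doc, "order.missing"))   # -> None
```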
First of all, ensure that the Layout Builder feature is activated. This can be done by navigating to Settings → Document Processing → Module → Document Type and ensuring that the Layout Builder slider is set to active as shown below.
After this is done you can access the Layout Builder via Settings → Document Types, once on this page, you can select from the various document types you have created and either select “Edit Layout” as shown below
or if you have sub-document types within a created document type you can select “Document Sub Types” and select “Edit Layout” for the sub document type layout you wish to edit as shown below.
After following the previous steps you will reach a page like the one shown below.
In order to upload a document to the layout builder, simply navigate to the right on the screen
Click on the “Upload Documents” button or drag and drop your desired document into the provided area
Groups can be created by selecting the following icon.
Groups allow you to create different sections on a layout; this makes it easier to separate different groups of data or information and makes a layout easier to follow. You can create a title for each group so that a user knows what information they will find in that group.
These are a set of default fields that can be dragged and dropped into the layout builder and are available to you to create your desired layout. These include:
Text – This is a text box which creates a field in the layout that can have text entered into it once on the validation screen.
Label – This is a field that can be used to create uneditable text, this could be used to create sub-headings or any other desired uneditable text when on the validation screen.
Checkbox – This creates a boolean type field which can be checked or unchecked.
Multi Checkbox – This functions the same way as the “Checkbox” but can be used when the user knows they will be adding multiple checkboxes in one section.
Horizontal Separator – This creates a horizontal line on the layout that can be used to split up sections within a group on the layout.
Table of Checkboxes – This lets the user create a table of checkboxes consisting of custom x- and y-axis values, e.g.
Button – This creates a clickable button on the validation screen within the layout that can be set to one of three functions, including: Export, Export mit Sonderwunsch or Reject.
Extracted Tables – This allows you to place an area on the document layout that illustrates the table that gets extracted from the document. For more information click here.
Invoice Buttons – This element lets you drag and drop a set of buttons that are optimized for invoices. When on the validation screen, when you select the invoice type (either cost or purchase) the PO Matching or Auto Accounting will disappear accordingly.
QR Code Fields – This element allows you to drag and drop a block that will display all the extracted information from a document when a QR code is present.
The user is able to create their own custom groups and fields for a document type; this can be done when originally creating a document type, but also by selecting “Fields” on the Document Types page in Settings.
In order to create the above space on the layout, a “Label” from the Form Elements must be used in a special way. The reason is that the Layout Manager operates on a 100-spaces-per-line system: 1 space represents 1 percent of a line, which means that fields can take up at most 100 spaces per line, as shown below.
This means that the user must build the layout line by line according to this rule. For example let's say you would like to add the fields “Name” and “Date” in the same line but would like the “Name” field to be larger. This can be done by dragging and dropping the “Text” field from the Field Elements drop down and naming each field “Name” and “Date” as shown.
The problem now is that both fields have the same size of 33 (the default size of all dragged and dropped fields), but you would like the “Name” field to be larger than the “Date” field, and both fields should take up the entire line on the layout. Following the 100 percent rule, you can set the “Name” and “Date” fields to any combination that adds up to 100. How large each field should be is up to you; for the purpose of this example we will set the “Name” field to 70 and the “Date” field to 30. The results are:
This same rule applies to all fields in the Layout Builder.
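The 100-percent rule can be expressed as a simple check: each field occupies a percentage of its line, and a line holds at most 100 in total. A sketch, using the Name/Date example from above:

```python
def lines_fit(layout):
    """layout: list of lines, each a list of (field_name, width_percent).
    A line is valid when its widths sum to at most 100."""
    return all(sum(width for _, width in line) <= 100 for line in layout)

print(lines_fit([[("Name", 70), ("Date", 30)]]))  # -> True  (70 + 30 = 100)
print(lines_fit([[("Name", 70), ("Date", 40)]]))  # -> False (110 > 100)
```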
Now that this rule has been explained, creating blank spaces will make more sense. As previously mentioned, in order to create a blank space you have to use a “Label” from the Form Elements.
For example, let’s say that you would like to create a blank space between these two fields.
Step one is to drag and drop a “Label” between these two fields. Once added, you can click on the “Label” field you just added, and on the left you will be presented with the properties of the field. Now, in the same way you would create or change the name of a field as shown previously, remove any name from the “Label” property, like so
The result from doing this will then be
There is now a gap between the two fields. This gap can be extended or shortened according to the 100 percent rule discussed earlier, and with these functions you can create any desired layout.
DocBits excels in adapting document layouts according to their geographical origins while standardizing elements like currency formats based on user browser settings. Let’s explore how you can leverage the Layout Builder to customize layouts for different origins, such as the U.S. and Germany.
Currency and Format Standardization: Regardless of the original document’s currency or format, DocBits converts these elements into a standardized ISO format on the server, in line with the user’s browser settings.
Geographical Layout Customization: The system allows customization of document layouts based on their geographical origin. This means you can define specific fields and formats for documents from different countries.
U.S. Layout: For a U.S. invoice, you might include fields for city tax, aligning with the common tax structure in the U.S.
Germany Layout: In contrast, a German invoice layout may omit the city tax field, as it’s not a standard charge in Germany.
Select Origin Layout: In the Layout Builder, choose the base layout corresponding to the document’s origin.
Customize Fields: Adapt the layout by adding or removing fields. For instance, include ‘City Tax’ for a U.S. layout.
Apply and Test: Once customized, apply the layout to your documents and test to ensure accuracy.
Understand Regional Differences: Familiarize yourself with the tax and format nuances of different regions.
Consistent Updates: Regularly update your layouts to reflect any changes in regional regulations.
User Feedback: Utilize feedback from users in different regions to refine layouts further.
This feature by DocBits gives you an alternative to model classification as it allows you to write searchable regular expressions for a document type for classification and other purposes.
Document Type: The Regex Manager allows you to write regular expressions, which are then searched for in the document; if a match is found for the regex defined for a document type, the document is classified accordingly. For example, if you wrote a regular expression to find “Gutschrift” and DocBits found this term in a document, it would classify that document as a credit note.
Document Origin: This lets DocBits determine the country of origin of a document through regular expressions. For example, if a regular expression for a Spanish document contains the term “Factura” and DocBits finds this term in a document, it knows the document is of Spanish origin and classifies it as such.
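The two classification modes can be sketched together. The patterns and mappings below are the examples from the text; real configurations live in the Regex Manager, not in code.

```python
import re

# Example patterns from the text above.
DOC_TYPE_PATTERNS = {"Credit Note": re.compile(r"Gutschrift")}
ORIGIN_PATTERNS = {"ES": re.compile(r"Factura")}

def classify(text):
    """Return (document type, origin) for the first matching patterns, or None."""
    doc_type = next((t for t, p in DOC_TYPE_PATTERNS.items() if p.search(text)), None)
    origin = next((o for o, p in ORIGIN_PATTERNS.items() if p.search(text)), None)
    return doc_type, origin

print(classify("Gutschrift Nr. 4711"))  # -> ('Credit Note', None)
print(classify("Factura 2024-001"))     # -> (None, 'ES')
```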
To find this feature in DocBits, from your Dashboard, navigate to Settings → Global Settings → Document Types. Within each of the created document types, there is a “Regex” option.
By clicking on “Regex” you will be taken to this menu, which displays the existing regex models that have been created as well as an “ADD” button for you to create new regex models.
A guide to importing master data from INFOR LN.
Functioning LN to DocBits dataflow
Correctly configured DocBits environment
In Infor, open the ION Desk application. In the left tab, go to Connect → Connection Points
This is where you will create the two connection points needed to import your data from LN that is required for Auto Accounting.
Click on “+ADD” to create a new connection point, select the API option like below
You will need to configure two separate API connection points, namely:
ChartOfAccounts
FinalFlexDimensions
The connection tab for your ChartOfAccounts connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
You will need to add two BODs in this section for this connection point. These being Sync.ChartOfAccounts and Sync.CodeDefinition, to add these BODs do the following:
Sync.ChartOfAccounts
Click on the PLUS (+) icon
Select “Send to API”
Search for the Sync.ChartOfAccounts BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Under Product, select the API endpoint you created in ION API for the environment you are working with. Search for the following API call, select it, and press OK.
Next, switch to the Request Body tab
Here is where you will configure the field mapping for this BOD, your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.ChartOfAccounts BOD. Click on the PLUS icon to add the next and final BOD.
Sync.CodeDefinition (TotalFlexDimensions)
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab
Here is where you will configure the field mapping for this BOD, your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the TotalFlexDimensions master data table.
The connection tab for your FinalFlexDimensions connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
You will need to add one BOD in this section for this connection point. This being the Sync.CodeDefinition, to add this BOD do the following:
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab
Here is where you will configure the field mapping for this BOD, your configuration should look like the following. The field mappings are available at https://docbits.com/doc/field-mappings/.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the FinalFlexDimensions master data table.
You will need to configure two separate data flows for Auto Accounting:
ChartOfAccounts
FinalFlexDimensions
An overview of this data flow looks as shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the LN company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following documents need to be added to the data flow:
Sync.ChartOfAccounts
Sync.CodeDefinition
This is where you add the ChartOfAccounts API connection point which you created earlier, the configuration for this should look similar to this
An overview of this data flow looks as shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the LN company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following document needs to be added to the data flow:
Sync.CodeDefinition
DocBits (FlexDimensions)
This is where you add the FinalFlexDimensions API connection point which you created earlier, the configuration for this should look similar to this
Once all the above is completed, you will need to navigate to Infor LN and trigger the BODs in order for the various master data you need for Auto Accounting to arrive in DocBits.
From the above menu, in the left menu tab, select Common → BOD-Messaging → Publish BODs → Publish Financial Master Data. From the following menu you will find the FlexDimensions and ChartOfAccounts BODs to publish.
Select the following BODs to publish by simply checking each box, no other changes need to be made as we want to publish all of these BODs so that the master data is complete in DocBits.
Once both of the above BODs are selected, navigate to the Options tab
Once on the Option menu, select the following options and select PROCESS to publish the BODs.
Once this is done you should see the three separate master data tables in your DocBits environment(s) under Master Data Lookup:
chartofaccounts
totalflexdimensions
finalflexdimensions
You will need to create the DocBits API connection point in order to create the data flow later.
First, in InforOS, navigate to ION Desk → Connect → Connection Points
Once here, you will need to create a new connection point.
Select API
Give the connection point a name and description that describes its nature and its environment. Under the Connection tab, import the service account you created for the environment you are working with.
Next, switch to the Documents tab. You will need to add the following BODs to the connection point.
Ack-SupplierInvoice
This BOD is used to signal in DocBits that an error has occurred within Infor. The configuration for these BODs should look similar to the following (the API Call Name changes for each)
Sync.PurchaseOrder
The configuration for this BOD should look similar to the following
Sync.ReceiveDelivery
The configuration for this BOD should look similar to the following
Once these BODs are configured, you can save the connection point by pressing the icon located to the right of the back button.
The data flow will look similar to the following
(The reason for multiple DocBits APIs is that each connection represents a different environment; depending on the number of environments you have, your data flow could differ slightly.)
For the purpose of this explanation we will use the example of having four separate environments.
The start of the data flow consists of your LN application
Here you will add an application and select the DocBits API(s) you created earlier
The configuration should look as follows
Once all the above is completed, you will need to navigate to Infor LN and trigger the BODs in order for the various master data you need for Suppliers and Purchase Orders to arrive in DocBits.
From the above menu, in the left menu tab, select Common → BOD-Messaging → Publish BODs → Publish Order Management Transactional Data
Select the PurchaseOrder tab and check the box.
From the LN homepage, in the left menu tab, navigate to Common → BOD-Messaging → Publish BODs → Publish Logistics Master Data
Select the PartyMaster tab and check the Supplier → Buy-from or SupplierPartyMaster box.
Once all the correct BODs have been checked for publication, select the Options tab.
The following options should be selected.
Once this is complete, press the PROCESS button and the BODs will be triggered. A message will appear on screen to notify you that the BODs have been triggered.
If done successfully, the Supplier and Purchase Order tables should now be available under Settings → Master Data Lookup.
Open ION Desk → Connect → Connection Points
You will need 4 connection points for this dataflow, 3 API connection points for the different tax code categories (full, reduced and free) and an Application connection point representing your LN company.
In order to create new connection points, select the “+ADD” button
Select “API” at the bottom of the list of options
You will be taken to the following page
This is where you will enter all the details of the TaxCode connection point. For each of the three connection points you will be creating do the following
Enter a Name: TaxCodeFull, TaxCodeReduced, TaxCodeFree
Description: This can be the same as the Name or similar
Import a service account you created.
Switch to the “Documents” tab and select the PLUS icon to add the BOD we need, like below
Search for the BOD
Search for the BOD called “Sync.LnTaxCode”, click on it and press “OK” to add the BOD.
Move on to the ION API section. Under API Call Name you can use the name of the BOD, Sync.LnTaxCode
Press the “SELECT” button
Select the API you configured for the environment you are working with and search for the following API. Once you have selected it, press “OK”.
Next, switch to the Request Body tab.
Here is where there will be a slight change for each connection point; this is seen in the field mappings you will assign to each tax code, as they differ slightly.
In the field_mappings row, under value, is where you will put the specific field mappings for the specific tax code connection point you are creating (full, reduced or free). These mappings are available at https://docbits.com/doc/field-mappings/.
The end result should look the same or similar to the image above. Once this is done, click the SAVE option located here.
Navigate to ION Desk → Connect → Data Flows
Click on “+ADD” and select document flow
Create the following data flow by dragging and dropping the components from the top menu
Here is where you will select your LN company, the final result should look similar to the following
This is where you will add the Sync.LnTaxCode BOD from earlier, the result looks as follows
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
This is where you select the API connection points you created earlier, this is done by selecting the API under the “Select ION API Connector” drop down menu.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
The Name and Description will depend on the environment you are using and your preferences.
Once all the above is done, SAVE and ACTIVATE the data flow by pressing the following buttons
Open LN in Infor
Navigate to Common → BOD Messaging → Publish BODs → Publish Financial Master Data
Select MORE and click on LnTaxCode
Tick the checkbox to select the LnTaxCode BODs
Navigate back to the OPTIONS tab, your configuration should look as follows
When you would like to publish the BODs, select PROCESS.
The end result should give you a similar table in your DocBits environment.
A guide to importing master data from INFOR M3.
Functioning M3 to DocBits dataflow
Correctly configured DocBits environment
In Infor, open the ION Desk application. In the left tab, go to Connect → Connection Points
This is where you will create the connection point needed to import your data from M3 that is required for Auto Accounting.
Click on “+ADD” to create a new connection point, select the API option like below
You will need to configure the API connection point called:
ChartOfAccounts
The connection tab for your ChartOfAccounts connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
You will need to add two BODs in this section for this connection point. These being Sync.ChartOfAccounts and Sync.CodeDefinition, to add these BODs do the following:
Click on the PLUS (+) icon
Select “Send to API”
Search for the Sync.ChartOfAccounts BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Under Product, select the API Endpoint that you created in ION API for the environment you are working with. Search for the following API call, select it and press OK.
Next, switch to the Request Body tab.
Here is where you will configure the field mapping for this BOD, your configuration should look like the following. The field mappings are available here.
Once you have completed the above steps, you will have successfully configured the Sync.ChartOfAccounts BOD. Click on the PLUS icon to add the next and final BOD.
The connection tab for your CodeDefinition connection point should look similar to what is illustrated below. Give the connection point an appropriate name and description, as well as import the Service Account you created.
Select “Send to API”
Search for the Sync.CodeDefinition BOD
Switch to the ION API tab, copy the API name and search for the API Call by pressing the SELECT button
Next, switch to the Request Body tab.
Here is where you will configure the field mapping for this BOD, your configuration should look like the following. The field mappings are available here.
Once you have completed the above steps, you will have successfully configured the Sync.CodeDefinition BOD for the M3FlexDimensions master data table.
You will need to configure the following data flow for Auto Accounting:
ChartOfAccounts
An overview of this data flow is shown below (the number of DocBits API connection points at the end depends on the number of different environments you are configuring).
The configuration for this connection point depends on the M3 company which contains the master data you wish to import into DocBits, yours should look similar to what is shown below.
The following documents need to be added to the data flow:
Sync.ChartOfAccounts
Sync.CodeDefinition
For the second route of the data flow (according to the routing in the data flow), we apply a filter with the following configuration.
This is where you add the ChartOfAccounts API connection point you created earlier. The configuration should look similar to this:
Once all the above is completed, you will need to navigate to Infor M3 and trigger the BODs in order for the various master data you need for Auto Accounting to arrive in DocBits.
Start by pressing Command + R to open the prompt menu, then type “evs006” and press Enter.
The following menu will be displayed to you
To add the various BODs you will need to enter the BOD nouns and Table names for each BOD individually.
The BODs you need to add include:
ChartOfAccounts
CodeDefinition
CodeDefinitionAccountingDimension
To add the next BOD, after entering the BOD Noun and Table Name, press the PLUS icon indicated below.
The BOD nouns and Table names are as follows.
ChartOfAccounts
BOD Noun: ChartOfAccounts
Table Name: FCHACC
CodeDefinition
BOD Noun: CodeDefinitionAccountingDimension
Table Name: FCHACC
After adding each BOD, right click on the BOD you added, select Related and then Run.
You will be taken to this screen.
Change BOD Verb to “sync” and press NEXT.
Once you press NEXT, you will get a notification indicating that the BOD publishing process has begun.
In order to import the M3CostingElement table into DocBits, you need to do the following.
From the M3 Homepage, press Command + R and search for the “PPS280” prompt.
Select any of the lines displayed to you. On the next menu, select TOOLS and “Export to Excel”
Select “Export all Rows” and then press EXPORT.
Once downloaded, you will need to alter the Excel file before converting it into a CSV file.
Open the Excel file; it will look similar to what is shown below.
From this sheet you only need the first two columns, so alter it until the end result looks as follows.
Once this is done, save the file as a CSV.
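If you prefer to script this cleanup, here is a minimal sketch using only the Python standard library. It assumes you have already saved the Excel export as a CSV (the file names are placeholders) and keeps only the first two columns of every row:

```python
import csv

def keep_first_two_columns(src_path: str, dest_path: str) -> None:
    """Copy a CSV file, keeping only the first two columns of each row."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dest_path, "w", newline="", encoding="utf-8") as dest:
        writer = csv.writer(dest)
        for row in csv.reader(src):
            writer.writerow(row[:2])

# Example (placeholder file names):
# keep_first_two_columns("pps280_export.csv", "costing_elements.csv")
```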
Once you have your CSV file, go to the following webpage. This depends on which environment you are using:
Prod: http://api.docbits.com/
Sandbox: http://sandbox.api.docbits.com/
Stage: http://stage.api.docbits.com/
Demo: http://demo.api.docbits.com/
Dev: http://dev.api.docbits.com/
Here you will manually upload the CostingElement table via an API. Click on the Authorise button.
Here you will need to insert the API Key from your DocBits environment. This is located in Settings under Integration.
Once complete, search for the API called master_data_lookup/import_data and fill in the required information, then click EXECUTE to trigger the API.
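If you would rather script this upload than use the Swagger page, the request can be built with the Python standard library. This is a sketch under assumptions: the form-field names (table_name, file) and the API-key header name are guesses, so check the Swagger documentation for the exact parameter names before relying on it.

```python
import urllib.request

API_BASE = "http://api.docbits.com"  # pick the base URL for your environment from the list above

def build_import_request(api_key: str, csv_bytes: bytes, table_name: str) -> urllib.request.Request:
    """Build (but do not send) a multipart POST for master_data_lookup/import_data."""
    boundary = "----docbits-upload"
    body = (
        f"--{boundary}\r\n"
        f"Content-Disposition: form-data; name=\"table_name\"\r\n\r\n{table_name}\r\n"
        f"--{boundary}\r\n"
        f"Content-Disposition: form-data; name=\"file\"; filename=\"costing_elements.csv\"\r\n"
        f"Content-Type: text/csv\r\n\r\n"
    ).encode("utf-8") + csv_bytes + f"\r\n--{boundary}--\r\n".encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/master_data_lookup/import_data",
        data=body,
        headers={
            "X-API-KEY": api_key,  # assumed header name; verify in the Swagger page
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
        method="POST",
    )

# To actually send it:
# req = build_import_request("<your key>", open("costing_elements.csv", "rb").read(), "M3CostingElement")
# urllib.request.urlopen(req)
```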
If done correctly, the M3CostingElement table should now be in your DocBits environment. Auto Accounting for M3 has now been configured for your environment.
The export module is located on the Dashboard under Settings → Document Processing → Export.
To add a new export configuration, select “+ New”
Select the method you would like to use for your export configuration.
Once you have selected the method you would like to use, you will need to upload the various information and files required for that method of exporting.
Once you have one or many export configurations in your DocBits, you have the option to activate or deactivate configurations depending on your needs.
The configuration below is activated, indicated by the green dot to the left of the configuration name.
To deactivate the export configuration, select the options button to the right of the configuration as shown below.
You are given three options:
Deactivate: The export configuration will no longer be functional (indicated by a red dot next to the configuration name).
Edit: Make changes to the details of the configuration.
Delete: Delete the configuration.
A guide to exporting documents in DocBits.
You will need to create the DocBits API connection point in order to create the data flow later.
In InforOS, navigate to ION Desk → Connect → Connection Points
Once here, you will need to create a new connection point.
Select API
Give the connection point a name and description that describes its nature and its environment. Under the Connection tab, import the service account you created for the environment you are working with.
Next, switch to the Documents tab. You will need to add the following BODs to the connection point; not all of them are necessary for the supplier and purchase order master data, but they will be useful when other features such as Auto Accounting need to be implemented.
For now we will only focus on the necessary BODs, these being: Sync.RemitToPartyMasterData, Sync.SupplierPartyMaster and Sync.PurchaseOrder.
Sync.RemitToPartyMasterData and Sync.SupplierPartyMaster
The configuration for these two BODs should look similar to the following (API Call Name changing for each)
Sync.PurchaseOrder
The configuration for this BOD should look similar to the following
Once these BODs are configured, you can save the connection point by pressing the icon located to the right of the back button.
The data flow will look similar to the following
(There are multiple DocBits APIs because each connection represents a different environment; depending on the number of environments you have, your data flow could differ slightly.)
For the purpose of this explanation we will use the example of having four separate environments.
The start of the data flow consists of your M3 application
Configuration of the filter looks as follows
(The accounting entity ID of course being unique to your organization)
Here you will add an application and select the DocBits API(s) you created earlier
The configuration should look as follows
Navigate to the Infor M3 application
Once at the main menu, press Command + R to open the command prompt search box, then type evs006 and search.
Once on this page, you will need to add the SupplierPartyMaster, RemitToPartyMaster and PurchaseOrder to the list.
BOD noun: SupplierPartyMaster
Table: CIDMAS
BOD noun: RemitToPartyMaster
Table: CIDMAS
BOD noun: PurchaseOrder
Table: MPHEAD
For each one, you will need to press the plus icon to add it to the list.
After you have added each of the BODs, right click on the BOD noun of the BOD and select Related → Run
You will be taken to the following menu, where you will need to change BOD verb to Sync and then press NEXT to trigger the BODs.
Once you trigger the BODs, you will get a notification confirming this.
If done successfully, the Supplier and Purchase Order tables should now be available under Settings → Master Data Lookup.
Download a BOD Mapping File and open it in your applicable file editor of choice to edit it. For this walkthrough, VSCode is used.
Change the company to the correct one (SFV_AccountingEntityID) and edit location ID if needed.
Check the document code by going to the field settings of the document type you are trying to export (it can be found in the URL of the document type's field settings in DocBits, as shown below).
Lastly, edit the SFV_LogicalID. It can be found in INFOR ION DESK → Connect → Connection Points: select the DocBits_Export (or similar) connection point, and within that page you will find the Logical ID you need.
If this Connection Point does not yet exist, you need to create one.
First, go to ION Desk → Connect → Connection Points and click on the “+ Add” button.
Then select the “IMS via API Gateway” option
You will be taken to the above screen, where you must fill in the necessary information. The name should be something like “DocBits_export” or similar.
For “ION API Client ID” you enter the same Client ID you obtained earlier for the ION Mapping File.
Then select the Document tab of the Connection Point creation menu and add the following documents by pressing the “+” sign; these will only become useful later.
Once you save this Connection Point you will obtain the Logical ID as shown below
Then insert this Logical ID into the appropriate section of the BOD Mapping File and save the file.
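For orientation, the edited values in the BOD Mapping File typically end up looking something like the fragment below. This is an illustrative sketch only: the key names are taken from this guide, the values are placeholders, and the surrounding structure of your file must be kept exactly as-is.

```ini
SFV_AccountingEntityID = 100
SFV_LogicalID = lid://infor.docbits_export
```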
Drag and drop the file into your export configuration in DocBits. This is available at Settings → Export.
Once obtained, open the file in your applicable file editor of choice. For this walkthrough, VSCode will be used.
Check the document type code is as it is in DocBits (like with the BOD Mapping File it should match the name of the doc type in the URL of the field settings) and also check the name of the document type as it should be in Document Manager (IDM) in Infor.
FYI: The name of the document type in IDM is shown as M3_SupplierInvoice because this example comes from an M3 instance. It can differ depending on whether you use LN or M3, as well as on your specific IDM configuration.
Check the company ID, and check Entity ID (SF_MDS_EntityType) this value should be the same as it was in the BOD Mapping File.
Ensure that IndexFieldFromDocBits=IDMAttributeID (check that the DocBits field on the left in the field settings matches the IDM attribute on the right in Document Type → Attributes).
Go to Document Manager and select the name of the current document type you are trying to export, for example, Supplier Invoice.
Click the above icon and then click Administration → Document Type and then find the document type you need in the list
As shown below, you will then see the doc type name as it is in INFOR
Make sure this is how the name is shown in the IDM Mapping File
Once the file is prepared, upload it to your export configuration in DocBits. This is available at Settings → Export.
Once at the home screen, click on the burger menu and select ION API
After opening ION API, click on Available APIs in the left menu
Click on “+ADD” block
Then “+ Create New”
The information you insert should look like this
FYI: The description has multiple environments as this will be used for multiple environments and the icon and its color always remain the same.
Next, select the + at the bottom of the screen
This Target Endpoint URL can be found at doc2api.cloudintegration.eu
The information underneath this field should look as follows.
Once you have filled in this information, to the right of these fields there is a “Target Endpoint Security” field with a drop down. Select API Key from this drop down.
A table will then appear underneath this drop down; fill in the following information. The key value is specific to the customer and environment and can be found within DocBits.
From the Dashboard → Settings → Integration → API Key
Copy and paste this into the Key Value field in InforOS
Once this is complete, press the following icon to save the configuration
You are not yet completely done with the configuration.
Go back into the API you just configured and enter the details like below
Go to the Documentation tab at the bottom and click on the +
Enter the following details:
Name = DocBits-”environment”
Type = Swagger
URL = go to doc2api.cloudintegration.eu; once on this page, open the following link
Copy the URL and use it for the URL field in InforOS
Save it once you have entered the information for all the fields. There should be a loading icon for a while but the end result should look like this
The same process would be used to create the endpoints for other environments.
FYI: If in the future you are struggling to find these endpoints, in ION API go to API Metadata and click on this icon to refresh the API metadata.
From the DocBits Dashboard of the required customer, go to Settings → Export. To add a new document type for export, do as follows:
Click on the “+ New” button
Select “Infor IDM + ION BOD”
You will then be taken to this menu where you need to give the new exportable document type a Title, select the document type from the dropdown and add all the necessary Mapping Files (ION, IDM and BOD).
This is created in INFOR ION API → Authorized Apps and an app like below should be shown
If not, then you need to create a new Authorized App. This can be done by clicking the plus sign.
Once you have entered the Name (DocBits_*Environment*) and Type (case specific) of the new Authorized App, you will be taken to a page where the Client ID and Secret have been generated automatically.
The information you fill in should be similar to what is shown above; it is important to enable the “Issue Refresh Tokens” slider at the bottom of the page.
Click “Download Credentials” to download the ION Mapping File.
Once you have downloaded the ION API file from Infor, you can upload it by going to Settings → Document Processing → Export like below
The M3 export mapping file is divided into five sections, each of which is further divided into two subsections:
Header
Header Static Fields
Header Fields
Tax Lines
Tax Line Static Fields
Tax Line Fields
Receipt Lines
Receipt Line Static Fields
Receipt Line Fields
Order Charge Lines (Additional Amounts)
Order Charge Static Fields
Order Charge Fields
Cost Lines
Cost Line Static Fields
Cost Line Fields
Adding a New Field:
First, add the M3 API field name to the relevant section's fields list property (e.g. StaticFields, HeaderFields, InvoiceTaxFields).
Then define the static value or document field name for the API field, using the appropriate prefix for the section.
Example 1: To define a static value of AAA for the M3 API field DIVI, first add DIVI to the StaticFields property, then add the line SF_DIVI = AAA, since SF_ is the prefix for static fields.
Example 2: To map the header field IVDT (invoice date) to the invoice_date field of DocBits, first add IVDT to the HeaderFields property, then add the line HF_IVDT = invoice_date, since HF_ is the prefix for header fields.
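Putting the two examples above together, the affected portion of the mapping file would look something like the following sketch (any other keys already in your file stay unchanged):

```ini
StaticFields = DIVI
HeaderFields = IVDT
SF_DIVI = AAA
HF_IVDT = invoice_date
```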
Removing a Field:
Simply remove the field from the section's fields list property and delete the line defining the value for that field.
Available M3 fields can be checked by opening the appropriate screen in M3.
Similarly, you can get the field names for lines.
Fields List Property: StaticFields
Section Fields Prefix: SF_
Available Fields: You can map any M3 api field with any static value
Fields List Property: HeaderFields
Section Fields Prefix: HF_
Available Fields: You can map any DocBits field to any M3 api field
Fields List Property: InvoiceTaxStaticFields
Section Fields Prefix: IT_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceTaxFields
M3 Fields Prefix: ITF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: line_number, tax_amount, tax_rate, net_amount, gross_amount, tax_code_full, tax_code, tax_country
Fields List Property: InvoiceReceiptStaticFields
Section Fields Prefix: IR_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceReceiptFields
M3 Fields Prefix: IRF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: packing_slip, purchase_order, line_number, line_sequence, delivery_number, delivery_line, amount, quantity, total_net_amount
Fields List Property: OrderChargeStaticFields
Section Fields Prefix: OC_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: OrderChargeFields
M3 Fields Prefix: OCF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: ledger_account, dimension_2-7, amount, quantity, quantity2, position
Fields List Property: InvoiceCostStaticFields
Section Fields Prefix: IC_SF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: You can put any value as they are static fields
Fields List Property: InvoiceCostFields
M3 Fields Prefix: ICF_
DocBits Table Field Prefix: TF_
Available M3 Fields: Please check M3 API or UI
Available DocBits Fields: ledger_account, dimension_1-12, amount, quantity, quantity2, position
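As an illustration of the line-level prefixes, a hypothetical tax-line mapping might look like the sketch below. The M3 field names (TTXAM, VTCD) are assumptions for illustration only; check the M3 API or UI for the real ones, and verify the exact prefix syntax against a working mapping file.

```ini
InvoiceTaxFields = TTXAM, VTCD
ITF_TTXAM = TF_tax_amount
ITF_VTCD = TF_tax_code
```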
If you only require your document to be exported to IDM (Document Management) in INFOR, the configuration is similar to the export to IDM + LN/M3, but it does not require a BOD Mapping File, as there is no export to LN/M3.
Select the following option for exporting.
You will be required to upload an ION API file as well as an IDM Mapping file.
How to obtain these has been discussed earlier in this documentation. ION API file and IDM Mapping file.
You have created:
An ION API Endpoint
An ION API File
A BOD Mapping File
An IDM Mapping File
Before you set up the data flow, you need to import the mapping files into InforOS
In ION Desk → Connect and open Mappings
Click on the Import icon
From here you need to select the various mapping files you will need which include: SyncCaptDoc_SyncSuppInv, SyncSupplierInvoice_LoadSupplierInvoice, and LoadSupplierInvoice_ProcessSupplierInvoice.
Once you have imported all the mapping files, make sure to approve each of them by clicking the tick icon within each of their squares on the Mapping dashboard.
The next step is to set up the Data Flow in ION Desk. Navigate to the ION Desk application and select Data Flow → + ADD → Document Flow as shown below.
You will then see this page; this is where you will build the flow of information from DocBits to M3.
An LN data flow will look similar to what is shown below (there are multiple paths due to each individual path being meant for a specific document type, for this explanation we will focus on the invoice data flow).
All parts of the chain are dragged and dropped from the top section
In the chain, DocBits and LN are both Applications. Between them are mappings that convert the data into a form that can be understood by the next section of the data flow and “map” the information so that it goes where it is needed or meant to.
Give it an appropriate name such as “DocBits”, then select the plus sign, search for the connection point you created earlier (such as DocBits_Export or similar) and click on it.
To create this connection point, go to ION Desk → Connect → Connection Points
Click “+ Add”
Select “IMS via API Gateway” and fill in the following information
The ION API Client ID is in the ION API File you created at How to Create an ION API File under the “ci” value.
Switch to the document tab, and add the Sync.CaptureDocument BOD to the DocBits connection point like below.
Then save the connection point by pressing the disk icon in the upper-left corner.
Navigate back to the Dataflow section of ION Desk to access your dataflow. Your DocBits application should look similar to what is shown below.
The first mapping node should look as follows
The second mapping node should look as follows
There should already be an LN (or similarly named) connection point for the appropriate LN company created in INFOR. Just like the DocBits Application, you select it by clicking the “+” sign; it should look as follows.
The following configurations should look as follows:
The last icon should be empty as it is not carrying any document or information.
Once you have added all necessary nodes to the data flow, press this button to activate the data flow