Dec 29, 2021

Edit SAP Finance Accounting period (OB52) in DEV and QA

Many times over the years, during my numerous SAP implementation projects, I have faced the same constraint: in non-productive SAP environments like DEV or QA, the Accounting Period (transaction OB52) cannot be edited directly; those records have to be transported.

Similar to some other tables like currency rates (TCURR), the Accounting Period is part of a group of tables denominated "current settings". This means that even though these tables are transportable, once you are in Production their content is allowed to be maintained directly, without the need for a new configuration transport all the way from DEV.

Of course, anything else would not make sense. Tables like Accounting Period, Exchange Rates, Costing Sheet rates and many more are key and central to Financial operations. How would you be able to operate if, every time you needed to change these tables, you needed a new configuration transport? It would be insane !

For this reason, tables that are part of "current settings" can be updated in Production directly.

But now ... during an SAP implementation project, we Consultants and the people doing testing in DEV and QA also need to maintain and change values in these tables. You will need new exchange rates, and to close and open your Accounting Periods, among many other things. But if you try to do this in an out-of-the-box DEV or QA environment, you will not be able to, because you cannot do configuration directly in those environments. You will need to do it via a transport request.

But that does not make sense either. Fortunately, with one small change in the technical configuration of the table, you can allow direct changes in DEV and QA without the need for a transport.

** Attention: this is a table change and it will require a Workbench transport. If you are not sure what you are doing, please have it done by an ABAP developer or Basis consultant who understands the implications.

Access Tcode: SE54 

Table/View: V_T001B_COFIB (for the OB52 Accounting Period table. This is your table if you are on S/4 HANA with New G/L active).


Select "Generated Objects" then "Create / Change"


Once in this screen change from "Standard recording routine" to "no, or user, recording routine". Then save it and assign a Transport number.


Source for this Blog post was: OSS Note "2269677 - OB52 TK430 Client XXX has status 'not modifiable' in test system"


If your Company and/or Project needs to implement this, or any of the functionalities described in my blog, or needs advice about them, do not hesitate to reach out to me and I will be happy to provide my services.

Dec 5, 2021

New S/4 HANA 2020 Bank Sub-Ledger functionality config for Electronic Bank Statement

In S/4 HANA 2020, SAP has released a new functionality called "Bank Sub-Ledger" that allows companies with a large number of bank accounts to reduce the number of G/L accounts in their Chart of Accounts significantly and manage them with just a few G/L accounts.

Traditionally, in SAP we would have one main bank G/L account representing each House Bank and bank account number (Account ID) combination that you have open with a given bank, plus a series of outgoing and incoming bank clearing accounts for this combination. These accounts represent, for example, issued checks not yet cashed by a vendor, or payments that you have received but not yet deposited or cleared with the bank. Depending on your business requirements, the different payment methods you have and other situations, you could end up with 4, 5 or 6 of these bank clearing accounts. We also call them bank sub-accounts. Finally, if your account is held in a foreign currency, you will also need a bank adjustment G/L account to post your foreign currency valuation at month end.

It will look like this:

105100 Bank Main account (House Bank)
    105101 Bank outgoing checks
    105102 Bank Incoming payments
    105103 Bank Lockbox clearing
    105104 Bank outgoing electronic payments
    105109 Bank adjustment account


In the end, you will have something similar to this for each bank account that you have in a given Company Code. You might not have Lockbox or checks in some regions, but you will need several of these accounts to have a proper design and flexibility.

Now ... if your Company (Company Code) has many bank accounts scattered among different banks, you will end up with a large number of G/L accounts within your Chart of Accounts dedicated only to your banks: total number of bank accounts × 6. If you have 10 bank accounts, you will end up with 60 G/L accounts dedicated only to this.

With the new S/4 HANA 2020 Bank Sub-Ledger functionality, you can reduce this from 60 G/L accounts to 6 G/L accounts.

Several people have already written blogs about how to configure and use this functionality on SCN and other sites. But ... nobody has posted anything yet on how to configure the Electronic Bank Statement (EBS) for this Bank Sub-Ledger functionality.

The traditional way of configuring the EBS, assigning G/L accounts to account symbols with the masking approach, WILL NOT WORK ! It will give you an error when you try to process an EBS file, since the system is not able to do the account determination correctly. Your EBS posting rules will just fail.

This was the way to configure the account assignment for your account symbols in the past, when we did not have the Bank Sub-Ledger.


As I mentioned, if you do it like this, the system will not find the G/L accounts, the account determination will fail and you will not be able to make any postings.

Now, with the new Bank Sub-Ledger functionality, instead of the traditional masking setup, you need to enter the full G/L account number rather than masking the beginning of the string.

Configuration for EBS with Bank Sub-Ledger


Once you set it up like this, the system will start doing the Account determination properly once again.

A few notes:

#1 - There is a possibility to migrate from the old setup to the new Bank Sub-Ledger. If you are migrating to the Bank Sub-Ledger, you will also need to adjust your EBS configuration, because the migration program will most likely not migrate and adapt this config; it just changes the G/L master data setup and other transactional tables.

#2 - SAP's OSS Note 3007105 states some restrictions and exceptions for this functionality. It cannot be used with Bill of Exchange banks/accounts, and it does not work with the foreign currency valuation process (FAGL_FCV).

This last statement is not 100% accurate. FAGL_FCV is still able to take the Bank Sub-Ledger open item accounts and the main bank account balances into consideration to calculate and valuate them. What it does not give you is a detailed posting specifying which House Bank and Account ID was valuated, but it will still valuate those amounts properly. So one of the main stated restrictions might not be that important after all.

I reached out to SAP with an OSS support message asking if S/4 HANA 2021 had remediated this situation and FAGL_FCV was now working properly. They told me that they have not modified this program and will not modify it in future versions. Instead, they are planning to release an "Advanced Valuation" program that will handle it properly, but there is no timeline for that release yet.


If your Company and/or Project needs to implement this, or any of the functionalities described in my blog, or needs advice about them, do not hesitate to reach out to me and I will be happy to provide my services.

Nov 12, 2021

How to manage Company Code registration numbers that belong to a country different from your Company Code's country ?


When you create a Company Code in SAP for a specific country (e.g. US), the Company Code will allow you to enter certain additional parameters that are typical for that country (e.g. TIN - Tax Identification Number, and some other registration numbers that are typically required for companies doing business in the US).

For example, for Canada you will need to enter the GST registration number, and PST or QST numbers as well, since your Company Code will be in country CA.

This is an important piece of information: in many countries there are legal requirements to show your registration number when you print an invoice or send information to banks, tax agencies or other regulatory entities. So it becomes a MUST HAVE to have it available in the system, ready to be used by any of these processes.

But now, what happens if you have a US Company Code that is registered in Canada and you need to enter its GST number ?

In standard SAP configuration, you will not be able to do it within the Company Code Global Settings (or Maintain Additional Parameters as of S/4 HANA 1709); those GST number fields will not be available, since your company is in country US and they are not required in the US.


How do you enter this in your US Company Code ?

There are 2 options

The configuration that allows you to enter these parameters is based on "Parameter Types" that are preconfigured for certain countries. In the case of the GST number, it is Parameter Type F29T01. You can look up your specific Parameter Type in table T001I.


Option # 1

 

SM30 on Table V_T001I

 

Remove country CA from the record with "Parameter Type" = F29T01.

If you do that, type F29T01 becomes available for any Company Code regardless of country. So all of a sudden, all Company Codes, belonging to any country in the world, will be able to enter a GST number. (For me this is not the preferred approach.)



Option # 2

SM30 on Table  V_T001S4H

Enter a new entry for the F29T01 record and assign it to country US. If you look for F29T01, you will see it is already assigned to country CA as well.



This will allow you to enter the GST number for all US Company Codes. For me, this is the preferred approach, as it only affects country US and not all countries.


Be aware that this could be overwritten by an SAP upgrade, so it might need to be redone later.


Once you are done with the configuration, you will be able to enter the foreign registration numbers in your Company Code within Company Code Global Parameters or Additional Parameters.




If your Company and/or Project needs to implement this, or any of the functionalities described in my blog, or needs advice about them, do not hesitate to reach out to me and I will be happy to provide my services.

Aug 5, 2021

How to send the different Finance outputs via email ?

A typical business requirement nowadays is to send certain Finance outputs via email automatically from SAP as soon as they are generated. In S/4 HANA on-premise (and ECC as well), to achieve this you need to deal with a little bit of custom code that is added and managed in BTEs (Business Transaction Events). These are similar to any other user exit.

As you might already know, there are different automatic processes in SAP that trigger outputs; when these are created, the BTEs will be called, and the generated outputs will be sent via email by SAP automatically if you put the appropriate code in them.

In this post I will be talking about 4 different important and key Finance outputs:

    • Vendor Payment Advice
    • Customer Statement
    • Dunning Notice
    • Interest 

In this post I will NOT explain how to execute the programs that generate them, or what options and values to enter on their screens. As a Consultant you should already know how to run these "user" steps, and there is plenty of content about that on the Internet. This blog post is about the technical side of what you need to do to send these outputs via email.

Each one of these outputs has at least 1 (some even 2) Function Modules that need to be coded to control whether the output goes to the Spool, Email / Internet or Fax (nobody uses a fax anymore). Other things you can control are the sender email, recipient email, email subject and email body text. All of these can be managed as part of your custom code.

These FMs are assigned in Tcode FIBF and are based on a Transaction Event number that corresponds to each of these different outputs. You cannot reuse the same code for all of them, but the idea and logic can be replicated across them.



First, you need to "register" your Product for the outputs you will be dealing with. In my case: ZDUN, ZPADV_EM, ZPADV_SJ. You can put any Product name / code that you want, but you need to flag the "Active" field as well; otherwise it will not be called.




Then, once you have the Function Modules with your custom code ready, you need to enter the FMs.



You go into Process Modules -> of a customer to assign them.


You make entries in this table to assign the FMs you have coded to manage the different options (sender / recipient email, subject and body text). As you can see, the Process numbers represent the individual outputs.


Payment Advice

Payment Advice has 2 processes 2040 and 2050. 

Process 2040, allows you to manage Sender email address, Recipient email address, Mail Body Text and the type of Transmission Method 

  • c_finaa-nacha = '1'  - Spool
  • c_finaa-nacha = 'I'  - Internet / Email

For this, SAP gives you a "sample" FM that you can copy as a base and insert your own custom code into: SAMPLE_PROCESS_00002040.

Process 2050 allows you to manage the email subject.

For this, SAP gives you a "sample" FM that you can copy as a base and insert your own custom code into: SAMPLE_PROCESS_00002050.
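To make this concrete, here is a minimal sketch of what a Z copy of SAMPLE_PROCESS_00002040 can look like. This is an illustrative, hedged example, not production code: FINAA-NACHA and FINAA-INTAD are the standard fields for transmission method and recipient address, but the way the payee's email is determined here (reading the vendor master address via LFA1 and ADR6) is an assumption you must adapt to your own design.

```abap
FUNCTION z_process_00002040.
*"  IMPORTING
*"     VALUE(I_REGUH) LIKE REGUH STRUCTURE REGUH
*"  CHANGING
*"     VALUE(C_FINAA) LIKE FINAA STRUCTURE FINAA

* Default: send the payment advice to the spool
  c_finaa-nacha = '1'.

* Assumption: read the payee's email from the vendor master address.
* Adapt this lookup (or read it from the Business Partner) as needed.
  DATA: lv_adrnr TYPE lfa1-adrnr,
        lv_smtp  TYPE adr6-smtp_addr.

  SELECT SINGLE adrnr FROM lfa1 INTO lv_adrnr
         WHERE lifnr = i_reguh-lifnr.
  IF sy-subrc = 0 AND lv_adrnr IS NOT INITIAL.
    SELECT SINGLE smtp_addr FROM adr6 INTO lv_smtp
           WHERE addrnumber = lv_adrnr.
  ENDIF.

  IF lv_smtp IS NOT INITIAL.
    c_finaa-nacha = 'I'.       " I = Internet / email
    c_finaa-intad = lv_smtp.   " recipient email address
  ENDIF.

ENDFUNCTION.
```

The same pattern applies to the other processes: copy the sample FM, keep its interface, and adjust the fields it exposes.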

Customer Statement

Process 2310. This one allows you to manage Transmission method, Format (PDF), Mail Body, Mail Subject, Sender and Recipient email address.

For this, SAP gives you a "sample" FM that you can copy as a base and insert your own custom code into: SAMPLE_PROCESS_00002310.

Dunning Letter

Process 1040. This one allows you to manage Transmission method, Format (PDF), Mail Body, Mail Subject, Sender and Recipient email address.

For this, SAP gives you a "sample" FM that you can copy as a base and insert your own custom code into: SAMPLE_PROCESS_00001040.

Interest Output

For the interest calculation done from Tcode FINT, I will refer you to the blog post below, which explains the BAdI FI_INT_CUS01 that needs to be implemented to manage all the same things we did for the other outputs.

By the way, that is an excellent blog for ABAP resources.


The specific ABAP code of each of these Function Modules needs to be written based on your particular case. I am not providing the code in this post; maybe that will be the subject of a subsequent post.


** Note: This process applies to S/4 HANA on-premise (and ECC) only. For S/4 HANA Cloud the solution is completely different, based on the new Output Management solution that works with BRF+ technology. I am not aware that this has changed in the S/4 HANA 2020 version either. If you know, please let me know ...


Some Reference OSS Notes:

◈ Payment Advice: OSS 1033893, 836169
◈ Dunning: OSS 1042992
◈ Customer correspondence (Ex. Statement): OSS 1360070
◈ Interest : OSS 956981


If your Company and/or Project needs to implement this, or any of the functionalities described in my blog, or needs advice about them, do not hesitate to reach out to me and I will be happy to provide my services.

May 3, 2021

Send any Material price from one Plant to another without any custom development

A typical requirement from multiple clients I have had in the past was to be able to send, copy or propagate material prices (costs) from one Plant to another. In certain manufacturing scenarios that is not a problem, because one Plant might manufacture a product up to a certain stage and the other Plant picks it up and continues manufacturing the material. In that case, the 2nd Plant has a BOM (Bill of Material) that includes the supplying Plant, so when you do the cost calculation everything flows automatically.

But ... 

  • What happens when you reach the final Production Plant and then you send it to other Plants that are just Warehouses or Distribution Centers ? (with no transformation here)
  • How do you update the prices in those DCs when your standard costs change in the manufacturing Plants, so you can revalue your inventory ?
  • Is there a way to do it for hundreds or thousands of Materials and multiple Plants without having to develop a mass upload program ? (There is a new Fiori App to upload Material prices from XLS, maybe I can talk about it on another post. But it is not the same as this solution and requires manual work and one XLS per Currency Type that you might have).
  • How can you avoid doing manual MR21 price changes and tons of copy & paste in that screen ?
  • How can you avoid doing a custom program development to do this ?
  • How can you avoid doing an LSMW out of MR21 to automate this ?

Well ... there is a way, which I found a few months ago, that works like a charm and does not require any custom development !!! Zero !!!

The Solution

Transaction CKML_PRICES_SEND can do this for you, with some extra tweaks ... !
The first reaction of someone who knows FICO would be: "well, it starts with CKML, it is a Material Ledger Tcode, I don't have Material Ledger or I don't want to deal with Material Ledger ...". Even though it starts with CKML, it is not Material Ledger exclusive; it can be used without Material Ledger being active too.

Note: As per OSS Note 646630 - Send Material Prices and BAPI_MATVAL_PRICE_CHANGE usage, this solution has been around since R/3 4.7 and is part of the Enterprise Extension EA-FIN. So you can use it in ECC and any S/4 HANA version as well. I am using it in S/4 HANA 1909.

This transaction is designed to be used with distributed systems, where you need to send material prices to other SAP systems; but with a few 100% standard tricks, and no code modification, you can use it within the same system.

This transaction triggers an outbound IDoc (message type MATERIALVALUATION_PRICECHANGE) that contains all the necessary info for the price changes. The IDoc will go out from your system and come right back in as an inbound IDoc. Once successfully posted, your target Plant will have its price updated. It triggers one IDoc per material, and you have to execute it individually per target Plant. You can do it for all materials, or filter by specific materials, valuation types, stock type, material type and many other options.


How ?

Step #1 - Do config in Table VCKML_PRICE_SEND (SM30)

The target Plant needs to have an entry in this table, so you add it here manually and save it in a transport. This is the only step that needs to be done in your Golden configuration client. All the other steps need to be done individually in all the transactional environments (Unit Testing, QA, pre-PROD and PROD).


Step # 2 - Define a Logical System

Use Tcode SALE, or BD54 directly, to create a "fake" Logical System. (Depending on your environment, you might be able to do this yourself as an SAP Finance Consultant, or you may have to ask your Basis team to do it for you.)

This fake Logical System will be used later in the Distribution Model. We need a "fake" one because, within the model, you cannot use the same Logical System as both source and target; that is why we have to fake it. If you understand what I am talking about, you might already be guessing how this all ends ... If not, keep on reading ...

In my case my real Logical System is SX4CLNT050, so I created a fake Logical System called SX4CLNTX50.


Step # 3 - Create an ALE Distribution Model

As I mentioned, this needs to be done on each environment as it is specific to the Logical System names that you will have on each environment.

Tcode SALE


Create the Distribution Model


Add a BAPI 

BAPI MaterialValuation
Method PriceChange

End result will be like this, where the Receiver server will be your dummy / fake Logical System.


Step # 4 - Create a Port that calls the RFC of your own Client

Tcode: WE21. Before creating the Partner Profiles, you need to define a Port that will be used by the outbound IDoc. In this case, the Port needs to point to an RFC destination for the very client you are in; in my case SX4CLNT050. By doing this, you trick the system into sending the IDoc to a Port that is within the same system (it goes out, only to come back in). This is the KEY part of the whole solution.


Step # 5 - Maintain Outbound Partner Profile for Logical System

Tcode: WE20. Here the partner system (type LS) will be the same fake Logical System "SX4CLNTX50" I entered in my Distribution Model, as receiver, with the Port I just created that points to an RFC of the same client.




Step # 6 - Create Inbound Partner Profile
Now we will create the Inbound Partner Profile for the real Logical System (SX4CLNT050).


Process Code: it could be BAPI or BAPP.

Make sure this is ACTIVE, because it could become INACTIVE in some cases.



Once you have completed the setup and its different steps, you are ready to send your material prices with CKML_PRICES_SEND.


An outbound IDoc will be created, followed by an inbound IDoc. The inbound one will end up posting the PRICE CHANGE document.
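If you want to verify the round trip, you can look at the IDoc control records in table EDIDC (WE02/WE05 show the same via the GUI). Below is a small illustrative snippet, assuming the standard IDoc status codes (03 = outbound data passed to port, 64 = inbound waiting to be processed, 53 = application document posted):

```abap
* Quick check: list today's price-change IDocs with direction and status.
DATA lt_idocs TYPE STANDARD TABLE OF edidc.

SELECT * FROM edidc INTO TABLE lt_idocs
       WHERE mestyp = 'MATERIALVALUATION_PRICECHANGE'
         AND credat = sy-datum.

LOOP AT lt_idocs INTO DATA(ls_idoc).
  " DIRECT: 1 = outbound, 2 = inbound
  WRITE: / ls_idoc-docnum, ls_idoc-direct, ls_idoc-status.
ENDLOOP.
```

For a successful run you should see a pair of IDocs per material: one outbound and one inbound that ends in status 53.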


Note:

  • This process works whether the sending Plant material has price control S - Standard or V - Moving average, and whether the target Plant material has S or V; it will still update it. Of course, for a target with price control S that is a costed material with a cost component split, this is not a substitute for costing; in that case you still need to execute a cost estimate calculation.
  • Each MATVAL_PRICES IDoc segment represents one of the different material valuations / currency types active in the system (e.g. in my case 10, 30, 31).
  • This process only works between Company Codes that have the same currencies. You cannot transfer a price from a Company Code where currency type 10 = CAD to one where currency type 10 = USD. It will not work; you will get a currency error, because both the outbound and the inbound IDoc carry CAD while the target expects USD, and there is no transformation in between. The only way around this would be to pass the IDoc through a middleware and recalculate with the new currency, if needed.
  • For an unknown reason, my inbound IDocs are not getting processed automatically; I have to trigger their posting manually based on their status. You can always schedule the standard program that posts pending inbound IDocs (RBDAPP01) periodically to remediate this.

Once the document has been posted, you can verify it by displaying the price change document (CKMPCD), the Material Master Accounting view, or the FI document.

If your Company and/or Project needs to implement this, or any of the functionalities described in my blog, or needs advice about them, do not hesitate to reach out to me and I will be happy to provide my services.

Feb 12, 2021

SAP Finance conversion objects, all about them

This post is the continuation of my previous post "What is the Data Migration process about ? (Data Migration 101)", where I explained the Data Migration process.

In this post, I will talk about the SAP Finance migration objects, which are my key area of expertise. I will describe the specifics of each object, its strategy, its dependencies and how they all fit together in this big puzzle.

I will describe them in the sequence in which they need to be created. Some can be created in parallel, and some need to be created after others.

The first prerequisite to start doing Data Migration is that your whole configuration has been transported into Production, or into whatever target environment you are migrating to.

Data should be extracted from the legacy system as of the last day of the closing period prior to the go-live date. If the go-live is March 1st, then data should be extracted once the books are closed on February 28th (or 29th in a leap year).

From a Finance perspective, the ideal time of year for a go-live is the 1st day of the new fiscal year, as it simplifies several of these conversion objects, especially G/L Balances and Fixed Assets with their values.


Bank Master (Bank Keys)

This is the whole list of bank numbers, with their names and, in some cases, their addresses too. It is needed in SAP to be able to perform electronic payments (or direct debits) and to register vendor / customer bank account details. SAP normalizes this data in order to avoid issues while generating payments that could result in payments being rejected by financial institutions. If you have a small volume, this information can easily be validated online on many web pages and in each bank's list of branches. If you have (or anticipate) a large volume, there are companies that provide this bank directory for specific countries or worldwide ($$$), e.g. Accuity or SWIFT. Bank Keys are a prerequisite to load Business Partner Vendors / Customers.

Bank Key master data can be migrated using a standard SAP-delivered object in the new S/4 HANA Data Migration Cockpit, in case you did not purchase a service to obtain bank directory data.

My personal preference: SWIFT Service


Chart of Accounts

This is the heart of the Finance module in SAP; without it, almost nothing can happen. It contains every single General Ledger account. It can be loaded with the old LSMW, it can be transported (see this post that I did in the past), or it can be replicated from your Golden environment into other environments through ALE (this other post too). The Chart of Accounts is a MUST for all the Finance transactional data that comes after it: Fixed Asset values, AR/AP open items, G/L balances, standard cost calculation and the inventory load.

As of S/4 HANA 1909, SAP has finally delivered a migration object for G/L Accounts within the Data Migration Cockpit. This had been much awaited since the 1st S/4 HANA version.

My personal preference: create everything in the Golden configuration environment with LSMW or Cockpit and then ALE to other environments.


Profit Center

Another object that is key for SAP Finance. Depending on the size of the company and its operations, this could be a small number of Profit Centers or a large number of records. A handful of records can easily be created manually, but that is almost never the case. For a large volume of data, you can always use LSMW or the new S/4 HANA Data Migration Cockpit, which comes out of the box with a migration object ready to use. As another alternative, to keep all your environments in sync, you can use ALE to distribute the Profit Center master data (see this post that I did in the past). Profit Centers are a prerequisite for Cost Centers, as Cost Centers need to be assigned to a Profit Center.

My personal preference: create everything in the Golden configuration environment with the Cockpit and then ALE.


Cost Center 

Another object that is key for SAP Finance. Depending on the size of the company and its operations, this could be a small number of Cost Centers or a large number of records. A handful of records can easily be created manually, but that is almost never the case. For a large volume of data, you can always use LSMW or the new S/4 HANA Data Migration Cockpit, which comes out of the box with a migration object ready to use. As another alternative, to keep all your environments in sync, you can use ALE to distribute the Cost Center master data (see this post that I did in the past).

My personal preference: create everything in the Golden client with the Cockpit and then ALE.


Fixed Assets (Master Data & Values)

This will contain all your fixed asset master records for the entire company, with their acquisition values, accumulated depreciation, useful life and depreciation method (key), among other data. Many companies will have assets that are fully depreciated (acquisition value = accumulated depreciation) and are still being migrated; this is totally normal, as they are needed for tracking purposes.
Nowadays in S/4 HANA, the migration of Fixed Assets has been simplified (see this post that I did in the past to learn about the different migration methods).

When loading the Fixed Assets values, the system will do the following Journal Entries:

Debit  Fixed Asset Balance Sheet account for Asset Class
Credit Accumulated Depreciation Balance Sheet account for Asset Class
Credit  Fixed Asset conversion account *

* This account is a Balance Sheet Account (Ex. 399998) that after loading your GL Balances, its balance will be zero.

Pre-requisite for Fixed Assets Master data are the Cost Centers and the Chart of Accounts.

My personal preference: LSMW with BAPI as described in my previous post


Projects & WBS

This will create all your Projects and their WBSs underneath them so later on you can load the project accumulated values for the ongoing projects. As a pre-requisite you will need Profit Centers and Cost Centers.
Make sure your projects are in status Released; otherwise you will not be able to post values to them later when you load your balances. You should not migrate projects and their values if they are already closed or completed.
For ongoing capital projects, you will post their values when you load the G/L balances. Instead of loading balances to your Balance Sheet Fixed Asset AuC account, you will post the details for each WBS into a P&L account (or several). After loading the G/L balances and posting to the WBSs, you will run period settlement for those projects.

Posting through GL Balances upload

Debit P&L Expense account with Cost Object WBSs
Debit/Credit All the rest of your Trial Balance

On Project Settlement

Credit  Project Settlement Account or Original Account (depending on config) with Cost Object WBS
Debit  Balance Sheet Fixed AuC account *

* The total of all your loaded project values should be equal to your Balance Sheet AuC; if not, you have an issue that you need to investigate and reconcile.

For expense-type projects, you should only load the master data and not the G/L values, since these projects settle to Cost Centers and your P&L will already contain those settled values. Unless you can easily strip those values out of the Cost Center expenses, you should not post values to them. If you do, you will repeat the same process as with the project settlement for capital projects, but for expense projects.

Project / WBS master data can be migrated using standard SAP-delivered objects in the new S/4 HANA Data Migration Cockpit.

My personal preference: Migration Cockpit, as it could get complicated with an LSMW.


Internal Orders

Here you will create master data that has Cost Centers and Profit Centers as prerequisites too. The rest of the process for the values is exactly the same as with Projects & WBSs.

Internal Orders Master Data can be migrated using a standard SAP delivered object in the new S/4 HANA Data Migration Cockpit.

My personal preference: Migration Cockpit, but it depends on the Order Type, as the Cockpit did not cover certain types. I am not sure if this has improved in newer versions; I hope so.


Business Partner - Vendor and Customer

This will create the master data of your Business Partners with all the different BP roles required for them to fully operate (Sales, Purchasing, FSCM, etc.). There are several standard SAP-delivered objects for both of them within the Data Migration Cockpit that cover the different situations, including credit. Before 1909, credit was not covered and you had to develop your own custom object to load credit limits.
Once you have your BPs loaded, with their corresponding roles, and extended to the different Company Codes, you will be able to load your AR / AP open items.

My personal preference: Migration Cockpit, no other way. Worst case, an API could be used if your system of record for BPs is something other than SAP.


AP / AR Open Items

This will contain the detail of every single unpaid Vendor and Customer invoice with its corresponding invoice number, document date (the invoice original date), payment term, original value and original currency. The posting date for this MUST be the last day of the month prior to go-live. The value in Company Code currency should already be revaluated at the month-end rate in case it is a foreign currency document.

When loading the AP / AR Open Items, the system will do the following Journal Entries:

Debit AR Subledger / AR individual Customer Account
Credit AR Conversion Account (Ex GL BS 399997)

Debit AP Conversion Account (Ex GL BS 399996)
Credit AP Subledger / AP individual Vendor Account

* One entry per Document / Invoice loaded.

The total of your AP Subledger in your Legacy System should be equal to the Sum of all migrated invoices. I know it sounds obvious, but not all legacy systems are good in terms of reconciliation, data consistency and accuracy. I have seen this several times in the past. In this case, as part of your data cleansing exercise, you should eliminate and adjust these inconsistencies. The same applies for AR.
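As a minimal sketch of this subledger check (vendor numbers and field names are illustrative, not an SAP structure), the legacy AP balance per vendor can be compared against the sum of the open items prepared for migration:

```python
# Hypothetical reconciliation sketch: legacy AP subledger balances vs.
# the sum of open items staged for migration. Any non-zero difference
# points at records to cleanse or adjust before loading.
from collections import defaultdict

def reconcile_ap(legacy_balances, migration_items):
    """legacy_balances: {vendor: balance}; migration_items: list of dicts
    with 'vendor' and 'amount'. Returns per-vendor differences."""
    migrated = defaultdict(float)
    for item in migration_items:
        migrated[item["vendor"]] += item["amount"]
    diffs = {}
    for vendor in set(legacy_balances) | set(migrated):
        delta = round(legacy_balances.get(vendor, 0.0) - migrated.get(vendor, 0.0), 2)
        if delta:
            diffs[vendor] = delta
    return diffs

legacy = {"V100": 1500.00, "V200": 250.00}
items = [
    {"vendor": "V100", "amount": 1000.00},
    {"vendor": "V100", "amount": 500.00},
    {"vendor": "V200", "amount": 200.00},  # 50.00 short vs legacy
]
print(reconcile_ap(legacy, items))  # {'V200': 50.0}
```

The exact same logic applies to AR, with customers instead of vendors.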

AP / AR Open Items can be migrated using standard SAP delivered objects in the new S/4 HANA Data Migration Cockpit.

My personal preference: LSMW with an IDOC for FI Documents. I have developed a really good one over the years and it still works like a charm.


Inventory related Objects

In case you are using Product Costing and doing Manufacturing execution in SAP, you will need this series of objects related to Inventory values.


Activity Types

Activity Types are created prior to the creation of Production Work Centers, and in conjunction with your Production Team / Consultant. They can easily be created using the standard delivered S/4 Migration object for Activity Types, or manually if there are just a few.


Activity Types rates

After having created the Activity Types, Finance has the responsibility of assigning rates to them. This is done through Tcode KP26. It can be done manually, or within the configuration behind KP26 you can create an Excel loading template so you can load a large number of records. After this, Production will be able to create the Costing tab of their Work Centers; otherwise, if there is no rate, they will get an error.


Material Prices (moving average)

During the load of your Material Master records and the creation of their Accounting views, you need to assign a Material Price to those materials that have Price Control indicator V - Moving average. If you do not do that, when you load inventory quantities or try to calculate Standard Cost, you will get errors or Inventory with no value on it.
This should be done by, and in conjunction with, your Material Master Team and within their Data Migration object (Ex. it could be an LSMW, a new Data Migration Cockpit object, or another method they could have decided on, like a BAPI or API).

Reconcile this with your Legacy system values and be ready to explain if there are any differences.


Material Prices (standard)

As part of the creation of the Material Master Accounting views, for Materials that will have Price Control Indicator S - Standard, you need to at least put a value during the Material record creation. This value will be overwritten by the standard cost calculation process later on. In this case I always suggest putting a penny (0.01), just to get by and be able to save the record.


Standard Cost Calculation

Many people consider this a conversion object; I don't. From my point of view, this is more a process than a conversion object, as you do not develop any program or conversion object for it. You just need to run the Standard Cost Calculation process (Tcode CK11N for individual materials or CK40N for mass calculation). Once you obtain a suitable Standard Cost, you will release it and your Material Master records will have a Cost for those with Price Control Indicator S - Standard. If you do not do that, when loading inventory quantities you will get Inventory with no value on it, or a penny as described before, which is almost the same as no value.

Reconcile this with your Legacy system values and be ready to explain any differences, and there is a high chance there will be some. In many Companies, some sort of dollar threshold is established prior to this exercise to accept differences. In some other cases, there are Management calls right after to discuss these differences and sign off on them.


Inventory Qty load

This is where it all ends for Inventory. Your MM or WM Team will load their Inventory Quantities and based on the Material Master Cost assigned in prior steps for the different materials (S or V), the system will do Qty * Cost and post a Journal Entry for that Inventory Initial load process.

Initial Inventory load Movement Type 561 will trigger FI-MM Account Determination GBB-BSA.

When loading Inventory, the system will do the following Journal Entries:

Debit Inventory Balance Sheet Account (as per Valuation Class of the Material record) (BSX)
Credit Inventory Balance Sheet Initial load / Conversion Account (Ex 399999) GBB-BSA

The total of your Inventory load resulting from the Quantity load should be equal to the Sum of all Inventory Balance Sheet accounts that you will load in your GL Balances. If not you have an issue and should reconcile it.

As a suggestion, I strongly recommend this approach before loading inventory quantities... Once you have Moving Averages and Standards in your system, extract them and put them in Excel. Then, before loading the Inventory Quantities, ask for the quantity file and dump it in Excel too. Do a quick VLOOKUP to populate all Materials with prices. Then do Qty * Cost and do the full sum. If it balances to your GL, you should go buy yourself a lottery ticket. If it does not balance, which is the most likely situation, start a reconciliation process Material by Material. As said before, in many Companies some sort of dollar threshold is established prior to this exercise to accept differences. In some other cases, there are Management calls right after to discuss these differences and sign off on them.
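The VLOOKUP-and-multiply check described above can also be scripted. A minimal sketch (material numbers, prices and the GL balance are made-up illustration data):

```python
# Hypothetical pre-load check: join the quantity file to the extracted
# price file, compute Qty * Cost per material and compare the grand
# total against the Inventory GL balance to be loaded.
def inventory_total(prices, quantities):
    """prices: {material: unit cost}; quantities: {material: qty}."""
    total = 0.0
    for material, qty in quantities.items():
        cost = prices.get(material)
        if cost is None:
            # No price maintained: this material would load with no value
            print(f"WARNING: no price for {material}")
            continue
        total += qty * cost
    return round(total, 2)

prices = {"MAT-A": 12.50, "MAT-B": 0.01}   # extracted standard / MAV prices
quantities = {"MAT-A": 100, "MAT-B": 40}   # from the MM/WM load file
gl_inventory_balance = 1250.40             # target GL balance (example)

calculated = inventory_total(prices, quantities)
print(calculated, "vs GL", gl_inventory_balance)
```

Any material missing a price, or any total mismatch, is a candidate for the material-by-material reconciliation described above.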


GL Balances

Now we have reached the last of the SAP Finance Conversion objects, the most important one and the one that closes the whole loop and series of conversion accounts. After posting the GL Balances, all your 39999x Conversion accounts will go down to zero. If not, you have an issue derived from one of the previous conversion objects that you need to analyze and reconcile.

As previously described for each individual object, you cannot map your GL Balances for the different subledgers and/or modules (AR, AP, Inventory, Assets, Projects and Internal Orders) directly to the subledger accounts; instead you need to map those GLs to the 39999x Conversion accounts. Each subledger load has posted its offset to the conversion account. Then the GL Balance upload comes to offset those GL Conversion Accounts (39999x).
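As a minimal sketch of that closing check (account numbers and amounts are illustrative only), every conversion account should net out to zero once the GL Balance upload has posted:

```python
# Hypothetical sketch: sum all postings per conversion account; after
# the GL Balance upload, every 39999x account should net to zero.
def conversion_account_check(postings):
    """postings: list of (account, amount); returns accounts not at zero."""
    balances = {}
    for account, amount in postings:
        balances[account] = round(balances.get(account, 0.0) + amount, 2)
    return {acct: bal for acct, bal in balances.items() if bal != 0}

postings = [
    ("0000399996",  750.00),   # debit from the AP open-item load
    ("0000399996", -750.00),   # credit from the GL Balance upload
    ("0000399999", -1250.40),  # credit from the Inventory Qty load
    ("0000399999",  1250.40),  # debit from the GL Balance upload
]
print(conversion_account_check(postings))  # {} -> everything netted out
```

Any account left in the result is the starting point for tracing back which conversion object is out of balance.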

Remember, GL Balances need to be posted with a posting date as of Dec 31st (in case your go-live is Jan 1st).

Note: always pay attention to your Parallel Currencies amounts, not just your Local / Company Code currency amounts. This is normally a challenge because not many Legacy systems have Parallel Currencies capabilities like SAP.

My personal preference: Migration Cockpit object delivered by SAP.


Timings 

Inventory load will happen right during the go-live weekend as Inventory quantities are crucial to be able to start Producing and Shipping in the new system after go-live.

Fixed Assets load will most likely happen during the 1st days after the go-live as the company still needs to close their Fixed Assets in their Legacy System, then Extract, Transform and get ready to load into SAP. So expect to receive that info a couple of days after. And this is totally normal.

AR/AP Open Items load will most likely happen during the go-live weekend. These Open Items are needed right after you start operating in the new system so you can pay your Vendors and apply cash received from Customers. There could be a minor delay of 1 or 2 days max, but more than that will have an impact on the business.

Project and Internal Order values will happen with the load of the GL Balances.

The load of your GL Balances (or the whole Trial Balance) cannot happen during the go-live weekend. No Accounting / Finance department closes their books in a weekend or during the go-live. So information to load the balances will come no sooner than mid-month after the go-live date. Only once the books have been closed, all JEs posted and everything reconciled will Finance be able to produce a file ready to upload the whole Trial Balance. This needs to happen before you start your first Month End exercise in SAP, otherwise you will not be able to produce proper Financial Statements.


Note on LSMW 

Before people start jumping at me ... I want to clarify something on this.

SAP has released notes stating that LSMW is not the preferred Migration tool for S/4 HANA, that they will not support it anymore, that using it is at your own risk, and that you should use the new S/4 Migration Cockpit instead. All good with that. But ... where I see an issue is in trying to use LSMW on objects like BP, where you cannot use recordings on those screens, and on some other screens (like ABLDT for Fixed Assets values) where you have tables and cannot manage them properly with recordings.

But LSMW is not only about recordings. You can use BAPIs and IDOCs with it and still be 100% S/4 HANA compliant, as those BAPIs and IDOCs have (most likely) been updated to be used in S/4 HANA. So using them will not break anything and data will still be properly created. LSMW with BAPIs and IDOCs is still way more performant than the Migration Cockpit, way faster. I have lived that first hand. The only thing you need is people like us (old SAP Consultants) who know how to use it for each and every single object.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Jan 19, 2021

What is the Data Migration process about? (Data Migration 101)

I decided to write this post because, during the many projects I have done over the last years, I keep receiving the same questions over and over: what is the information that needs to be converted from a Legacy System into SAP, what is the strategy that we should follow, and how to approach the different objects.

I haven't read anything on the subject out there, so I decided to write a Post about it.

Data conversion is the process of moving and translating the data from your Legacy System (any type or brand of legacy system) into your Target system during a system implementation, so the new System can take over the operations with the data ready and loaded. This process is called ETL (Extract, Transform, Load).


Process

You will Extract all the different data objects from your Legacy System Database. This extraction could be done into multiple formats: flat files (ASCII or Text), Excel files, an Access or SQL Database, any other type of Database, or Staging Tables; you could also use SAP Data Services (SAP's product for Data transformation and loading into SAP). Your data source might not necessarily be a database; it could be Excel spreadsheets too.

Once the Data has been extracted, you will start the Transformation of it. This is the process where you will build your Dictionary that will help you understand / translate your Data Values and records between your Legacy System and your New System. Why? Because your new system will have different fields, new fields, and fields that do not mean the same and/or are not used the same way as in your old system. So you will have to build correspondence tables between each value in the old system and its value in the new one. There could be values that have a 1:1 relationship, n:1 or 1:n, so you have to build all those mapping rules.

Ex. Your Customer or Vendor Numbers might need to be changed from one system to the other, Your General Ledger Accounts numbers might be different, or your old system might not have Cost Centers and your new System does. 

Like this you could have hundreds of examples within the different areas / modules of the system where you will have values that will need to be replaced by new values.

Now, these mapping tables are not static; some of them will evolve during the course of your implementation project, so you need to put in place a mechanism to keep them updated and prevent records from being rejected by your new system because you did not maintain the mapping. Some will even need to be maintained up until the very last minute. Ex. Customers and Vendors.
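The translation step above can be sketched in a few lines. This is a hypothetical illustration (the GL account mapping and field names are invented, not an SAP structure); the key idea is that unmapped values are collected for follow-up instead of failing silently:

```python
# Hypothetical mapping ("dictionary") step: replace legacy values with
# target values via a correspondence table; collect unmapped values so
# the mapping table can be maintained.
gl_map = {"4000": "0000400000", "4010": "0000401000"}   # 1:1 GL mapping

def translate(records, field, mapping):
    """Returns (translated records, set of unmapped legacy values)."""
    unmapped = set()
    out = []
    for rec in records:
        old_value = rec[field]
        if old_value in mapping:
            new = dict(rec)
            new[field] = mapping[old_value]
            out.append(new)
        else:
            unmapped.add(old_value)   # reject; fix the mapping table
    return out, unmapped

legacy = [{"gl": "4000", "amount": 100.0}, {"gl": "9999", "amount": 5.0}]
translated, missing = translate(legacy, "gl", gl_map)
print(translated)   # [{'gl': '0000400000', 'amount': 100.0}]
print(missing)      # {'9999'}
```

Real transformations chain many of these tables (1:1, n:1, 1:n) per object, but the maintain-or-reject pattern stays the same.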

Once you have all the data mapped, you will be ready to start your Load process. This is where you start Migrating and saving (loading) the data into your new System. This is an extremely important process that will impact the data quality and accuracy of your new system. There is a saying that we repeat over and over on projects ... "garbage in, garbage out". What does it mean? If you load garbage into your system, you will get garbage results out. And of course you do not want this to happen after having spent several million dollars and several months on the implementation of a new system.


Data Cleansing

One of the most important aspects that needs to be taken into consideration all along this process, and that will work towards its success, is Data Cleansing. All Legacy Systems, with no exceptions (even an old SAP one), will have garbage data or data that you do not want to make it into your new system because you do not need it anymore. You will have duplicate records, incomplete records or records that have been created by mistake. In some systems you might even have corrupted data that cannot be used or repaired anymore. All this data needs to be looked at and cleaned. If possible, this data should be removed from the Legacy System prior to its extraction, not be extracted at all, or be cleaned up after extraction. But it should never make it into your new system. This should not be negotiable.


Going back in time

You will also need to establish certain cut-off criteria and decide how far back in time you want to go to bring your data over. As a rule, you could establish that you will not transfer Customers or Vendors that you have not done business with over the last 24 or 18 months. Anyone beyond that point should be disregarded and not transferred into the new system. The same applies for any other data objects that you will be migrating. This decision will be influenced by the type of business and industry that you are in. It is not the same for someone in the retail business that sells through POS (Point of Sale) machines and does not manage Customers by name, versus a Utility company or the Government that manage millions of Customer records and might even be subject to strict regulations on data retention periods.

This cut-off criteria will also be managed on an object by object basis, as it might be different for Customers, Vendors, Materials and so on.

These are all ground rules that need to be established as part of your Data Migration Strategy.


Reconciliation

During each of the steps of the E-T-L process, you will also need to establish reconciliation processes to ensure data consistency and accuracy, and to avoid "losing" records in between each of the steps and overall during the process. Ex. If you are transferring all your Vendor Invoices, you need to know the actual total amount in your Legacy System, the total after extraction, the total before loading, and finally the total after loading into the new system. They should all be the same; if they are not, you should reconcile the differences and be able to explain and/or remediate them. Each and every single converted object should be subject to the same reconciliation process.
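A stage-by-stage control total like the one described can be sketched as follows (stage names, record counts and amounts are illustrative):

```python
# Hypothetical control-total check: the same (record count, amount)
# total is taken at each E-T-L checkpoint; any consecutive pair that
# does not match marks where records were "lost".
def control_totals(stages):
    """stages: ordered dict of stage -> (record_count, total_amount).
    Returns the list of consecutive stage pairs that do not match."""
    breaks = []
    items = list(stages.items())
    for (name_a, tot_a), (name_b, tot_b) in zip(items, items[1:]):
        if tot_a != tot_b:
            breaks.append((name_a, name_b))
    return breaks

stages = {
    "legacy":      (1200, 987654.32),
    "extracted":   (1200, 987654.32),
    "transformed": (1198, 987404.32),   # two records dropped in mapping
    "loaded":      (1198, 987404.32),
}
print(control_totals(stages))  # [('extracted', 'transformed')]
```

Here the break between extraction and transformation immediately tells you which step to investigate, rather than discovering a difference only at the very end.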

Depending on the number and type of records that you are converting, you might want to establish different approaches like random checks, spot checks, statistical samples or a 100% record check. The first check that should be applied is based on the number of records. Then, if you are dealing with Quantities or Currency amounts, those should (or must) all balance to the penny.


Mock loads

This process will not, and should not, be done only once. This is an iterative process that will take several attempts with different targeted accuracy until you reach 100% (or almost). During the course of your project you will repeat this end-to-end process several times. Depending on the size of the project it could be 2, 3 or even up to 4 times, where you will establish different and increasing accuracy levels. These attempts are called "Mock data loads" (Mock 1, 2, 3, 4). These will be really big milestones in your project.


Build

While you are working on and building the individual ETL process steps, you will make many individual tries and attempts, isolated from the whole process.

You will build your Extraction program or query and while at it, you will tweak it and refine it until it meets all the requirements that you established for it.

You will work on your Transformation process, build it, map the data from the source to the target data structure. Test the transformation process.

And finally you will work on your loading program/s. You will attempt to load 1 record (happy path or sunny day), clean all the data errors and solve any issues that your program might have. Then attempt to load 5 or 10 records, analyze the issues and rejections, fix them and load again. Then you will attempt to load specific and complex data scenarios (rainy day). Repeat until it is working correctly. Finally you will go for volume. This is where you can expect everything to blow up and produce tons of errors. That is to be expected. You have to work on them until every single record passes.

Once you have all of that, you would think you have a solid process. That is the point where you run the whole ETL process end-to-end for the specific object you are working with. And of course adjust and fix.

Finally, you will attempt to load all the different conversion objects (Customers, Vendors, Materials, GL Accounts, Inventory, etc.). All the previously described steps are cogs of the big and complex machine that is your whole data migration project.

As mentioned before, you will do several Mock runs, which are nothing more than a dress rehearsal. All pointing towards the big event: your final cut-over, which is when you will go from your Legacy System to your new System.


Environments

During this iterative process of building your whole ETL, you will generate a lot of throwaway data. This will pollute your Database and could impact other people who want to test programs, processes and reports that could be distorted by your data. That is why all these attempts should be done in environments, instances and/or databases parallel to the ones used by the rest of the project team.

For this you will have to plan to have separate instances and/or Databases that will need to be refreshed / wiped out several times during the course of the project. 


Dependencies

In the overall plan of this Data Conversion work, you need to work on scheduling and putting each conversion object in the right order of execution. This cannot be done on the day of the cut-over; it needs to be done early on in the project. You need to establish the right sequence of events and loads. For this you will build something like an MS Project plan, Gantt chart or PERT diagram, and establish relationships and dependencies (Start-Start, Start-Finish, Finish-Start, etc).

Examples 

  • If you want to load your GL Balances, you will need to have previously loaded your Chart of Accounts that contains the list of GL Accounts. Otherwise your system will not have your GL accounts to load your Balances.
  • If you want to load your Bill of Materials (BOM), a pre-requisite (dependency) will be to have loaded your Material Master records.

Every single conversion object could have a predecessor/s and a successor/s.
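Once the predecessor/successor relationships are captured, a valid load order can be derived mechanically with a topological sort. A minimal sketch (the object names and dependencies are examples only; this assumes Python 3.9+ for the standard `graphlib` module):

```python
# Hypothetical sketch: derive a valid conversion load order from
# predecessor relationships using a topological sort.
from graphlib import TopologicalSorter

# Each object maps to the set of objects that must be loaded before it.
dependencies = {
    "GL Balances":      {"Chart of Accounts", "AP/AR Open Items", "Inventory Qty"},
    "BOM":              {"Material Master"},
    "Inventory Qty":    {"Material Master"},
    "AP/AR Open Items": {"Business Partners"},
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)  # predecessors always come before their successors
```

This is essentially what your Gantt chart or PERT diagram encodes; the sort simply guarantees no object is loaded before its prerequisites, and it will raise an error if you accidentally define a circular dependency.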


Data Snapshot

When doing your Mock rehearsal exercises, it is extremely important that the data you extract be taken out of the Legacy System (or Systems, if there are more than one) all at the same time (or almost). Why? For consistency and reconciliation purposes. If your Data Extraction (snapshot) for one module or system has a time gap in between, and new records are created or updated in the other system that uses related data, you risk having inconsistencies that make your reconciliation efforts much harder, or in some cases almost impossible.

Example

  • Inventory values are handled in your Finance system that has an interface with your Warehouse system that handles Quantity. If you do not extract both at the same time and you have days in between, your Amounts will not balance with your Quantities.


History

During many of the projects that I have done in the past, a lot of clients wanted us to Migrate historical transactional data into the system. In 99% of the cases this is not possible. In an integrated system, posting transactional data has consequences and impacts that cannot be avoided. Ex. If I want to post all historical inventory movements, any inventory transaction that I post will have its corresponding accounting impact, which will then reflect in my Balance Sheet and/or P&L. So if I do that, I will have to find a way to counteract the effect of having posted many inventory transactions that impacted my Accounting books. For that reason, it is almost impossible to post many of the historical transactions, and we can only Migrate a snapshot at a given time and date.

For this reason, companies will have to take this into consideration and provide read-only access to their old Legacy Systems for a certain period of time for traceability, investigation, reference and audit purposes. In the majority of countries, tax authorities and governments can go back in time "X" number of years and ask you for information. That is why you might have to keep those systems alive, dump the information into Tables, or use any other method that allows you to trace back information and provide it to the authorities.


In my next Post, I will be talking about individual conversion objects in SAP Finance with its particularities and strategy.

SAP Finance conversion objects, all about them


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.