Nov 26, 2019

Positive Pay in S/4 HANA times (1709 and up)


I will start by explaining what Positive Pay is (if you already know, you can skip this part). It is a popular service that North American banks provide to prevent fraud on the printed checks you issue. You issue / print a check to pay a Vendor and mail it via regular post. At the same time, usually by the end of the day, you produce a file as per your bank's specifications for this service, listing all the issued checks with date, amount and beneficiary, and you send it to your bank.

Once your Vendor presents the issued check at the bank to be paid, the bank cross-references it against the information you provided. If any of the data does not match, the bank stops the payment of that check and sends you an exception notice (online). Then you decide whether the check gets paid or not. This way you can prevent check fraud by someone who altered information on the check (usually the amount and/or the beneficiary) to get away with it. If you do not have this service, your bank could easily end up paying the altered check, leaving you on the hook to absorb the fraud.
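Just to illustrate the idea (completely outside SAP), here is a minimal sketch of the cross-reference logic the bank performs against your issue file. The file layout, column names and sample values are all invented for the example; your bank's Positive Pay specification defines the real ones.

```python
import csv

# Positive Pay issue file you sent to the bank (hypothetical CSV layout).
# One row per issued check: check number, issue date, amount, payee.
issued = {}
with open("positive_pay_issue_file.csv", newline="") as f:
    for row in csv.DictReader(f):
        issued[row["check_number"]] = (row["date"], row["amount"], row["payee"])

def presented_check_matches(check_number, date, amount, payee):
    """Return True if a presented check matches what was reported as issued."""
    return issued.get(check_number) == (date, amount, payee)

# A $450 check altered to $23,000 would not match and would raise an exception notice.
print(presented_check_matches("100045", "2019-11-26", "450.00", "ACME Supplies Inc."))
```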

A few years ago, a client of mine had a check altered. The mail was somehow intercepted, the fraudster copied all the information from the check onto a brand new one, and a $450 CAD check got converted into a $23k CAD check. The bank ended up paying it. Luckily for my client, we also had Electronic Bank Statement (EBS) in place, and the day after the check was paid, when we tried to reconcile it, the system could not find that check number with that amount. We started investigating in our system and found out that the check number was actually for $450 instead of $23k. My client called the bank right away and, thank God, the bank recognized the fraud and absorbed the hit. Positive Pay would have been ideal here, stopping that check from being paid in the first place.

How did we do Positive Pay in SAP before S/4 HANA 1709?


Standard SAP Tcode FCHX (Program RFCHKE00) is designed to output a file that provides this exact information (up to a point) so you can send it to your bank. But… there is always a but. Each and every bank has its own set of requirements when it comes to Positive Pay, and the file specifications vary from bank to bank; they are not standardized like EBS or electronic payments. So this program will not work out of the box for you. If it does, you should buy a lottery ticket!

This program works with 2 internal structures, DTACHKH (header) and DTACHKP (line items / checks), where all the “necessary” information is output. I put necessary in quotes because in some cases your bank might ask you for extra information.

Until today, there were a few alternatives to solve this:

#1 – Take a copy of this program and make a Z transaction (this is the most popular one and I have done it a couple of times already). You will also copy these 2 structures and make them Z. Once you have the copied program, you can control the information you need, the file format, the logic, etc. That way you will output a file as per your bank's specifications. If you need this for several banks, you will build as many Z programs as banks you have. Or you can enhance it further with some sort of bank selection so you control the different output file formats all in one program.

#2 – Modify the standard delivered structures (DTACHKH and DTACHKP) to include or remove what you need. This does not give you a lot of flexibility and you will be limited to only 1 bank. If in the future you have another bank, this approach will not work.

#3 – Do a fully custom program to create your own extract and build your file as per your bank's needs. In the end, all the information is somehow stored in tables REGUP and REGUH. It is more work, as you have to build a program from scratch, a selection screen, file handling, etc. In a sense, it is what FCHX already does. I have seen this approach suggested by people on the Internet. (Not the one I would go with.)

#4 – Use FCHX the way it is out of the box and let it create the file. Then create a 2nd program that grabs this file and formats it (and enhances it if needed) the way you need it for your bank. (An old client of mine did it this way; see the sketch after this list.)

#5 – Do the same as option #4, but transform the file directly in your middleware and send it to the bank from there.
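Just to illustrate the reformatting idea behind options #4 and #5 (outside of SAP, purely as a sketch), this is roughly what the second step looks like: read a generic extract of the issued checks and rewrite it into a bank-specific layout. The CSV columns, field positions and file names below are all invented for the example; your bank's Positive Pay specification defines the real ones.

```python
import csv

# Generic extract of issued checks (e.g. downloaded from the FCHX output),
# converted to CSV. The columns and the fixed-width target layout below are
# hypothetical; replace them with your bank's actual specification.
with open("checks_extract.csv", newline="") as src, open("positive_pay.txt", "w") as out:
    for row in csv.DictReader(src):
        line = (
            row["account_number"].rjust(10, "0")      # bank account, zero-padded
            + row["check_number"].rjust(10, "0")      # check number
            + row["issue_date"]                       # YYYYMMDD
            + f"{float(row['amount']):012.2f}"        # amount, fixed width
            + row["payee"].ljust(40)[:40]             # beneficiary, padded / truncated
        )
        out.write(line + "\n")
```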

As you can see, there are many options, each with its pros/cons and more or less effort.

My preferred option was always #1, up until I recently discovered a new SAP standard-delivered way that comes with S/4 HANA 1709 (FPS02), which is the environment I was working with.

How do we do Positive Pay in SAP S/4 HANA 1709 today ?


Way simpler, quicker and less custom (Z) than it used to be…
Now standard SAP Tcode FCHX (Program RFCHKE00) allows you to work with a DMEE structure, so you can design and output your custom file as per your bank's needs without the need for ABAP, program or structure modifications. You just design your output file with Tcode DMEEX (Extended DMEE) and then you can output the file.

As you can see in the screenshot below, FCHX has a line at the bottom where you can specify your “Payment Medium format”. This last line / option was not there in previous versions, like ECC6.0. As of 1709 you now have this possibility.

Then, when it comes to building your output file, you work with DMEEX (Extended, not the traditional DMEE, but it is almost the same) the same way as if you were building any other electronic payment file for your bank. But instead of this format being called by your Payment Program (F110), it will be called by FCHX.




Once you have filled out the corresponding information in your selection screen, you execute it and a file will be created (see the confirmation message below).


Now… where does that file go? It is created in TemSe, and you access it with Tcode FDTA (Fiori App “Manage Payment Media”).

Then you go there, enter your selection criteria and execute.


Once you are here, you select the line and download the file as you might already be doing with any other electronic payment file.



In this example I created a custom DMEE format called “ZBOA_US_POSIPAY” as per my bank's specifications, in this case Bank of America.

With S/4 HANA 1709, there is a standard-delivered DMEEX format for Positive Pay that you can use as a starting point to create your own custom DMEE. This DMEE is called “US_POSIPAY”. You copy it and create your own Z format. You can create as many custom DMEE formats as banks you have. This way you do not need to change your FCHX program; you only need to change the “Payment Medium format” in the execution screen. This way you stay standard and there are no issues when it comes to an upgrade.

This saves you the need for ABAP, and you are completely independent, as any functional FICO consultant with the right knowledge can do it. It does not require development anymore.

Note: out of the box, FCHX has an issue working with any DMEE format other than the standard-delivered US_POSIPAY, and it requires you to implement an OSS Note to correct this bug. I was probably one of the first in the world to attempt to use this solution, as it was not working and SAP did not know about it. I ended up reporting it to SAP via an OSS Message. It took around 2 months of exchanges with them before they finally released this OSS Note. After that, it works well. So now I am able to say that there is an OSS Note out there that was released thanks to me.
This Note applies to S4CORE 102 (1709), 103 (1809), 104 (1909).


OSS Note



If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Oct 26, 2019

Intercompany clearing tables for OBYA


Recently, at a customer that had around 20 Company Codes in their system, I faced a requirement to perform the Intercompany clearing config in Tcode OBYA for all the possible combinations / permutations between those 20 company codes.

Considering how this config works on the screen, it is hard to verify that you are not missing any combination and that no relationship has been forgotten.

For that, you have no choice but to look directly at the table in SE16N and do your analysis there.




To see this config, you will have to do SE16N on Table T001U – Clearing Between Company Codes
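If you want to automate that analysis, here is a minimal sketch (outside of SAP) of the kind of cross-check you can do with a T001U extract downloaded from SE16N to CSV. The column names BUKRS / BUKR2 and the assumption that T001U holds one row per direction are assumptions; verify them against your own extract (if it only stores one row per pair, switch permutations to combinations).

```python
from itertools import permutations
import csv

company_codes = ["1000", "2000", "3000"]  # your ~20 company codes

# Every directed relationship we expect to find configured via OBYA.
# With 20 company codes that is 20 * 19 = 380 directed pairs (190 OBYA screens).
expected = set(permutations(company_codes, 2))

# T001U exported from SE16N to CSV; column names are placeholders.
configured = set()
with open("t001u_extract.csv", newline="") as f:
    for row in csv.DictReader(f):
        configured.add((row["BUKRS"], row["BUKR2"]))

missing = sorted(expected - configured)
print(f"{len(missing)} relationships still missing in OBYA:")
for pair in missing:
    print(pair)
```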


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Upgrading from SAP S/4 HANA 1610 to 1709 will give you a weird PCA message

Here is how to solve it …..

Recently at a customer, I went through an upgrade from SAP S/4 HANA 1610 to 1709, and right after the technical upgrade was completed by the Basis team, we started getting a weird message about Profit Center Accounting (PCA) not being active.

For those of us who have been around for a while and have worked with ECC6.0 (and even ECC5.0), we know that ever since the introduction of New GL (back in 2004 for ECC5.0 and 2005 for ECC6.0), SAP has incorporated the Profit Center Accounting scenario into FI, and you no longer needed to activate Profit Center Accounting on the CO (Controlling) side.

With this, you no longer had to run reports out of Ledger 8A in PCA, and the old GLPCA tables were no longer filled. Same for GLT0. All of this, of course, provided you had activated New GL in a greenfield implementation (or migrated to it).

Now, all of a sudden, many versions and years later, I was getting a weird error message in SAP after the upgrade to 1709 telling me that PCA was not active. Because of this, the system came to a halt and no financial postings could go through. All of this in a perfectly healthy system that had been running without any issues on 1610, without PCA being active.

As a side note, the environment came from an old ECC6.0 upgraded to S/4 HANA 1610, but this client never had Classic GL nor PCA activated before; they were on New GL right from the beginning.

There was no mention of anything like this in the 1709 Simplification Notes (at least at the time, in Oct-2018), and no OSS Notes on the subject either.

The error message was FINS_ACDOC_CUST415


If you look at the message, it is a little bit confusing. On one side it says that the scenario is obsolete and needs to be made transparent, so you need to activate it; on the other side it says that activating Ledger 8A is not mandatory. But without that option activated, your system still will not work. So?? Make up your mind, SAP??

What was the solution then ??

So the solution was to activate PCA Posting in Tcode 0KE5 and save. But this SAVE does not trigger a transport; this Tcode does not create one.

Then you would trigger a transport with Tcode 0KEP and move it across your landscape.


Afterwards, the posting problem was solved, the old PCA tables (e.g. GLPCA) were still not getting posted to, our error was gone and the system was back to normal.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Feb 28, 2019

ALE Distribution of Master Data

A couple of months (posts) ago I wrote about propagating the Chart of Accounts and GLs through ALE to other environments. But GL accounts are not the only objects that can be transferred through ALE. In fact, there are tons of Master Data objects that can be transferred using ALE in SAP.

I put together a small list of Master Data objects that are relevant for Finance (from my perspective).

As I mentioned before in my other post (ALE Chart of Accounts to other environments), you need to create / update your Distribution Model before you can start defining new objects that you want to transfer through ALE.

Below is a table with the top objects that I normally transfer through ALE, with their corresponding Message Types, the Tcodes to trigger them and the setup for the Partner Profiles.



These are the Outbound Partner Profiles



These are the Inbound Partner Profiles


Remember, as I mentioned in my other post: once you trigger your IDOCs, you should be able to monitor them in the sending system with WE02, and then in the receiving system with WE02 as well. If you have errors, they most likely come from your Partner Profile setups; check them and re-process or re-trigger the IDOCs.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Feb 4, 2019

SAP Product Costing, Cost Estimates Tables


These are some important tables within the SAP Product Costing module. These tables store all the data related to Cost Estimates (aka Standard Cost Calculation executions).

Each time you run a Cost Estimate (Tcode CK11N individual / CK40N mass), internally a Cost Estimate number is created to store the execution for your Material.

Table: KEKO - Product Costing - Header Data
This table contains a header record for each Material, Costing Date and Plant. It also contains the dates, lot size, unit of measure, the user who executed it, the costing variant used and the date the Cost Estimate was marked and released, among much other data.



Then you have other tables that give you the details of the data contained in this Header / Cost Estimate execution.

Table: KEPH - Product Costing: Cost Components for Cost of Goods Mfd
This table contains the total calculated amounts for each of the Cost Components according to your Cost Component config.
Your link between this table and KEKO is the Product Costing number / Cost Estimate number. There is one line per header record in KEKO.
You have cost fields KST001 all the way to KST040, for a total of 40 different Cost Components.


Finally you have Table CKIS

Table: CKIS - Items Unit Costing/Itemization Product Costing
This table contains every single line item of your Cost Estimates as per your BOM (Bill of Materials) and Routing: one record per Material contained in the BOM and/or activity in the Routing, with the Cost Element derived from it, Plant, Material, determined price, quantity, unit of measure and total valuation, among much other information.
The link with KEKO is the Cost Estimate number.



With all this info, you could easily reconstruct your Cost Estimate results.
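As an illustration (outside of SAP), here is a minimal sketch of that reconstruction using extracts of the three tables downloaded to CSV. I am assuming KALNR is the cost estimate number linking them and that the other column names match the extracts; verify the key fields in your own system.

```python
import pandas as pd

# Extracts of the three tables (e.g. downloaded from SE16N to CSV).
# Column names are illustrative assumptions, not a definitive data model.
keko = pd.read_csv("keko.csv")   # header: KALNR, MATNR, WERKS, KADKY, ...
keph = pd.read_csv("keph.csv")   # cost components: KALNR, KST001..KST040
ckis = pd.read_csv("ckis.csv")   # itemization: KALNR, POSNR, KSTAR, WERTN, ...

# Total cost per estimate = sum of the 40 cost component fields.
cost_cols = [f"KST{i:03d}" for i in range(1, 41)]
keph["total_cost"] = keph[cost_cols].sum(axis=1)

# Rebuild a header + total view, then attach the itemization lines.
header = keko.merge(keph[["KALNR", "total_cost"]], on="KALNR", how="left")
detail = header.merge(ckis, on="KALNR", how="left", suffixes=("", "_item"))

print(header[["KALNR", "MATNR", "WERKS", "total_cost"]].head())
print(f"{len(detail)} itemization lines reconstructed")
```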


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Jan 30, 2019

Fixed Asset Migration in times of S/4 HANA

Fixed Assets Migration in times of S/4 HANA is way easier and more straightforward than it used to be back in ECC 6.0 (or prior). This is all thanks to the restructuring and simplification that SAP has done in Finance since the introduction of Simple Finance and subsequently S/4 HANA. Now the Fixed Asset module is completely integrated with the Universal Journal, same as AP and AR.
Because of that we now have fewer steps, tweaks and reconciliations to do during the migration.

Transfer Date specifications

We will start by setting our transfer date to the last date of the last Fiscal Year in our legacy system (taking into consideration that my go-live date is the same as my new Fiscal Year start, so no mid-year migration).

In S/4 HANA 1610 you still need to set the Transfer Date through the IMG, with a transport that you will have to promote all the way up to Production, as we used to do in ECC6.0.


Here you will specify the last date of your prior Fiscal Year. E.g. for a go-live on 2019-01-01, you will enter 2018-12-31.


In S/4 HANA 1709, this IMG entry has been removed and replaced by Tcode FAA_CMP_LDT, which simplifies it. This Tcode can now be accessed and edited in any client / environment, and you no longer need a transport to specify the date.

Here you specify the Transfer Date, the Transfer Status of your Company Code and also the FI Document Type that will be used when posting the Asset values to the GL. There is also an interesting option to have the system calculate the planned depreciation figures for each Asset already at the time of migration. If not, you will have to subsequently run AFAR (Depreciation Calculation) for that. In times of S/4 HANA and great performance, I do not see why we would not calculate it at creation.

So once we have specified the date in our config, we are ready for the next step.

Fiscal Year change

If our go-live date is 2019-01-01, then I need to do the Fiscal Year balance carryforward so that FY2018 is considered closed in the system. This is a prerequisite to be able to load my Asset values as of 2018-12-31. Without this step, I will get an error.

Execute FAGLGVTR to carry forward your balances. Even if you have no postings in FY2018, you still need to execute this step, as it is the one that will mark FY2018 as closed in the system. (Once executed, you can verify this in Tcode OAAQ for your Company Code.)

If you have more than one Ledger, you need to execute FAGLGVTR for each Ledger.

If your FY did not change when you check it with OAAQ, then you might have to execute Tcode AJAB to close the FY and then re-execute FAGLGVTR.

Migration Options

The approach recommended by SAP for S/4 HANA is to use the new Migration Cockpit (Tcode LTMC), as per OSS Note 2287723. For that, SAP has delivered a standard migration object that can be used to migrate your Fixed Assets Master Data as well as the Fixed Assets balances (acquisition costs and accumulated depreciation). Behind the scenes, this standard migration object uses BAPI FIXEDASSET_CREATEINCLVALUES, which takes care of creating the Asset Master Data as well as the values.
This BAPI creates the IDOCs, but it is exactly the same as BAPI_FIXEDASSET_OVRTAKE_CREATE (Business Object BUS1022 method) used in LSMW.



Before S/4 HANA and Simple Finance, in ECC we used to create the Asset Master Data with Tcode AS91/2, and from there we also had the option to enter the "Takeover values". Now this last option for the values has been replaced by Tcode ABLDT, which, at the same time as it posts the values to the sub-ledger, also posts to the GL with an FI Document. In fact, due to the whole set of changes SAP has performed since Simple Finance, there is no separate sub-ledger anymore and everything posts to the GL.
Because of this, there is no longer any need to tweak the setup of the Asset reconciliation accounts ON/OFF to post a separate entry with the total value of the migrated Assets; ABLDT does the posting directly to the GL. So, no more reconciliation between GL and sub-ledger either, as we used to do in ECC.

The 2nd option to migrate is to continue using our old and beloved LSMW. There are several OSS Notes on the subject where SAP no longer recommends the use of LSMW in S/4; but in some cases SAP has not yet delivered a Migration Object for certain pieces of Master or Transactional Data. In the case of Fixed Assets, we do have a Migration Object, but it still has certain limitations that I will comment on later in this blog post.

Because of some of these limitations, I still continue to use LSMW for Fixed Assets Master Data and values.
The LSMW that I built over the years (even back in ECC6.0) still works like a charm, is fully compatible with S/4 HANA and has zero side effects. Why? Because I built an LSMW that uses BAPI FIXEDASSET_CREATEINCLVALUES, which ends up triggering IDOCs. So this BAPI still does the same thing as the standard Migration Object that SAP built. In the end, it is exactly the same. The only trick is that you really need to know how and where to provide the right values in the right segments or structures, which can be a challenge initially.

Once executed (either of the 2, Migration Cockpit or LSMW), you will have all the Master Data and values created and posted to the GL; you are then left with just running a standard Fixed Asset report to verify the total amount of your migrated Assets and reconcile them.

Limitations of the Migration Cockpit - Fixed Assets object

From real-life experience, I can say that I found certain limitations in the Fixed Asset Migration Cockpit object delivered by SAP. I later found a couple of OSS Notes that talk about it (can't find them now).

#1 - For a large volume of Assets, it is not recommended to use the Migration Cockpit, as the performance is really inferior compared to the BAPI running in LSMW.

As an example, one of my latest clients had 21,000 Fixed Assets to migrate. With the Migration Cockpit, the calculation was giving me something like 10 or more hours of execution for that amount of records.
With the LSMW using the same BAPI with IDOCs, it went down to 3 hours. This after the application of OSS Note 2616079, which also improves the performance. But the LSMW option is still recommended.
There is also another OSS Note where SAP does not recommend the use of the Migration Cockpit for large volumes of Assets and recommends LSMW instead, contradicting the other Note.
If your client is an Asset-intensive company (e.g. Mining, Oil & Gas, or one with large manufacturing Plants), chances are that you will easily have more than 10k Assets.
I remember an old Mining company where I had 45k Assets.

#2 - The new Migration Cockpit in S/4 is not able to open XML files bigger than 100 MB. This is a standard limitation (also documented in another OSS Note). This might sound like a lot compared with ASCII / TXT files, but not for XML-based ones.

The Migration Cockpit templates are based on XML files that you normally handle, manipulate and populate using Excel. XML files tend to be large, normally several MB, due to the large amount of characters inside them and the tons of tags that the XML standard requires. This ends up creating big file sizes.

As an example, for this client that had 21k Assets to be migrated, an XML file with all 21k was around 450 MB. So a file containing around 5k records was reaching, and in some cases going over, 100 MB. I would have needed to create 4 or 5 files of less than 100 MB each. Plus the significantly longer loading time of the Migration Cockpit vs. LSMW.
In this case, I guess that SAP still needs to improve the performance and handling of large data volumes in the Migration Cockpit.
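Just as a back-of-the-envelope sketch of that math, using the approximate figures from this migration:

```python
import math

# Approximate figures from this migration: 21,000 assets produced a ~450 MB XML.
total_assets = 21_000
total_size_mb = 450
cockpit_limit_mb = 100  # Migration Cockpit file size limit

mb_per_asset = total_size_mb / total_assets                    # ~0.02 MB per record
assets_per_file = math.floor(cockpit_limit_mb / mb_per_asset)  # ~4,600 records per file
files_needed = math.ceil(total_assets / assets_per_file)       # 5 files

print(f"{mb_per_asset * 1024:.0f} KB per asset, "
      f"max {assets_per_file} assets per file, {files_needed} files needed")
```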


Additional information
2537549 - Collective SAP Note and FAQ for SAP S/4HANA Migration cockpit (on premise)


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Jan 8, 2019

ALE Chart of Accounts to other environments

One of the big challenges during an SAP FICO implementation is the creation of numerous GL accounts all along the project. Depending on the type of client and project, we could start from a standard SAP chart of accounts and build on top of it; but no matter what, we will always have to create new GL accounts. In some projects I ended up creating up to 150 new GL accounts out of a chart of accounts of 700 GL accounts, which is more than 20% of new accounts to create.
In those cases, you would end up creating some sort of migration program like an LSMW, or something else. But unfortunately, those accounts sometimes do not come all at once from your client, as they are still deciding and have not produced a final list.
So every time a new series of accounts needs to be created, you need to create them in several environments (DEV, QA, Training and PROD). Except for PROD, all the rest will most likely have several clients each where you will also have to create the accounts. At the end you have a minimum of 8, if not more, environments / clients to maintain if you want to keep everything in sync, which can be quite a challenge and effort. Keeping them in sync is highly recommended, because if you do not, you will have inconsistencies and discrepancies all over the place. And in most cases you need to create them in your Golden DEV Config client, as you normally also need to update config for Document Splitting, Automatic Account Determination or other config points.

So, how do you keep all the environments in sync without a lot of effort? There are different ways of doing it, which I will list here from most to least effort:

#1 - Create all the accounts manually in each environment / client.
#2 - Create an LSMW and run it on every single environment / client.
#3 - Transport the entire Chart of Accounts (including the Co.Code portion of it, as I already mentioned in another post) 
#4 - ALE the Chart of Accounts from your Config Golden DEV Client to any other clients and environments (including Prod).

How do you ALE your Chart of Accounts across your SAP Landscape ?

#1 You will need to create what is called a "Distribution Model", which will allow you to do the setup for your ALE. (Application Link Enabling (ALE) is a mechanism for the exchange of business data between loosely coupled SAP applications.)

Tcode: SALE (in your Golden DEV Config client, DEV-200)


Create your Distribution Model

First you create the Model (like ALEDISTRIB in the image), then you create a sub-level which will be your source client (e.g. S4DCLNT200): always the 3 letters of the instance name, then CLNT for client, and last the client number.
After that you create another level underneath S4DCLNT200 for your Target client; in my case S4DCLNT211.


Then you will click in "Add Message Type" and this is where you will add your IDOC  / Message Type that we will be used to ALE our Chart of accounts.
We will add two different ones.

GLCORE: Chart of Accounts section / level
GLMAST : Chart of Account and Company Code level (recommended)

As you can see, I have another set of Message Types in this Distribution Model which are used for other Master Data objects (like Profit Centers, Cost Centers, etc.); these will eventually be covered in other posts.

Now that you have created the Distribution Model and added your Message Types, you will need to create your Partner Profiles so that when the IDOCs are generated, the system knows what to do with them and where to send them. There is an automatic option that will do that for you.





After running this process, your Outbound Partner Profiles will be created in WE20 (see the result below).
** At this point you might receive some errors due to missing RFC connections from 200 to any other client. Raise a request to your Basis resource to create all of these missing RFCs and re-do the Generate Partner Profile step.




At this point, your setup in your DEV Config Golden client is complete. If you want to expand your Distribution Model so you can ALE your Chart of Accounts to other Target clients or environments, you need to create another sub-level hanging from S4DCLNT200, e.g. S4QCLNT400 for a QA 400 client, and create the Outbound Partner Profiles as well.

Create Inbound Partner Profile in your Target Client/s

Now for every Target client that you have defined in your Distribution Model, you need to go in individually and create your Inbound Partner profiles so the Target system can receive and process the IDOCs.

Tcode: WE20 (Ex in DEV-211 as my Target Client)


Once you have created it, SAVE it.
After this, you have finished the ALE setup that will allow you to distribute your Chart of Accounts Master Data from client 200 into client 211.

Send Chart of Accounts from 200 to 211 using ALE

Tcode: BD18


This Tcode is the one that triggers the IDOC creation based on the selection criteria we enter (e.g. Chart of Accounts, Company Code, GL numbers).
Then you specify your Logical Message GLMAST (to transfer Chart of Accounts and Company Code data) and the receiving system.
Then execute, and the system will start generating the IDOCs. The process might take a while to generate all the IDOCs, so be patient.

To monitor the progress, you might want to open IDOC monitoring transactions like WE02 / WE05 in both clients (Sender and Receiver) so you can see the number of IDOCs being processed, sent, or in error.

WE02 in Sender Client 

WE02 in Receiver Client

In some cases and under unknown circumstances, the Sender system generates the IDOCs but does not process / send them right away. In that case you can execute BD87 to re-process them and they will get sent out.

In future Posts I will publish other Master Data Objects that can be transferred using ALE.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Jan 4, 2019

Regenerate ABAP code for Validations and Substitutions

When you create Validations and/or Substitutions in SAP, behind the scenes, when you save them, the system transforms them into real ABAP code, so that when they are executed, the system ends up executing ABAP code.
When you transport them into any other environment, you will see that at the end of the transport import the system regenerates code. But in most cases you will still need to regenerate the code for those Validations and Substitutions manually; otherwise you could face certain issues (OSS Note 2512768 explains this). These are some of the issues you could face:


  • When you try to post a Journal Entry with FB50, FB60 or similar, the system may annoyingly tell you that it needs to run a regeneration program and that you need to start over. This can happen twice. If a user receives this type of error, he/she will probably freak out and end up raising an incident for something easy and quick to fix that could have been prevented. Error GB073 - Generation successful, but you must call up the function again.
  • Other types of posting attempts from other modules, like MM, could directly fail, as those interfaces do not try to trigger these regenerations.


So, to avoid any issues, any time you transport Validations and Substitutions, right after the transport goes in you should run program RGUGBR00 (with SE38) to regenerate the ABAP code for them.


To play it safe, you should activate all the "generate" options, with * in "Appl area" and "Callup".


This not only applies to FI Validations and Substitutions; it also applies to all other types that exist in other SAP modules like PS, SL, EC-CS, etc. So this recipe should be followed for those as well.

Other info on Validations / Substitutions
842318 - Frequently asked questions about validations and substitutions


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

Jan 3, 2019

How to turn off the new S/4 HANA Bank account creation workflow

Among all the changes that SAP has made in the Banking area in S/4 HANA, SAP has introduced a workflow that gets triggered every time a new Bank Account is created. This is meant for people to approve the Bank Account before it becomes active and ready to be used in the configuration.

Personally, I do not see a use for this functionality in small or medium companies. Maybe it could be useful for really large companies that create accounts on a constant basis; but even then, if you want to fully manage that account and issue payments with it, you still need to do quite some configuration to get it working.


In this step we will deactivate the Bank Account creation workflow, the new S/4 functionality whereby every time you create a new bank account a workflow is triggered that goes for approval before you are able to continue your config for the new bank account.

*** This is a one-time setup and does not need to be re-done in the future ***




Deactivate “Linkage Activated”



This way, when a new Bank Account is created in the Manage Bank Accounts NWBC App, the workflow will not get triggered. If you do not do this, once you create accounts they will go through a workflow and you will most likely not know who has to approve them or where. You will end up having to go in as WF Admin and force the approval.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.

S/4 HANA Banking configuration / Bank Account creation

In S/4 HANA there has been a complete change in the way Banking configuration needs to be done in the system.
House Banks are no longer created in Transaction FI12 or accessible through the Payment Program config (FBZP); FI12 has been deprecated. Now you need to use Tcode FI12_HBANK. This can still be transported.

But you can no longer create the Bank Account IDs or Bank Account numbers through the Payment Program. In S/4 HANA you need to create the Account IDs through "Manage Bank Accounts" in the SAP NetWeaver Business Client.
For this, your user will need to have the SAP standard role SAP_SFIN_CASH_MANAGER assigned in your configuration client. This will give you access to Bank Account Management.
Once assigned, you launch it with Tcode NWBC.

*NWBC services might not be active in your environment. In that case, Basis needs to activate the service in SICF.




You will click in "Manage Bank Accounts"



Then you will click in "New Bank Account" 
Fill in all the required information (*). Same as before S/4 the Bank Key needs to have been previously created in FI01 with all the branch address and data.




Here we put the Co.Code, Name of the Account Holder, Bank country, Bank Key (transit created previously), Account #, account description and account type.
<Save as Active>
<Edit>
<Connectivity Path>
<Add>


CAD is the Account ID that we have assigned to the Account we are creating.
Also, we need to assign the MAIN BANK GL ACCOUNT (the one that normally ends with 00).



<Save as active>



Then we will see the newly created account in the list of Accounts.

This step needs to be done in your Config client (e.g. DEV-200), but it is not a transportable step, so it needs to be redone in every single environment. For that, SAP has an EXPORT process that generates an XML file with the list of Bank Account IDs created, which will have to be imported into DEV-210 and, in the future, into QA and PROD.

It is important to keep that XML file and not modify its contents. If necessary, the XML can be generated again by going back to DEV-200 and repeating the EXPORT process described in the next steps.


Here we will EXPORT and generate the XML file that contains the accounts.



Import Bank Accounts in DEV-210 (also applicable to QA and PROD when we get there)

Log in to DEV-210 and execute NWBC (make sure you close the other NWBC browser window, as you can easily get confused between the two environments). Remember that your user will also need the SAP standard role SAP_SFIN_CASH_MANAGER assigned in DEV-210.

You will <Select File> and always play it safe by doing an "Import with Test Run". This way it will not import, but it will check the data and tell you if there are any errors.

If you do not have errors in your newly created Bank Accounts, then import with the “overwrite” option activated. There might be some errors in other accounts that come as standard; ignore those.


Then you will get results like this after the IMPORT.


Once you have finished your XML IMPORT, your Bank Account IDs will be created. To verify this, you can check the newly created entries in Table T012K.

** Remember that this XML Import process needs to be repeated in every client and environment; therefore you should include it as part of your cut-over / client-preparation manual activities.


If your Company and/or Project needs to implement this, or any of the functionalities described in my Blog, or advise about them, do not hesitate to reach out to me and I will be happy to provide you my services.