There are times when you want to model an infoprovider without actually storing any data in it. Well, for that, the virtual remote cube comes in handy. This 'beast' acts like an infocube and feels like an infocube, but is not an infocube. You can create this virtual infoprovider with the same structure as an infocube, with dimensions, characteristics and key figures. But how the data for this cube actually gets generated is, well, of your own making.
You can choose to create a virtual provider with 'direct access' to R/3. This means that when you view the data in this virtual provider, a remote function call goes directly to R/3 to get the values on the fly. You can also choose to get the values from a function module of your own choosing, so the implementation details are really up to you. Now that's really flexible, isn't it?
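To make the function-module option a bit more concrete, here is a rough skeleton of what such a data-read function module looks like. The parameter names follow SAP's documented template for VirtualProviders based on function modules, but treat the whole thing as an illustrative sketch and compare it against the template in your own system:

```abap
FUNCTION z_my_virtual_read.
*"  IMPORTING
*"    VALUE(I_INFOPROV)    TYPE RSINFOPROV     " name of the VirtualProvider
*"    VALUE(I_TH_SFC)      TYPE RSDRI_TH_SFC   " requested characteristics
*"    VALUE(I_TH_SFK)      TYPE RSDRI_TH_SFK   " requested key figures
*"    VALUE(I_T_RANGE)     TYPE RSDRI_T_RANGE  " selection criteria
*"  EXPORTING
*"    VALUE(E_T_DATA)      TYPE STANDARD TABLE " result records
*"    VALUE(E_END_OF_DATA) TYPE RS_BOOL        " 'X' when no more data

* Here you would typically translate I_T_RANGE into selections,
* make an RFC call to R/3 (or read any other source), append one
* line per record to E_T_DATA, and finally flag the end of data.
  e_end_of_data = 'X'.
ENDFUNCTION.
```

Since this gets called on every query navigation step, keep the selections you pass along as narrow as possible.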
So how do you create this? In BI7, just right-click on any infoarea and choose create 'virtual provider'. For direct access, you choose Virtual Provider with 'direct access'. There is an option to use the old R/3 infosource. What this means is that you can create an infosource with transfer rules and such, and when you create the virtual infoprovider, it will use the structure in the infosource automatically; the flow of data automatically links R/3, the transfer rules and the virtual infoprovider. Note that no update rules exist with this kind of setup.
But in BI7, you have a more flexible approach in that you can create a transformation to configure the 'transfer logic' of your virtual infoprovider, along with a start routine, end routine or any other technique available in a transformation rule.
Using the BI7 setup, though, you need to create a sort of 'pseudo' DTP, which doesn't actually do anything, meaning you never 'execute' it to start a data transfer.
After all is done, you need to right-click on the virtual infoprovider and choose 'Activate Direct Access'. If you use an infosource, go to the Infosource tab and choose the infosource. If you're using the BI7 setup, choose the DTP related to the transformation and save it.
Now your virtual provider is ready to be used!
A word of advice though: a virtual infoprovider with direct access is really slow if you have a lot of records travelling across the remote function calls. So, tread wisely!
Saturday, December 20, 2008
Sunday, December 07, 2008
BW IDOC Status
Sometimes, when we load data to BW, it's important to know the IDoc status of the loads. There are times when a load is stuck, and it's good to know the reason why.
Anyway, here's a list of some of the known IDOC status:
00 Not used, only R/2
01 IDoc generated
02 Error passing data to port
03 Data passed to port OK
04 Error within control information of EDI subsystem
05 Error during translation
06 Translation OK
07 Error during syntax check
08 Syntax check OK
09 Error during interchange handling
10 Interchange handling OK
11 Error during dispatch
12 Dispatch OK
13 Retransmission OK
14 Interchange Acknowledgement positive
15 Interchange Acknowledgement negative
16 Functional Acknowledgement positive
17 Functional Acknowledgement negative
18 Triggering EDI subsystem OK
19 Data transfer for test OK
20 Error triggering EDI subsystem
21 Error passing data for test
22 Dispatch OK, acknowledgement still due
23 Error during retransmission
24 Control information of EDI subsystem OK
25 Processing despite syntax error (outbound)
26 Error during syntax check of IDoc (outbound)
27 Error in dispatch level (ALE service)
28 Not used
29 Error in ALE service
30 IDoc ready for dispatch (ALE service)
31 Error - no further processing
32 IDoc was edited
33 Original of an IDoc which was edited
34 Error in control record of IDoc
35 IDoc reloaded from archive
36 Electronic signature not performed (timeout)
37 IDoc added incorrectly
38 IDoc archived
39 IDoc is in the target system (ALE service)
40 Application document not created in target system
41 Application document created in target system
42 IDoc was created by test transaction
50 IDoc added
51 Application document not posted
52 Application document not fully posted
53 Application document posted
54 Error during formal application check
55 Formal application check OK
56 IDoc with errors added
57 Test IDoc: Error during application check
58 IDoc copy from R/2 connection
59 Not used
60 Error during syntax check of IDoc (inbound)
61 Processing despite syntax error (inbound)
62 IDoc passed to application
63 Error passing IDoc to application
64 IDoc ready to be transferred to application
65 Error in ALE service
66 IDoc is waiting for predecessor IDoc (serialization)
67 Not used
68 Error - no further processing
69 IDoc was edited
70 Original of an IDoc which was edited
71 IDoc reloaded from archive
72 Not used, only R/2
73 IDoc archived
74 IDoc was created by test transaction
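These status values live in the STATUS field of the IDoc control table EDIDC, and you can browse them with transactions WE02/WE05 or BD87. As a quick illustrative check (the table and field names are standard, but treat the snippet as a sketch), you could count stuck inbound IDocs like this:

```abap
DATA lv_count TYPE i.

* Count inbound IDocs stuck in status 51
* (application document not posted)
SELECT COUNT(*) FROM edidc INTO lv_count
  WHERE status = '51'
    AND direct = '2'.        " 2 = inbound, 1 = outbound
WRITE: / 'IDocs in error:', lv_count.
```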
Thursday, November 13, 2008
Day of Last Period ABAP Function
Sometimes, in the course of your BW work, you will encounter a business requirement to get the last day of a fiscal period. This happens when you want to populate, say, the calday infoobject, but the datasource only contains the information in FISCPER format, like 002.2008 for February 2008, for example.
So how do you overcome this? How do you derive the last day of that period? Well, the easy way is to use this built-in ABAP function in your transformation rules or transfer rules:
CALL FUNCTION 'LAST_DAY_IN_PERIOD_GET'
  EXPORTING
    I_GJAHR        = your_fiscal_year     " fiscal year, e.g. '2008'
    I_MONMIT       = 00                   " mid-month indicator, not used here
    I_PERIV        = your_fiscal_variant  " fiscal year variant, e.g. 'K4'
    I_POPER        = your_fiscal_month    " posting period, 3 digits, e.g. '002'
  IMPORTING
    E_DATE         = output_last_day_of_period
  EXCEPTIONS
    INPUT_FALSE    = 1
    T009_NOTFOUND  = 2
    T009B_NOTFOUND = 3
    OTHERS         = 4.
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*   WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.

result = output_last_day_of_period.
So how do you use this function? Replace 'your_fiscal_year' and 'your_fiscal_month' with the fiscal year and fiscal month respectively; in this case, you can derive both from the FISCPER field of the datasource. Remember that 'your_fiscal_month' must be in 'financial' format, meaning its length is 3 and not 2 as usual. Remember also to populate 'your_fiscal_variant'; without it, the function will most likely return with an exception error.
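For instance, assuming the FISCPER value arrives from the datasource in the internal YYYYPPP format (e.g. '2008002' for period 002 of 2008; the variable names here are purely illustrative), the two inputs can be derived with simple offsets:

```abap
DATA: lv_fiscper TYPE n LENGTH 7 VALUE '2008002',
      lv_gjahr   TYPE n LENGTH 4,
      lv_poper   TYPE n LENGTH 3.

lv_gjahr = lv_fiscper(4).    " fiscal year     -> '2008'
lv_poper = lv_fiscper+4(3).  " posting period  -> '002'
```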
This function will return 'output_last_day_of_period', and you can use this value to populate your calday in the transformation rules / transfer rules as you like!
Presto! That's easy isn't it?
Tuesday, November 11, 2008
SAP BI 7 Features
It's been a while since my last post. Anyhow, today I would like to explain a bit about the so-called new features of SAP BI 7.0.
- New Transformation Rules - You heard that right, ladies and gentlemen, no more deciding whether this routine should be in update rules or transfer rules, now you can lump all those into one transformation rules which combine the best of both worlds. With the new 'end routine' and 'expert routine' features, you can further customize how data transformation is done.
- New DTP (Data Transfer Process) - This is an additional layer on top of the 'normal' infopackage. Now the infopackage moves data to the PSA level only. You then use this DTP thingy to move the data further into BW data targets, like infocubes, ODS objects (they call these DSOs now) or infoobjects. This is good in that you can debug more, but it adds an extra layer, especially for loading master data or texts, which in my opinion worked better with the 'direct update' thingy in the previous version. But maybe SAP's direction is 'standardization', so no more differentiation between flexible update rules and direct update, I guess.
- New Remodelling Tool - So far I haven't used this yet, but they say, you can easily modify an existing infocube structure (with data in it) using this tool. Whether it will affect the data integrity of the infocube is another question.
Labels:
bi7
Monday, September 25, 2006
Dang Users
Dang Users! After many months, our users finally decided not to use SEM-BCS.
Reason stated, it's too complicated, and too inflexible!
Oh well, time to move on to other things.
Saturday, April 01, 2006
SAP BCS - The experience...
Recently, we did some preparation for some SAP SEM BCS prototyping. All I can say is: wow. A lot of detailed accounting setup needs to be made on paper even before we can do the necessary configuration.
So how do you start? Well first, you need to prepare an excel worksheet, with the necessary consolidation simulation data. You need to create a T-account thingy, balance the account and such.
Using this excel worksheet, we start configuring the BCS system. First (like in my previous post), you configure the master data, FS items (financial statement items) and such in the Consolidation Workbench (UCWB).
After that, you use the Consolidation Monitor to enter your Investment and Equity data (AFD - Additional Financial Data). This includes the percentage of ownership, etc., of the various companies (consolidation units).
After completing the exercise, we use the Flexible Upload Method, to load data into the BCS infocube. Mind you, this flexible upload is slightly different than the one we used in BW. In fact, I think it's more flexible and much easier compared to the one in the BW Infopackage.
Why? Well first, instead of configuring the mapping of the flat files via Transfer Structure, there's a much better interface in the Consolidation Workbench under Consolidation Functions -> Flexible Upload. Here, you can define the header structure of the flatfiles, and the data rows.
You can set the comment character, in this case, I use the default *. By having comments in your flat files, you can create a standard template that your users can understand and easily use.
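For example, a totals upload file using '*' as the comment character might look something like this (the column layout is purely illustrative; yours depends on the header and data rows you defined):

```text
* BCS totals upload - lines starting with * are ignored
* FS item;Cons unit;Period;Value
100100;C1000;002;125000.00
200200;C1000;002;125000.00-
```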
Now, back to the consolidation monitor, we upload the Total Records Data via Flexible Upload for each company (consolidation unit). After doing this, we run validation to see, whether the Reporting Financial Data is 'balanced' and such.
Well that's the current step that we've gone through. I'll update more when we progress to the other stages of SEM BCS.
Labels:
sem-bcs
Friday, March 17, 2006
SAP BW Authorization
SAP BW authorization is definitely different from R/3 authorization. Why? Well, first, R/3 authorization usually goes down to the transaction code level. But in SAP BW, the most-used transactions are just "RSA1" and "RRMX". Therefore, authorization based on transaction codes alone is definitely not sufficient.
So how do we design authorization in SAP BW? There are a few authorization objects that relate to SAP BW.
For reporting, you will most probably use the following SAP BW authorization objects:
- S_RS_COMP - Reporting component; this is where you control query authorization.
- S_RS_COMP1 - Reporting component owner; here you can restrict users to accessing only reports created by power users.
- S_RS_FOLD - Disable/Enable the 'InfoAreas' button.
- S_RS_ICUBE - Infocube authorization
- S_RS_ODSO - ODS Objects
- S_RS_HIER - Hierarchy Authorization
- S_RS_ADMWB - Administrator Workbench
- S_RS_IOBJ - Info Objects authorization
- S_RS_ISOUR - Transaction data Infosource
- S_RS_ISRCM - Master Data Infosource
There, that's what you need for authorization. Anyway, to achieve "field level" authorization like in R/3, you can create a custom authorization object, select an infoobject that has been flagged "authorization relevant", add it to the authorization matrix, and voilà, you have "field level" authorization.
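If you ever need to test one of these objects in your own ABAP (in a user exit, for example), a standard AUTHORITY-CHECK does the job. The field names below are those of S_RS_COMP, and the InfoArea/InfoCube/query names are made up, so verify everything in transaction SU21 for your release first:

```abap
AUTHORITY-CHECK OBJECT 'S_RS_COMP'
  ID 'RSINFOAREA' FIELD 'MY_AREA'    " hypothetical InfoArea
  ID 'RSINFOCUBE' FIELD 'MY_CUBE'    " hypothetical InfoCube
  ID 'RSZCOMPTP'  FIELD 'REP'        " component type: query
  ID 'RSZCOMPID'  FIELD 'MY_QUERY'   " hypothetical query name
  ID 'ACTVT'      FIELD '16'.        " activity 16 = execute
IF sy-subrc <> 0.
  MESSAGE 'Not authorized for this query' TYPE 'E'.
ENDIF.
```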
Labels:
authorization