Channel: SCN : All Content - SAP Business Warehouse
Viewing all 5981 articles
Browse latest View live

Delete rows from fact table


Hi,

 

I've done a selective deletion from cube X for years before 2010 (this InfoCube has data since 2003). I want to reduce the number of records in the InfoCube, but after the selective deletion the fact table still has the same number of rows (more than 47 million records; I checked the InfoCube size using program SAP_INFOCUBE_DESIGNS).

 

Why is the number of rows in the fact table not reduced? How can I reduce the number of records in order to improve query access to the InfoCube?

 

thanks in advance,

 

regards,


Basics of Cube Aggregates and Data Rollup


What are Cube Aggregates?

  Definition

An aggregate is a materialized, summarized, and condensed view of the data in an Info Cube. An aggregate stores the dataset of an Info Cube redundantly and persistently.

  • "Summarized and condensed view" refers to the condensing of the fact table of an Info Cube into an aggregate table.
  • An aggregate table no longer contains certain characteristics of the Info Cube and has been condensed across attributes.

 

When Do We Create an Aggregate on a Cube?

The basic purpose of using aggregates is to make data retrieval faster.

When we access data frequently for reporting and have a huge amount of it, retrieval takes more time. If a query is frequently used for reporting and we want a performance enhancement, we build aggregates on the Info Cube.

  • Aggregation makes data condensed and summarized, so you can access the data of an Info Cube quickly when reporting.
  • New data is loaded into an aggregate at regular intervals (a defined time) using logical data packages (requests). After the roll-up, the new data is available for reporting.
  • Aggregates are used when we often use navigation attributes in queries, or when we want aggregation up to specific hierarchy levels for characteristic hierarchies. Both time-dependent attributes and time-dependent hierarchies can be used in aggregates.
  • Note:
  1. To find queries that are used frequently and take a long time, check table RSDDSTAT_OLAP (transaction SE11).
  2. You can use table RSDCUBE to determine the Info Cube assigned to an aggregate via its six-digit aggregate ID.
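The "condensed view" idea above can be sketched as a simple group-by summation. A minimal Python illustration (the fact rows and characteristic positions are made up for this example, not taken from a real cube):

```python
from collections import defaultdict

# Hypothetical fact rows: (material, plant, calmonth, amount)
fact_rows = [
    ("M1", "P1", "2010.01", 100.0),
    ("M1", "P2", "2010.01", 50.0),
    ("M1", "P1", "2010.01", 25.0),
    ("M2", "P1", "2010.02", 10.0),
]

def aggregate(rows, keep=(0, 2)):
    """Condense rows down to the kept characteristics, summing the key figure."""
    totals = defaultdict(float)
    for row in rows:
        key = tuple(row[i] for i in keep)  # drop the other characteristics
        totals[key] += row[-1]             # sum the key figure
    return dict(totals)

# Aggregating by (material, calmonth) drops plant: 4 fact rows condense to 2.
print(aggregate(fact_rows))
```

This is exactly what the aggregate table does for queries that never ask for the dropped characteristic: fewer rows to read, same totals.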

Prerequisites

  • The Info Cube for which we are creating an aggregate must be active, and there must not already be an aggregate with the same set of characteristics for that Info Cube; every aggregate must be unique.
  • If you have created aggregates for an Info Cube and filled them with data, the OLAP processor accesses these aggregates automatically. The results remain consistent while navigating; the aggregate is transparent to the end user.
  • If you want the system to propose aggregates, you must first create at least one query for the selected Info Cube. The necessary aggregates can then be proposed when you start the queries and navigate in them.

Steps to create Aggregate

  • Go to transaction RSA1, select the Info Cube in which you want to create an aggregate, and choose the Maintain Aggregates option.

[Screenshot]

  • The first time we create an aggregate for an Info Cube, the system asks for the type of aggregate.

[Screenshot]

a. Generate proposals

  • The system proposes suitable aggregates; the Specify Statistics Data Evaluation dialog box appears.
  • Enter the run time, from-date, and to-date details, and choose Next.
  • This brings you to the Maintain Aggregates screen. The system displays the proposed aggregates in the right-hand area of the screen.

b. Create yourself

  • This will bring you to the Maintain Aggregate screen.

[Screenshot]

  • Note: Aggregates are always built on characteristics, not on key figures.
  • Drag all characteristics you want to include in the aggregate to the right-hand window pane; you can also add them one by one. The following screens will appear.

[Screenshots]

  • If you choose to activate the aggregate later, the following screen appears, where you can set the date and time as required.

[Screenshot]

  • After activation, the aggregate shows details such as the records available for aggregation, the summarized record count, the last roll-up, and the last use (in a query).

[Screenshot]

This screen also gives you the following information:

  • The hierarchy and hierarchy-level fields are used for aggregates on hierarchies.
  • The Valuation column shows '–' and '+' signs:
  1. The more minus signs, the worse the evaluation of the aggregate; "-----" means the aggregate can probably be deleted.
  2. The more plus signs, the better the evaluation of the aggregate; "+++++" means the aggregate could make a lot of sense.
  • Records shows the number of records in the filled aggregate (the size of the aggregate).
  • Records Summarized (mean value) shows how many records were read from the source to create one record in the aggregate. This indicates the quality of the aggregate: a larger value means better compression (better quality). If it is 1, the aggregate is a copy of the Info Cube and should be deleted.
  • Usage shows how often the aggregate has been used for reporting in queries.
  • Last Used shows when the aggregate was last used for reporting. If an aggregate has not been used for a long time, it should be deactivated or deleted.
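The "Records Summarized (mean value)" metric is simply the ratio of source records to aggregate records. A rough Python sketch of this valuation idea (the thresholds and ratings here are illustrative only, not SAP's actual scoring rules):

```python
def summarization_ratio(source_records, aggregate_records):
    """Mean number of source records condensed into one aggregate record."""
    return source_records / aggregate_records

def evaluate(ratio):
    """Illustrative valuation: a ratio of 1 means the aggregate just copies the cube."""
    if ratio <= 1:
        return "-----"  # copy of the Info Cube: candidate for deletion
    if ratio < 10:
        return "+"      # modest compression
    return "+++++"      # strong compression: the aggregate makes a lot of sense

# E.g. a 47-million-row fact table condensed to 250,000 aggregate rows:
ratio = summarization_ratio(47_000_000, 250_000)
print(ratio, evaluate(ratio))
```

The larger the ratio, the fewer rows a query against the aggregate has to read compared to the base fact table.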

To increase load performance, you can follow these guidelines:

1. Delete indexes before loading; this accelerates the loading process.
2. Consider increasing the data packet size.
3. Check the unique data records indicator if only unique records are to be loaded into the DSO.
4. Uncheck the BEx reporting checkbox if the DSO is not used for reporting.
5. If you use ABAP code in routines, optimize it; this improves load performance.
6. Uncheck the SID generation checkbox.
7. Write-optimized DSOs are recommended for large sets of data records, since there is no SID generation for write-optimized DSOs; this improves performance during data load.


Steps before using an Aggregate

  • To use an aggregate for an Info Cube when executing a query, we must first activate it and then fill it with data.
  • To use an existing aggregate, select the aggregate you want to activate and choose Activate and Fill; the system creates an active version of the aggregate.
  • Once the aggregate is active, you must trigger the action that fills the aggregate with data.
  • An active aggregate that is filled with data can be used for reporting: if the aggregate contains data that a query needs to evaluate, the query data automatically comes from the aggregate.
  • When we create a new aggregate and activate it, the initial filling of the aggregate table happens automatically.

     

Rolling Up Data into an Aggregate

a. ROLL UP

  • If new data packages or requests are loaded into the Info Cube, they are not immediately available for reporting via an aggregate. To provide the aggregate with the new data from the Info Cube, we need to load the data into the aggregate tables at a time we can set. This process is known as ROLL UP.

       

b. Steps for rolling up new requests

  • In the Info Cube maintenance you can set, per Info Cube, how the data packages should be rolled up into the aggregate.
  • In the context menu of the required Info Cube, select Manage. The InfoProvider administration window appears; on the Manage Data Targets screen, select the Rollup tab.

[Screenshot]

  • Here, via the selection button, you can set the date and time of the roll-up.

[Screenshot]

  • You can raise an event after aggregation and also create a process to run the job periodically. A specific request can be rolled up by providing its request ID.
  • Requests can also be rolled up based on a number of days.

[Screenshot]

  • After the roll-up, you can see the check mark in the Manage tab. Selecting a request for roll-up also rolls up all previous requests loaded before it, but not newer ones.

[Screenshot]

Levels of Aggregation

  • The aggregation level indicates the degree of detail to which the data of the underlying Info Cube is compressed. A level must be assigned to each component of an aggregate (characteristics, navigation attributes, hierarchies).

The aggregation level can be one of the following; by default, the system aggregates according to the values of the selected objects:

  • '*' All characteristic values
  • 'H' Hierarchy level
  • 'F' Fixed value

               

Thank you for reading this blog. Please add your comments.


Reference(s)

1. http://help.sap.com/saphelp_nw04/helpdata/en/7d/eb683cc5e8ca68e10000000a114084/frameset.htm
2. http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6791e07211d2acb80000e829fbfe/content.htm
3. http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/906de68a-0a28-2c10-398b-ef0e11ef2022?QuickLink=index&overridelayout=true

Standard DataSource which extracts data from KONP


Hi All,

Our client needs a new condition type for shipment prices. They will then enter prices via t-code TK11.

I need to extract the price information for this condition from ECC to BW. I see that I can find the related information in the KONP table.

Is there a standard DataSource which extracts data from KONP? I have already searched in transaction LBWE but couldn't find any DataSource that extracts from KONP.

Thanks,

Oya.

R3TRXXXXXX was not imported in this step (transport error RC=12)


Hi all,

We are trying to import our development (ABAP Dictionary) objects into a new BW system, i.e. the objects we are importing do not yet exist in the target (BW) system.

We are encountering this error:

[Screenshot: transport error]

We are using these import options:

[Screenshot: import options]

[Screenshot]

Are there any notes for this one?

Kind Regards,

Jose Mich

              Customer Exit From 0calmonth to 0calday.


Hi All,

I have created a customer exit for 0CALDAY and a manual-input variable for 0CALMONTH.

My requirement: if I enter a 0CALMONTH value, e.g. 11.2010, as a parameter, then I should get the data for 26.10.2010 to 25.11.2010, i.e. from the 26th of the previous month to the 25th of the entered month.

Is this possible? Are there any standard function modules for this kind of combination? If so, how do I proceed? I tried a few combinations, but they did not work.

I am working on BW 7.3.

Thanks.
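For reference, the requested 26th-of-previous-month to 25th-of-entered-month window can be computed like this; a Python sketch of the exit logic only (the real customer exit would be ABAP, typically in the BW variable exit such as EXIT_SAPLRRS0_001, and the 'MM.YYYY' input format is assumed here):

```python
import datetime

def custom_period(calmonth: str) -> tuple:
    """Given a 0CALMONTH-style input 'MM.YYYY', return the date interval
    (26th of the previous month, 25th of the entered month)."""
    month, year = (int(part) for part in calmonth.split("."))
    # Previous month, rolling the year back over the January boundary.
    prev_month, prev_year = (12, year - 1) if month == 1 else (month - 1, year)
    return (datetime.date(prev_year, prev_month, 26),
            datetime.date(year, month, 25))

print(custom_period("11.2010"))  # 26.10.2010 .. 25.11.2010
```

In the actual exit the two dates would be returned as the LOW and HIGH of an interval value for the 0CALDAY variable.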

Key figure value gets doubled from ODS to cube


Hi all,

I have the ODS 0FC_DS05, which is item-detailed. On top of this ODS I have a custom cube, and there is no routine in the transformation. Each time, I delete the data in the ODS and the cube and then do a full update for both. But for some documents and items (not all), the value of the amount key figure gets doubled. There are many threads about this case, but none of the ones I looked at solved my problem.

What might be the reason? When I extract only the documents causing the problem, I see that the number of records is as expected; only the value appears doubled.

Logic to populate request extraction date in DSO


Hi,

I want to populate, in the DSO, the date on which a particular request was extracted from ECC; that is, I want to populate the InfoPackage request extraction date in a DSO.

Kindly provide me the logic for this.

Thanks,

Neetu

I added 3 characteristics in an ODS (still in a 3.x data flow, but the server is upgraded to 7.3), and the 3 fields are populated from another DSO


Hi Friends,

In an ODS I added 3 fields (this ODS contains a huge number of records in production). The ODS is in a 3.x data flow, but the BW server has been upgraded to 7.3.

The 3 additional characteristics are populated from a lookup DSO by an update routine.

Ideally, before transporting the changes into production, we would have to delete the data from the ODS and then import the request.

But here there are 70,00,00,000 (700 million) records, so deletion would take a long time, and importing the request as-is may bring the system down; that is why we are worried.

I want to import the request without deleting the data in the ODS (management's request).

Please advise how to import the request while keeping the data, or give me an alternative solution.

Thanks & Regards,

Srinu.Rapolu


              Data Load Performance


Hi

We're having trouble getting large volumes of data into some of our cubes.

One of our cubes (Listing) takes around 1h45 to load about 20 million records. Another cube (Vendor) takes around 0h30 to load 36 million records.

Most of the time, in both scenarios, is taken up by SID generation ... is there a way to speed this up?

In the Listing scenario, the load to PSA has been split into about 15 InfoPackages. The total time to run all of those InfoPackages (in 3 parallel streams) is about 15 minutes.

In the Vendor scenario, a single InfoPackage is used to load to the PSA, and it takes around 2 hours.

The following picture shows one of the DTP requests for the Listing scenario:

[Screenshot: Listing DTP]

In this example, data package 8 took 3:16, of which 2:29 was the SID step.

Not sure why this takes so long, or whether there's anything that can be done to speed it up.

A similar screenshot of the Vendor DTP:

[Screenshot: Vendor DTP, early requests]

That looks like similar performance to the Listing DTP. However, when I look at the later requests in the Vendor DTP, they go a whole lot quicker towards the end:

[Screenshot: Vendor DTP, later requests]

I guess my questions are:

1. How do I speed up the SID generation?

2. Why does SID generation get quicker as more and more records are processed, and why does that not apply in the Listing scenario, where there are multiple PSA requests to load?

3. Why would a single InfoPackage take so much longer (36 million records / 1.5 hours) than smaller packages running in parallel (20 million records / 15 minutes)? This one I can sort of understand, in that multiple smaller packets are better, but it then seems to hurt the DTP performance.

Cheers,

Andrew

              0BWTCT_PPM Planning Process Management (PPM)


Hi everybody,

There is some technical Business Content for Integrated Planning in InfoArea "0BWTCT_PPM - Planning Process Management (PPM)".

How can this be used? Is it comparable to the "Status & Tracking System (STS)" in BPS?

Unfortunately, I've found nothing in SAP Help or the SAP Support Portal.

Thanks in advance for enlightening me.

Kind regards,

Chris

              Report


Hi

Can anyone help with this?

I have different accounts with different condition types Z1, Z2, Z3, Z4.

I want a report showing the accounts which do not have condition type Z1.

Account1 has Z1, Z2, Z3, Z4
Account2 has Z2, Z3, Z4
Account3 has Z1, Z2, Z3, Z4
Account4 has Z2, Z3, Z4, Z5

The report should display:

Account2
Account4

Thanks
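The selection logic here is a simple set filter; a minimal Python sketch using the sample data above (note that, given that data, Account4 also lacks Z1, while Account3 has it). In a BEx query this is usually modelled differently, e.g. with a counter key figure on Z1 and a condition "equal to 0", but the set logic is the same:

```python
# Sample data from the question: each account and its condition types.
accounts = {
    "Account1": {"Z1", "Z2", "Z3", "Z4"},
    "Account2": {"Z2", "Z3", "Z4"},
    "Account3": {"Z1", "Z2", "Z3", "Z4"},
    "Account4": {"Z2", "Z3", "Z4", "Z5"},
}

def without_condition(accounts, cond="Z1"):
    """Return the accounts that do not carry the given condition type."""
    return sorted(acct for acct, conds in accounts.items() if cond not in conds)

print(without_condition(accounts))  # ['Account2', 'Account4']
```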

              Data uploading error


Dear Experts,

I am missing some steps in creating the BW & R/3 evaluation servers for upgrade testing. I am getting the error "IDocs added error", as given in the screenshot.

Regards,

Anand Mehrotra.

DSO activation fails: "Characteristic value is not MATN1-converted"


Hi Experts,

We are facing an error in DSO activation. Activation fails for a couple of materials, and the exact error message it throws is:

Characteristic value '21062868' of characteristic 0MATERIAL is not MATN1-converted.

We checked the conversion routine and could not find any issues there. We also checked the SID table for 0MATERIAL and found that SIDs are being generated for these materials, yet it still throws the error.

We have checked this material in the R/3 material master (the source for this DSO) and found that it exists there.

Please help to resolve this.

Thanks.

Nayab
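For context: MATN1 is the material-number conversion exit, whose input conversion pads purely numeric material numbers with leading zeros to the full field length (the real logic lives in function module CONVERSION_EXIT_MATN1_INPUT; the 18-character length used below is the classic default, and the sketch ignores MATN1 lexical templates). A value like '21062868' stored without that padding is exactly what triggers the "not MATN1-converted" error. A rough Python illustration:

```python
MATNR_LENGTH = 18  # classic default material number length (assumption)

def matn1_input(value: str) -> str:
    """Sketch of MATN1-style input conversion: pad purely numeric
    material numbers with leading zeros; leave others unpadded."""
    value = value.strip()
    if value.isdigit():
        return value.zfill(MATNR_LENGTH)  # internal, zero-padded format
    return value.upper()                  # non-numeric materials: upper-case, no padding

print(matn1_input("21062868"))  # '000000000021062868'
```

So the internally stored value should be the zero-padded form; an unpadded '21062868' in the DSO indicates the conversion was skipped somewhere in the load.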

How to enhance the 0PCP_RES_TEXT DataSource


Dear Gurus,

How do I enhance the 0PCP_RES_TEXT DataSource?

If the item category is 'M', populate the text from the MARA table, and if the item category is 'E', populate the text from the activity type (CSLT) table.

Please help

DataSource transports in BW


Hi All,

I moved a transport from ECC Dev to ECC Quality successfully, and after that I replicated the DataSource from ECC Q to BW Q, which went fine. But after that I was unable to activate the DataSource in BW Q: the activation icon was disabled, though I could manage it by using a program. I just need to know: is this normal behaviour?

Next I plan to move the BW Dev DataSource to BW Q; will this overwrite the active version, and does this approach work? I have further transports for the cube, transformations, InfoPackages, and DTPs.

Regarding the objects there is no problem; they all exist in all environments.

I am using BW 7.3 and am fairly new to transports.

Thanks


              Last Delta Incorrect, No new Delta update possible.


Hi Experts,

While loading data into the PSA for 0OGL_ACCOUNT_ATTR, I get the below error message when I trigger the InfoPackage. I am on BI 7.3.

[Screenshot: error message]

In RSRV, I get the below message:

[Screenshot: RSRV message]

There was a red request in PSA Manage; I deleted it, and the problem has been occurring since then. Please guide me on how to fix this.

Thanks,

Saravanan.

              How to install the object which is not present in BI Content


Hi all,

I tried to install InfoObject '0AD_AUTHID' from BI Content. Later I realised that the particular object I am looking for is not present in BI Content. I need to install this object in order to go ahead with my project. Could anyone please explain how I can bring this object into BI Content and then install it?

Thanks,

Prince.

              Insert a query in a Role Menu


Dear Gurus,

I'm trying to save a query (not a workbook) in a role menu, but I can't do it. This is how I'm proceeding:

I go to PFCG - Menu tab - I create a folder - I insert the technical name of the report that I want to save in the role menu.

(In fact, I want to save it in an existing folder called SAP_BW_USER_MENU, but I cannot see it in the Menu tab.)

When I open BEx Analyzer and click Roles, I cannot see the folder that I have created and, of course, I cannot see the query.

Can anyone help with this?

Many thanks in advance.

Regards,

Jolgorio

              Display and Retrieve only Overall Results in Bex Query


Dear SAP BI members,

My project uses a BO dashboard for reporting, and its source is a BW BEx query. The structure of the query is such that Calendar Year/Week is in the columns, defined with an offset to display the last 12 weeks of data, and there are 5 key figures plus 1 characteristic in the rows. The overall result row is displayed at the top and is calculated as "Counter of all values".

The query output generates a lot of cells and gives the error: "Result set too large (606762 cells); data retrieval restricted by configuration (maximum = 500000 cells)". Unfortunately I cannot change the BO part; I have to solve this in the BW query only.

Another point to note: BO uses only the overall result row, which is configured to display at the top of the report. So BO is interested only in the top 3 rows (calendar week, key figure description, overall result), and the rest of the data is irrelevant to it. I therefore need a way to display only the overall results for the query and somehow hide the rest of the row/column data.

I also tried the option "Calculate single value as - Hide" for all key figures, but it still generates the same number of empty cells and gives the same error, so that option did not work for me. Is there any other way to solve this problem? Your help is very much appreciated. Thanks.

[Screenshot]

-Abhijit

              Error in DSO 0CRM_IBCO activation


Hello Experts,

I am facing the below error while activating the contents of DSO 0CRM_IBCO:

Value '!5THROYAPPA ROAD, 1ST FLOOR,' (hex. '210035005400480052004F005900410050005000410020.

We have written an end routine in the transformation to remove special characters like '!'. But the problem I observed while debugging is that the system does not accept the entire string: I get a blank screen in the debugger, which rules out removing the special characters there. Can anybody help me here? I suspect that because '!' is in the first position of the string, the system is unable to recognise the string.

-Mandar.
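For reference, BW rejects values containing characters outside the permitted set (maintained via transaction RSKC), and '!' in the first position is a known offender. The end-routine cleanup can be sketched like this in Python; the allowed-character set below is purely illustrative, not the real RSKC configuration, and the actual routine would of course be ABAP:

```python
# Illustrative allowed set; the real one comes from the RSKC settings.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 ,.-/")

def clean_value(value: str) -> str:
    """Drop characters outside the allowed set (BW additionally
    forbids '!' in the first position of a value)."""
    return "".join(ch for ch in value.upper() if ch in ALLOWED)

print(clean_value("!5THROYAPPA ROAD, 1ST FLOOR,"))  # '5THROYAPPA ROAD, 1ST FLOOR,'
```

Applying such a filter in the end routine, before activation, would strip the leading '!' that appears to cause the activation failure.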




