Friday, September 13, 2019

MDX Insert Intermission

I have not created any posts about MDX Insert lately, as I have hit a snag in my research on the CUSTOMCALCANDALLOCTHRUINSERT Essbase config file setting. In theory, this setting takes an existing ASO procedural calc and automatically runs it as an MDX Insert, which would be great for customers with a large number of ASO procedural calcs. I get the impression this will work once Essbase 19c comes out, so I'll wait until I have better information before doing a full write-up on the topic.

In the meantime I can report some really great news about MDX Insert and my hope for it becoming the next "big thing" for Essbase.

I was working with a colleague on an ASO procedural calc. He was using 11.1.2.3.500 and ran into the "query exceeds 2^64" error which kills the calc. This was due to the large number of combinations in the POV.

The same calc in 11.1.2.4.031 doesn't give an error but it also never finishes. I think a lot of people were turned off to ASO procedural calcs due to this issue.

The great news is that the same calc that died previously, when translated to MDX Insert, took a grand total of two seconds to complete in OAC. Two frikkin' seconds!

I'm seeking other outlines to try this on. If you've got one I can use for testing, drop me a line.

Also, I'll be at Oracle Open World 19 in San Francisco next week. I hope to see some of you there.

Tuesday, August 6, 2019

MDX Insert Part Three



In Part Two of this series I showed how to perform a simple MDX Insert, copying a single cell of data in an ASO cube. In this post I will show how to create a slightly more complicated calculation.

Keep in mind that you need to be using Essbase in OAC for this to work. It will not work in Essbase 11.1.2.4 or prior.

WITH

I'm going to use the MDX 'With' statement in order to create a calculated member. The With statement comes first in an MDX script. You can use it to define an Essbase member on the fly. You can then use that member in other calculations or in the output of an MDX query. To keep things simple and easy to reproduce for those of you following along at home, I will recreate the Variance member in ASOSamp.Basic.

First I added a member in the outline called Stored Variance.



Next I created my MDX statement. The 'With' statement comes first. In this case I called my member on the fly _Variance. I put an underscore in front of members created on the fly so they won't conflict with any Essbase members and so I can quickly see what's being created dynamically in MDX. Of course, if you happen to use underscores to begin member names in your Essbase outline (heaven help you), find a different method.

The syntax here is hopefully straightforward. You define the member name AS some formula which goes in single quotes. Here I am subtracting [Prev Year] from [Curr Year].
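The clause ends up looking like this (the full script is in the appendix below):

```mdx
WITH
MEMBER Years.[_Variance] AS
  '[Curr Year]-[Prev Year]'
```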





Next comes the INSERT statement. Remember from part two that the syntax is "(source tuple)" TO "(target tuple)".



Then the SELECT statement is added, surrounded by parentheses. In this case I'm putting the _Variance member on the columns and the Crossjoin of all the level zero members from the remaining dimensions on the rows. Notice that I don't need to bother leaving out shared or dynamic members.
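Put together, the INSERT and SELECT portions look like this (abbreviated here; the full Crossjoin is in the appendix below):

```mdx
INSERT
  "(Years.[_Variance])" TO "(Years.[Stored Variance])"
INTO ASOSamp.Basic FROM
(
  SELECT
   {Years.[_Variance]} ON COLUMNS,
   {CROSSJOIN([Measures].LEVELS(0).MEMBERS, ...)} ON ROWS
  FROM ASOSamp.Basic
)
```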



Once I validate and save the script, I can run the job. To do that I go to the jobs screen in the Jet UI.




Then select New Job -> Run MDX.




Finally, I select the MDX script I want to run.



This calculation took 6.9 seconds to run with the standard ASOSamp.Basic data.


Why?

Why might I want to run a calculation like this? In this case it doesn't make a lot of sense. ASO should be able to handle a simple variance calculation with little effort. But what about a rate times volume calculation? Or a foreign currency calculation? Those get much trickier -- see why here.

The Old Way

You could run this calculation with the Execute Allocation MaxL statement. Here's the statement:
 execute allocation process on database ASOSamp.Basic with  
 pov "CROSSJOIN(Except([Measures].LEVELS(0).MEMBERS,{[Avg Units/Transaction],[% of Total]}),  
     CROSSJOIN(Descendants([MTD],LEVELS([Time],0)),    
     CROSSJOIN([Transaction Type].LEVELS(0).MEMBERS,   
     CROSSJOIN([Payment Type].LEVELS(0).MEMBERS,   
     CROSSJOIN([Promotions].LEVELS(0).MEMBERS,   
     CROSSJOIN([Age].LEVELS(0).MEMBERS,   
     CROSSJOIN([Income Level].LEVELS(0).MEMBERS,   
     CROSSJOIN([Products].LEVELS(0).MEMBERS,   
     CROSSJOIN([Stores].LEVELS(0).MEMBERS,[Geography].LEVELS(0).MEMBERS)))))))))"  
 amount "([Curr Year]-[Prev Year])"  
 target ""  
 range "{[Stored Variance]}"  
 spread;  

Notice that I had to add in an Except() statement to remove members with formulas. I also had to change the way I was pulling the Time dimension members because they too had some formulas. When I ran the script I got these warnings:

 [DBNAME: Basic] The argument [POV] ignores the duplicate member [Flat Panel]  
 [DBNAME: Basic] The argument [POV] ignores the duplicate member [HDTV]  
 [DBNAME: Basic] The argument [POV] ignores the duplicate member [Digital Recorders]  
 [DBNAME: Basic] The argument [POV] ignores the duplicate member [Notebooks]  

So I probably should have removed those shared members from the POV as well. What a pain...

This script took 7.9 seconds to run. I understand that one second is not a lot of time, but in this case the MDX Insert was about 15% faster than the Execute Allocation. I've always seen a performance improvement for MDX Insert when comparing the two methods. Over the course of many longer calculations, that improvement can be significant.



NON EMPTY

In case you're wondering, I did try adding a NON EMPTY directive before the rows set and it yielded no performance improvement.
 

Appendix

MDX Script for easy copy and paste:
 WITH  
 MEMBER Years.[_Variance] AS  
   '[Curr Year]-[Prev Year]'  
 INSERT  
   "(Years.[_Variance])" TO "(Years.[Stored Variance])"  
 INTO ASOSamp.Basic FROM   
 (  
   SELECT  
    {Years.[_Variance]} ON COLUMNS,  
    {CROSSJOIN([Measures].LEVELS(0).MEMBERS,  
     CROSSJOIN([Time].LEVELS(0).MEMBERS,    
     CROSSJOIN([Transaction Type].LEVELS(0).MEMBERS,   
     CROSSJOIN([Payment Type].LEVELS(0).MEMBERS,   
     CROSSJOIN([Promotions].LEVELS(0).MEMBERS,   
     CROSSJOIN([Age].LEVELS(0).MEMBERS,   
     CROSSJOIN([Income Level].LEVELS(0).MEMBERS,   
     CROSSJOIN([Products].LEVELS(0).MEMBERS,   
     CROSSJOIN([Stores].LEVELS(0).MEMBERS,[Geography].LEVELS(0).MEMBERS)))))))))} ON ROWS  
   FROM ASOSamp.Basic  
 )  





Monday, July 15, 2019

MDX Insert Part Two

In Part One of this series I explained why I think MDX Insert is one of the most important things to happen to Essbase in a while. Now I'm going to show you how to use it. I hope to build from very simple to very complex over the coming weeks. If you have interesting use cases, send them my way. The more input I get, the more interesting my examples will be.

Keep in mind that you need to be using Essbase in OAC for this to work. It will not work in Essbase 11.1.2.4 or prior.

Documentation

As far as I can tell this is the sum total of documentation on MDX Insert from Oracle:

Tech Reference Spec
MDXINSERTREQUESTTIMEOUT - sets the timeout for an MDX Insert statement
MDXINSERTBUFFERAGGMETHOD - sets whether values at the same intersection are added together or whether the last one calculated wins. Note that the documentation doesn't say if this is an ASO-only setting, so I will assume it isn't until I test it out myself.
Blog One
Blog Two

I'd like to point out that I haven't tried playing around with the two essbase.cfg settings above. In OAC it seems you set config settings at the application level, and these two settings are not available in the interface (as of 7/14/2019). You can telnet into the OAC server and update the essbase.cfg file; I plan to do that at some point.
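As a sketch of what that edit might look like (I haven't tested these: the timeout value is a placeholder, and the ADD keyword is my guess based on the description above, not confirmed syntax):

```text
MDXINSERTREQUESTTIMEOUT 1200
MDXINSERTBUFFERAGGMETHOD ADD
```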

Starting Simply

The syntax for MDX Insert is as follows:

INSERT

    source tuple TO target tuple

INTO

    App.Db

FROM

    (

       MDX Select statement goes here

    )


From a high-level point of view:

1. You are writing an MDX SELECT statement.
2. You're using that result to write data to another cell.

Without using an MDX With section, the MDX Insert is only good for copying data from one place to another in a database.

First Example


This example is from the ASOSamp.Basic cube, which is available for loading from the Application Gallery in OAC. (If you want to load the sample data, you will need to copy it from the gallery into the ASOSamp.Basic file folder and then perform a load job -- no load rule is necessary.)

In the MDX script editor in the OAC interface I created the following script.


Here I'm selecting a single cell specified by the following tuple on the rows:
([13118],[056098],[Digital Cameras],[Under 20,000],[20 to 25 Years],[No Promotion],[Cash],[Sale],[Jan],[Original Price])

On the columns I'm selecting [Curr Year]. The column tuple seems to be the source or left side of the equation. I map this into [Prev Year] in the INSERT section.
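In case the screenshot is hard to read, the script itself (repeated in the appendix below) is:

```mdx
INSERT "([Curr Year])" TO "([Prev Year])"
INTO [ASOSamp].[Basic]
FROM (
  SELECT
  {[Curr Year]} on COLUMNS,
  {([13118],[056098],[Digital Cameras],[Under 20,000],[20 to 25 Years],
   [No Promotion],[Cash],[Sale],[Jan],[Original Price])} on ROWS
  FROM ASOSamp.Basic
)
```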

I can run this query a number of different ways:
1. As a job in the OAC web interface.
2. From Smart View, under Manage & Execute Queries. Note that it does not show you the scripts available in ASO cubes; you'd need to copy and paste the MDX Insert statement into the query editor. What's interesting is that if you had created this MDX Insert statement in a BSO cube, Smart View would recognize it as a calc script. So if you were to click the Calculate button, it would show you the classic Essbase calc scripts AND any MDX Insert scripts you had created on the server. I think we need this functionality for ASO.
3. MaxL.
4. I assume the REST API (though I haven't tested it).
I don't see a way of using the CLI to run this.

Here's the before:


And the after:


What Does Not Work

From One Into Another

The way the syntax is written, it looks like you should be able to specify different databases for the INTO and SELECT statements. I tried; sadly, it didn't work. I mentioned this to someone at Oracle, who told me there weren't any immediate plans to add this functionality. It would be pretty cool, though. Imagine if you couldn't do that in SQL!

INSERT #MISSING

If I flip the MDX Insert statement above so that my SELECT has [Prev Year] and the INSERT section reads "([Prev Year])" TO "([Curr Year])", then wherever [Prev Year] is #MISSING, the [Curr Year] value will not be overwritten. This is the same behavior you would see when using the Execute Allocation or Execute Calculation commands with an ASO cube: a load of #MISSING into an ASO cube will not overwrite an existing value. In the past I've used logical clear statements ahead of ASO procedural calcs when I thought there might be an issue with overwriting values -- for example, when Actuals data needs to overwrite Forecast data.
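The logical clear approach mentioned above can be sketched in MaxL like this (the region expression here is a made-up example, not taken from the statement above):

```maxl
alter database ASOSamp.Basic clear data in region
  'CrossJoin({[Curr Year]}, {[Jan]})';
```

A logical clear writes an offsetting slice rather than removing cells; add the `physical` keyword at the end if you want the data physically removed.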

Under the Covers

I assume what is happening here is that the SELECT statement is generating a dataset. That dataset is then being loaded into the cube as a new slice. When I ran this calc an incremental slice was created. This is what the documentation says on the matter:
During execution of the MDX Insert query, an output buffer is created in memory which accumulates with values until the query is completed.

Appendix

MDX Script for easy copy and paste:
 INSERT "([Curr Year])" TO "([Prev Year])"  
 INTO [ASOSamp].[Basic]  
 FROM (  
   SELECT  
   {[Curr Year]} on COLUMNS,  
   {([13118],[056098],[Digital Cameras],[Under 20,000],[20 to 25 Years],  
    [No Promotion],[Cash],[Sale],[Jan],[Original Price])} on ROWS  
   FROM ASOSamp.Basic  
 )  

Monday, July 8, 2019

MDX Insert Part One


This is important. Really, really, really important. If you’re using Essbase in the cloud, you now have the capability of writing MDX Insert statements. (You'll get this capability on-prem with the 19c release.) You can write these against both ASO and BSO cubes. MDX Inserts are essentially calc scripts. I’m going to spend some time blogging about them in the coming weeks and months. I’ll start with a brief introduction today.

Let’s start with a history of procedural calculations in ASO.

Initially
When ASO was introduced it wasn't good for much. You were really limited in what you could use it for: basically loading a large amount of data and adding it up. You could add some formulas, but they were slow and couldn't be added outside of certain dimensions. If you wanted a procedural calculation done, you needed to make sure it happened before the data was loaded.

And then
I first started reading about ASO procedural calculations around 2011. I gave them a try in early 2012 and failed miserably. The documentation was difficult to understand and they just didn't seem to work for me. I had heard similar things from other consultants at the time so I gave up.

And then
I started working for a large retailer who had a BSO cube with a 3,000-line calc script for running allocations. The BSO cubes (they were partitioned, so there was one for each year) first needed to be aggregated, then the allocations were run, and finally the cubes were aggregated again. Total batch time was about four hours and it was killing us. This was a mission-critical cube, and some days it wasn't available until 10AM or later. So I started to investigate ASO procedural allocations again. I was encouraged by my manager at the time to try them out on this beast. It took a couple of weeks, but I got to the point where I had a proof of concept and had worked through most of the technical hurdles; dealing with dynamic and shared members was among the most challenging. Then it was just a matter of spending a couple of weeks heads-down translating the calculations from BSO to ASO. I used the "Execute Allocation" and "Execute Calculation" commands. I've blogged about these in the past and have presented at a few Kscope conferences on how to use them.

After the dust settled, I had taken a four-hour batch process across several BSO cubes and moved it to a single ASO cube that took 10 minutes to run from start to finish. On top of that, the users could get these allocations run during the day in less than two minutes. I've been around Essbase 21 years and I have never seen such performance gains. The users were amazed.

I'm disappointed when I hear people who are hesitant to try using ASO. I've worked in this space a long time and I've gotten to the point where there is almost no use case where I'd choose BSO over ASO. ASO is so much more scalable. You just need to learn MDX.

And finally
The MDX Insert statement makes life so much simpler. You can use the same language against an ASO and BSO cube. You don't have to worry about dealing with dynamic and shared members the way you did in Execute Calculation and Execute Allocation scripts. You can run them in Smart View. You can write complex calculations. You can write simple calculations. In my initial performance tests, I saw a 30% improvement compared to Execute Calculation and Execute Allocation. The real test, which I intend to do, is to compare BSO and ASO procedural calc performance.
  
This is the next big step for Essbase. It makes it a more mature and "database-like" technology. Essbase now resides within the database group at Oracle. This is starting to make sense. Moving toward a more standard, less proprietary language is definitely a move in the right direction.

And next
I'm going to cover several examples using MDX in the coming weeks. I'll start off simply and build on that. If you have any specific use cases you're curious about, send them my way and I'll do my best to tackle them.

Monday, July 1, 2019

Kscope19 Reflections

I wanted to take a few minutes to reflect back on what was an exciting, invigorating, and exhausting Kscope19. It started for me on Saturday morning with a trip to the coolest museum on the planet where I programmed an Altair and ended with an Applied OLAP team lunch on Thursday afternoon. I barely had a chance to sleep in between. I'm not going to get into the details on any one thing but want to cover a few important topics that make Kscope such a great conference.

People
This is by far the best aspect of the Kscope conferences. It was great to be a part of the Applied OLAP team this year. We had 10 attendees! Putting faces to names, seeing people I hadn't seen in years (in one case 15 years!) was a real joy for me. The nightly activities including Applied OLAP's night at the Seattle Aquarium and the Wednesday night MoPOP event and after party were a total blast.

Technical Content
Nowhere else can you interact with and listen to the brightest minds in our field. I got a chance to speak with Oracle product management about new features in Essbase. I was able to present on BSO Solve Order and the new MDX Insert statement. These are the kinds of things that get me really fired up; I love helping to push things forward. I was also able to present a demonstration of Essbase parallel queries in our Dodeca product at our Sunday Symposium. This could be a game-changing technology, allowing multiple Essbase queries to be run by a single user simultaneously. I'll be posting more about this in the future.

Personal Growth
The fact is that presenting at a conference can be difficult. I tend to over prepare which hopefully benefits the people sitting in my presentations as well as me. Preparing a presentation takes a lot of time and hard work. It forces me to be better. I encourage others to try it if they haven't done it before. It is a very valuable skill to have. I'm already starting the process of thinking about abstract submissions for Kscope20. I will see everyone in Boston next year!



Friday, June 7, 2019

A Funeral for the Two Pass Calc

New Session Added! BSO Solve Order!

I'm happy to announce that I'll be holding a funeral for the Two Pass Calc in room 611 on Monday morning at 11AM. This is one of the new 30 minute sessions. I also plan to show off OAC live so come watch me nervously hope that the conference wifi is working. If you're planning to move from On-Prem Essbase to OAC soon you'll want to know all about BSO Solve Order.



My other session is Wednesday at 9AM

I will be speaking about ASO Tuning and Calculations (but mostly about Calculations). If you've been hesitant to try writing ASO calculations, please come to this session. There is a new way of performing ASO calcs in OAC which I will go over. I'll also be sharing my Top 10 list of things never said by an Essbase developer. Stop by and say hello at 9AM on Wednesday morning in room 609.


Please visit me at the Applied OLAP Booth 

Please stop by and say hello. I'm looking forward to meeting a lot of new people at Kscope this year.

Monday, May 13, 2019

Happy 40th Birthday, VisiCalc!


VisiCalc, the first spreadsheet program for personal computers, made its debut at the West Coast Computer Faire in San Francisco on May 12, 1979. The innovative spreadsheet grid, still in use today and years ahead of its time, was invented by Dan Bricklin. It was described as "a magic sheet of paper that can perform calculations and recalculations". VisiCalc was the original killer app: many people bought $2,000 computers to run the $100 software. Happy Birthday!






Tuesday, May 7, 2019

Getting Ready for Kscope19



Seattle, here I come!

I'm really excited to be headed to Kscope19 in Seattle next month. I haven't been able to attend the last two years and I have really missed seeing everyone. Right now there are many people working feverishly to finish their presentations in order to bring you the best content available in our industry.




Long Live ASO

I will be speaking about ASO Tuning and Calculations. If you've been hesitant to try writing ASO calculations, please come to this session. There is a new way of performing ASO calcs in OAC which I will go over. Oh, and I'll be sharing my Top 10 list of things never said by an Essbase developer. Stop by and say hello at 9AM on Wednesday morning in room 609.


Dodeca Rocks!

If you want to find out how Dodeca can help your company, please attend one of our customer sessions. Come hear how Dodeca is making an impact at LexisNexis, Abbott and Kroger. Also, be sure to stop by our booth in the expo hall.


A Journey Across the Kroger Enterprise Essbase Environment from the Grocery Store to the C-Suite Executive 
Come see how Kroger uses Dodeca to deliver Essbase data to 50,000 users!
Monday June 24 11:45 AM - 12:45 PM

Sleepless in Spreadsheets? Essbase…It’s Nice to Meet You!
How does going from ~140 Excel workbooks and ~400 Excel tabs to only 12 Essbase Dodeca views for the whole year sound?
Monday June 24 2:15 PM - 3:15 PM

Streamlining DRM System Change Request Workflows for Oracle EPM Using Dodeca
Hear how LexisNexis cut the time required to process change in their EPM environment in half, while improving quality and ensuring they pass Sarbanes Oxley (SOX) audits.
Tuesday June 25 3:45 PM - 4:45 PM

Trials and Tribulations
Find out how Abbott used Dodeca, a new ASO Essbase cube, and simple SQL to replace two expensive and un-scalable systems with a powerful platform for modeling clinical studies based on changing assumptions and drivers.
Tuesday June 25 3:45 PM - 4:45 PM

Applied OLAP: Dodeca Customer Panel - Dodeca Does It! (Again…and again…and again…)
Listen to testimonials by some of our customers about how Dodeca is changing their organizations for the better.
Tuesday June 25 10:15 AM - 11:15 AM



Sunday, Sunday, Sunday!

I also want to point out our Sunday Dodeca Symposium. This will be held on Sunday afternoon, starting at 1PM in room 609. You will hear about all of the new and exciting features in Dodeca and how they are increasing productivity for our customers.




Applied OLAP Experts Live Talking about things other than our products

Finally, the following sessions are led by technical experts at Applied OLAP with a focus on EPM related technologies.


30-Minute Session: Drill Through Your Essbase and Planning Data with APEX Interactive Reports
Monday June 24 11:00 AM - 11:30 AM

Introduction to Essbase MDX for Reporting
Tuesday June 25 11:45 AM - 12:45 PM

Up & Running with the OAC Essbase REST API
Tuesday June 25 2:15 PM - 3:15 PM

Docker & Essbase: Bring the Cloud to You
Wednesday June 26 10:15 AM - 11:15 AM

Congratulations on Your Shiny New Oracle Analytics Cloud Essbase Account! Now What Do You Do?
Wednesday June 26 9:00 AM - 10:00 AM



Monday, April 1, 2019

Collaborate 19

I wanted to let everyone know that my colleague Jason Jones will be speaking at Collaborate19 in San Antonio on April 7th and 10th. He is giving two great presentations on a couple of really current topics.

The first is Up & Running with the OAC Essbase REST API. This session will help you get started automating your Essbase cloud environment.


Abstract: Oracle's Essbase Cloud offering introduces a new and modern REST API in addition to its traditional Java API. The new REST API opens up many development, integration, and automation capabilities and differs greatly from the traditional API. This session will introduce the REST API, exploring how it's similar to and different from traditional APIs. We’ll also cover some real-world examples and how to get started with development.
Time: 3:00 PM–4:00 PM Apr 7, 2019
Location: CC 2ND FL 214C


The second is Docker & Essbase - Bring the Cloud to You. This session will teach you about using Docker to create an Essbase server. I love using Essbase in Docker. Gone are the days of lengthy and complicated Essbase installations or running slow virtual machines. I use Essbase in Docker everyday and it's improved my productivity tremendously.


Abstract: Docker is a modern technology for rapidly deploying and containerizing servers both on-premise and on the cloud. This presentation will introduce the Docker technology in a practical way, exploring interesting Essbase/EPM use cases in which Docker can be used to open up and enhance infrastructure, development, and testing in EPM organizations of any size.
Time: 4:30 PM–5:30 PM Apr 10, 2019
Location: CC 2ND FL 217D

Thursday, March 21, 2019

Dodeca Essbase Outline Viewer

Dodeca allows you to easily give users the ability to view Essbase outlines. Watch this video for a quick demo then check out Jason Jones' blog about how to set it up.


Wednesday, March 13, 2019

OAC Next Generation Outline Extractor

The Oracle Analytics Cloud (OAC) Next Generation Outline Extractor is now available for download from the Applied OLAP website. I've also added a few tips for using it.

Background

  • For those unaware, Applied OLAP supports a free utility for extracting Essbase outlines. The utility is simple to install and use, and I've used it many times over the years. One client of mine used it to extract dimensions so they could be compared to the source relational tables during UAT, saving many hours of work.




Version
  • On your OAC interface page, click on your name in the top right corner. Then select About. Download the version of the Outline Extractor that corresponds to your cloud version. In this case the version is 12.2.1.1.114.







Source

  • Using the Essbase API will result in many calls to the server and could be very slow. Instead, on the Input Source choose "Extract and Process MaxL Outline XML".






Connection
  • When making a connection, use the URL that you use in your browser to connect to the OAC interface. After the 'essbase' part of the URL place '/japi' just like you see below. Then enter your Username and Password.
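For example, with a hypothetical hostname (substitute your own OAC URL), the connection string would look like:

```text
https://yourserver.example.com/essbase/japi
```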


As always, let us know if you have any issues at Support@AppliedOlap.com.

Monday, March 11, 2019

Cascade Reporting in Dodeca



Cascade reports are multiple reports that show a standard template from different points of view. This post will show how to create a Dodeca Cascade Report using an MDX script to choose the intersections.









I started with a template that shows scenarios and a single roll-up account: EBITDA. This account will be automatically zoomed-in to create our final report. Using a roll-up will ensure that any account changes are reflected in our report without needing future maintenance.

I also added tokens for Periods, Location, Departments and Years. I added an Excel formula using simple subtraction to find the previous year.
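The previous-year formula is just simple subtraction on the tokenized year; assuming the current year lands in cell B1 (a hypothetical cell reference, not necessarily the one in my template), it would be:

```text
=B1-1
```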



I added two named ranges. One for the Essbase retrieve and another for the cell on which the zoom-in will be performed.



I hid the header rows and saved the template.



Next I created an Essbase MDX Script called PaloMartCascade.

I used the Crossjoin() function to create my column headers of Periods and Scenarios.

I also used the Crossjoin() function to pull all level zero members under the user selected Department and Location.

Note the use of the NON EMPTY keyword here. Using this will ensure that no sheet will be generated for any missing or irrelevant intersections.
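A sketch of the shape of that script (the dimension names and token names here are placeholders, not the actual PaloMart outline):

```mdx
SELECT
  CrossJoin([Periods].Levels(0).Members, [Scenarios].Children) ON COLUMNS,
  NON EMPTY
    CrossJoin(
      Descendants([TKN_Department], [Departments].Levels(0)),
      Descendants([TKN_Location], [Locations].Levels(0))
    ) ON ROWS
FROM App.Db
```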



Next I set my view properties, specifically for CascadeSources.



In the Cascade Sources dialog I changed the Source Type to EssbaseMdxScript. Note that this feature is only available in version 7.6.1 and beyond.

I then selected the Script I created in the previous step.

I also set the tokens that will be used to generate the Cascade Report. In this case, the MDX report will return Departments in the first column and Locations in the second column. The order that the tokens are defined should match the order in which they are retrieved in the MDX report.



Finally, I added a bit of Workbook Scripting.

First I added a property called SheetCount and set the Value to 0. This will track how many sheets get built during the Cascade.



Next I added an EventLink that will be triggered by the AfterCascadeSheetBuild event. It will fire after each individual sheet is built.

The first method in this procedure is AddProperty. This method uses the @PVal() Dodeca function to get the current SheetCount property and increment it.



The next method in the OnAfterCascadeSheetBuild Procedure is EssbaseZoomIn. This is the method that will drill in on the EBITDA account on our template. Note how I reference the zoom.cell in the SelectedRange. If we move the cell at some point we won't have to update this Workbook Script.



The next procedure I added was OnAfterBuild. This will be used to show a message box to the user. I use the @PVAL() Dodeca function to get the final SheetCount property.



Then I used the AddProperty method to set the SheetCount back to zero. This will ensure that the property does not keep adding to previous cascades if the view is built again.



There you have it. Creating Cascade Reports is fast, flexible, and powerful in Dodeca. From here you could add a Workbook Script to save the sheets to individual files or send them in emails.