Moderately sized model painfully slow


Hi Everyone,
I am working on a model that to my understanding is moderately sized:

Still, Quantrix is painfully slow to calculate any changes made. The model is a sales reporting tool that allows a set of sales information (customer, business unit, segment, sales type, location, price and quantity) to be entered for certain financial sets (i.e. August forecast, September forecast, budget, actuals). The information is then aggregated via a select statement into a summary report (comparable to an Excel pivot table), and the model is then supposed to allow comparison of two data sets’ summary tables (to highlight changes of actuals versus budget by customer, segment, location, etc., for example). In support of this core functionality there are some auxiliary and control tables that provide FX information, lists for constrained input, and such things.
The input is already an aggregate by customer, so it’s not as if I am dumping thousands of GL lines into the model. In total I have about 75 entries per financial set and 4 financial sets. That doesn’t strike me as something Quantrix should have trouble calculating. However, it takes minutes rather than seconds to process even tiny changes. When I switch off Auto-Calculate to get something done and then calculate, it sometimes takes 10-15 minutes to process the pending calculations. I have 32 GB of RAM and allow Quantrix 24 GB of it. According to the usage statistics it rarely exceeds 8 GB of memory and about 20-30% of processor capacity.
I have almost the same model in Excel and it calculates practically instantly, with the downside that I have to manually update the pivot tables every time I make changes, and that items are not dynamic across financial sets. So I thought this would be “easy prey” for Quantrix, but I cannot figure out how to make it usable given the horrible calculation slowness. I intended to eventually develop a full budgeting suite in Quantrix, but if I cannot even get the sales part of the budget to work, I don’t see how it could run a bunch of other, more advanced calculations with a reasonable user experience.
I can make an anonymized version of the model and upload it, if that helps. However, I am hoping for a general answer rather than tips like “minimize the summary functions” or “use select instead of SumProduct”. I am not just looking to trim this particular model down to something manageable; I am hoping to find a way to reasonably use Quantrix for much greater things.

Thanks for any support!

RSC Answered question September 19, 2019

Hi Rich,
Thanks for your response. Your YouTube channel has been a great help to me. I sent you an e-mail to arrange a call; I think that would be really helpful.
The issue that blows up the cell count is the pivot-style tables with multiple categories. I have (including summary line items) 81 customers, 4 currencies, 4 financial sets, 3 probability categories, 4 locations, 12 sales segments, 8 sales types and 6 BUs over 34 timeline elements (24 months plus quarterly and annual summaries). Multiply that out and you get roughly 304 million cells in that table right there. Still, it is nothing more than an Excel pivot table of a moderately sized data set. Maybe I am too ambitious in what I am trying to do here, but again: I have this in Excel and computation speed was never a problem. (However, usability is quirky in Excel because you have to ensure data consistency between multiple pivot tables and then “compare” them via SUMIFS functions etc. to get the deviations between two data sets, which is what I was hoping to solve with Quantrix.)
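For anyone checking the multiplication, here is a quick sketch of the cell-count arithmetic. The category sizes are taken straight from the post above; the dictionary key names are my own illustrative labels, not Quantrix identifiers:

```python
import math

# Category sizes for the pivot-style summary table, as listed in the post.
dimensions = {
    "customers": 81,
    "currencies": 4,
    "financial_sets": 4,
    "probability_categories": 3,
    "locations": 4,
    "sales_segments": 12,
    "sales_types": 8,
    "business_units": 6,
    "timeline_elements": 34,  # 24 months + quarterly and annual summaries
}

# The table size is the product of all category sizes.
cells = math.prod(dimensions.values())
print(f"{cells:,}")  # 304,570,368 -- roughly 304 million cells
```

This is why the table dwarfs the raw input: about 300 entries of data fan out into hundreds of millions of cells once every category combination gets its own cell, whether or not it holds any data.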

RSC Answered question September 19, 2019