Performance with a large number of records

I am trying to load 378,000 records from Excel into a cube stored in the data warehouse. When I try to view the data in the data cube, everything freezes, with CPU usage up to 99%.
Am I doing something wrong? I am also facing performance issues with Dundas Dashboards and Reports: they load very slowly even though my cubes are in the data warehouse. Hoping for a solution to this issue.
Thanks,

To see data at the data cube level, you would have had to check out your data cube. Checked-out data cubes behave the same as cubes with no storage type: every data call goes directly to the source instead of the warehouse. You will have to check your cube back in and rebuild it to use the warehouse.

https://www.dundas.com/Support/learning/documentation/cleanse-consolidate-modify-data/data-cube-storage-types


@Christian, my cubes are checked in, yet I still face this issue with warehouse data cubes quite frequently. Sometimes I hate designing dashboards because of the performance bottlenecks in Dundas BI with large datasets (>100,000 records with 7-8 columns). Has Dundas suggested any fix for this by now?

Well, performance issues can come down to many factors. The system specs (especially processing power and available memory) of your instance server matter, assuming your warehouse database is hosted there. For the warehouse builds themselves, database performance, the type of data connector you are using, the number of transforms in your cube, and so on all carry performance costs.

https://www.dundas.com/support/learning/documentation/system-requirements/dundas-bi-system-requirements
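One way to narrow down which of those factors is the bottleneck is to time the raw query against the warehouse database outside of Dundas BI entirely; if the bare query is already slow, the problem is the database rather than the cube or dashboard layer. Below is a minimal, hypothetical sketch of that measurement using Python's built-in `sqlite3` as a stand-in database (in practice you would connect to your actual warehouse, e.g. via `pyodbc`, and run the query the cube generates):

```python
import sqlite3
import time

def time_query(conn, sql):
    """Run a query, fetch all rows, and return (row_count, elapsed_seconds)."""
    start = time.perf_counter()
    rows = conn.execute(sql).fetchall()
    elapsed = time.perf_counter() - start
    return len(rows), elapsed

# Stand-in warehouse table with 378,000 rows, roughly matching the
# dataset size discussed above (names here are made up for the example).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse_cube (id INTEGER, val REAL)")
conn.executemany(
    "INSERT INTO warehouse_cube VALUES (?, ?)",
    ((i, i * 0.5) for i in range(378_000)),
)
conn.commit()

count, secs = time_query(conn, "SELECT * FROM warehouse_cube")
print(f"Fetched {count} rows in {secs:.3f}s")
```

If this kind of timing is fast but the dashboard is still slow, the cost is more likely in the transforms, the connector, or the rendering side rather than the database itself.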

I should have clarified this earlier: any time you open a data cube, regardless of whether it is checked in, the data shown under the data preview in the data cube designer is loaded directly from the data source, as if no warehouse were in effect. My previous post was referring to opening a metric set/dashboard that relies on the cube.