Generate a PivotTable in Excel from MSSQL with 8 million records

I have a table in MSSQL that currently holds 8 million records.

I use Pentaho to analyze this data, and I'm trying to use Excel as well. If I set it up to pull the data straight into a PivotTable, it is not limited to 1 million rows: it reported having loaded more than 2 million.

However, it had consumed about 1.5 GB of RAM when it stopped with the error "Excel cannot complete this task with available resources. Choose less data or close other applications." The PC has 7.7 GB of RAM, of which 6 GB were in use.

Is there any way to make it, instead of loading all the data into RAM, read only the metadata and run GROUP BY queries on demand, the way Pentaho does?
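
For what it's worth, the aggregation can be pushed to SQL Server so that only the summarized rows ever reach the client. Here is a minimal sketch in Python with pyodbc, assuming a hypothetical fact table dbo.sales with region, product, and amount columns (the connection string is a placeholder too):

```python
import csv
import pyodbc

# Placeholder connection string -- adjust server, database, and auth.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)

# The GROUP BY runs on the server, so only the aggregated rows
# (thousands, not 8 million) are transferred and held in memory.
cursor = conn.cursor().execute("""
    SELECT region, product, SUM(amount) AS total, COUNT(*) AS orders
    FROM dbo.sales
    GROUP BY region, product
""")

# Write the small summary to a CSV that Excel opens comfortably.
with open("summary.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])
    writer.writerows(cursor.fetchall())

conn.close()
```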

2 answers


Apparently Excel stores the loaded data in memory without any compression, and compressing it is exactly what BI tools do to be able to work with that much data. At a few hundred bytes per row, 8 million uncompressed rows quickly run into the gigabytes, which lines up with the memory use you saw.

I don't believe this operation is possible in Excel; if it were, it would be an excellent substitute for BI tools.

I work with QlikView, which compresses the data aggressively and keeps it in memory for speed, and I still get constant headaches from users who insist on using only Excel.
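
As a rough illustration of what that compression buys, here is a small sketch using dictionary encoding in pandas (this is just the general idea behind columnar BI engines, not QlikView's actual mechanism):

```python
import numpy as np
import pandas as pd

# 8 million rows of a low-cardinality text column, as plain strings.
n = 8_000_000
df = pd.DataFrame(
    {"region": np.random.choice(["North", "South", "East", "West"], n)}
)
print(f"plain strings: {df['region'].memory_usage(deep=True) / 1e6:.0f} MB")

# Dictionary-encode the column: each value becomes a small integer code
# plus one shared lookup table -- the memory use drops by orders of magnitude.
df["region"] = df["region"].astype("category")
print(f"dictionary-encoded: {df['region'].memory_usage(deep=True) / 1e6:.0f} MB")
```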

Can Pentaho export tables in Excel format? When I need a specific table in Excel, I generate it in QlikView and export it from there, so the data can still be shown in the beloved Microsoft tool.
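
If scripting is an option, the same export-a-summary workflow can also be done without a BI tool. A sketch with pandas and SQLAlchemy (the table name and connection string are hypothetical, and to_excel requires openpyxl to be installed):

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string -- adjust for your server and driver.
engine = create_engine(
    "mssql+pyodbc://myserver/mydb"
    "?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes"
)

# Pull only an aggregated view of the hypothetical fact table.
summary = pd.read_sql(
    "SELECT region, SUM(amount) AS total FROM dbo.sales GROUP BY region",
    engine,
)

# Write a native .xlsx file that opens directly in Excel.
summary.to_excel("summary.xlsx", index=False)
```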



I used PowerPivot, and with it I was able to load the original star schema instead of the single hyper-denormalized table I had been trying to build.

But in the end I preferred to keep analyzing the data in Pentaho and export to Excel only when necessary.
