I have a table in MSSQL that currently has 8 million records.
I use Pentaho to analyze this data, and I'm trying to use Excel as well. If I set Excel up to load the data straight into a PivotTable, it is not restricted to the 1-million-row worksheet limit: it reported going past 2 million records.
However, it kept consuming RAM until it reached about 1.5 GB, then stopped with the error "Excel cannot complete this task with the available resources. Select less data or close other applications." The PC has 7.7 GB of RAM, of which 6 GB is in use.
Is there any way to make Excel, instead of trying to load all the data into RAM, fetch only the metadata and issue GROUP BY queries against the database as needed, the way Pentaho does?
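To illustrate what I mean, this is the kind of server-side aggregation I'd like Excel to push down to the database. It's only a sketch in Python, with an in-memory SQLite database standing in for my MSSQL table; the table and column names here are made up:

```python
import sqlite3

# Hypothetical stand-in for the 8-million-row MSSQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 10.0), ("North", 5.0), ("South", 7.5)],
)

# Instead of pulling every raw row into client RAM, the client
# sends a GROUP BY and fetches only the small aggregated result.
rows = conn.execute(
    "SELECT region, COUNT(*), SUM(amount) "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```

The point is that the result set is a handful of summary rows, so memory use on the client stays small no matter how large the source table is.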