I have some concerns about a database being used in a project. The project consists of developing software that combines the readings from two solar spectrum sensors and stores them in a database, together with other information from a "master" table. The idea is that for each row in the master table there will be thousands of rows in the spectrum table.
SUMMARY
We have the following diagram: [image: master table in a one-to-many relationship with the esp spectrum table]
The problem I’m facing is that rows will be added to the master table once a minute, 24/7, while rows will be inserted into the esp table more than a thousand at a time, but only 5 times a day, at one-hour intervals.
PROBLEM
This may have been a bit confusing, but the main concern is not about understanding the diagram itself.
What I want to do is calculate the average volume of data that will enter the database per year/month/week. Is there a formula, tool, or other way to calculate how much space I will need to store this data?
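For reference, here is a minimal back-of-the-envelope sketch, in Python, of the kind of calculation I have in mind. The column counts, the 24-byte per-row overhead, and the round batch size of 1000 rows are all placeholder assumptions, not real figures from the project; only the insertion rates come from the description above.

```python
# Back-of-the-envelope volume estimate. All sizes are assumptions:
# every field is an 8-byte double and each row carries a fixed,
# DBMS-dependent overhead (tuple header, alignment, etc.).

MASTER_COLS = 10    # hypothetical number of columns in the master table
ESP_COLS = 5        # hypothetical number of columns in the esp table
ROW_OVERHEAD = 24   # rough per-row overhead in bytes (DBMS-dependent)

def row_bytes(n_cols: int) -> int:
    """Estimated size of one row: n_cols doubles plus fixed overhead."""
    return n_cols * 8 + ROW_OVERHEAD

# Insertion rates from the description above:
MASTER_ROWS_PER_DAY = 24 * 60   # one row per minute, 24/7
ESP_ROWS_PER_DAY = 5 * 1000     # 5 batches/day of ~1000 rows each

def estimate_bytes(days: int) -> int:
    """Total bytes written to both tables over the given number of days."""
    master = MASTER_ROWS_PER_DAY * row_bytes(MASTER_COLS)
    esp = ESP_ROWS_PER_DAY * row_bytes(ESP_COLS)
    return (master + esp) * days

for label, days in (("week", 7), ("month", 30), ("year", 365)):
    print(f"per {label}: ~{estimate_bytes(days) / 1024**2:.1f} MiB")
```

Note that indexes are not included in this estimate.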
Thank you in advance!
This varies from manufacturer to manufacturer. Specify, in a tag or in the text, which DBMS you are using. Also specify the indexes, because they take up space as well.
– Pagotti
But the manufacturer factor shouldn’t matter much, because after reading the sensor I am formatting the data in Python. In the end I want to estimate the storage assuming all fields are doubles, so I would have some slack and could then work out the actual difference. I don’t understand what you mean by index; sorry for my lack of experience with databases.
– Breno Baiardi
The index part is because each index occupies space in the database, and that space grows as the number of rows in the table grows. Knowing the manufacturer is also important; for example, the query in the answer that was posted works on Oracle but not on MS SQL Server.
– Pagotti
I thought the manufacturer you mentioned was the sensor’s, which is why I said it would make no difference. In my case, I’m using PostgreSQL.
– Breno Baiardi
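Since the DBMS turned out to be PostgreSQL, one practical way to refine the estimate is to load a representative batch of test data and ask PostgreSQL itself how much space the tables and their indexes occupy. Below is a minimal sketch using the psycopg2 driver; the connection parameters are placeholders, while `pg_total_relation_size` and `pg_relation_size` are built-in PostgreSQL functions (the former includes indexes and TOAST storage).

```python
# Query PostgreSQL for the real on-disk size of each table after loading
# sample data; total minus heap is mostly index (and TOAST) space.
import psycopg2

# Placeholder connection settings; adjust to the real environment.
conn = psycopg2.connect(host="localhost", dbname="solar",
                        user="postgres", password="secret")

SIZE_QUERY = """
SELECT pg_total_relation_size(%s),  -- table + indexes + TOAST
       pg_relation_size(%s)         -- table heap only
"""

with conn, conn.cursor() as cur:
    for table in ("master", "esp"):
        cur.execute(SIZE_QUERY, (table, table))
        total, heap = cur.fetchone()
        print(f"{table}: {total} bytes total, "
              f"of which {total - heap} are indexes/TOAST")

conn.close()
```

Dividing these numbers by the number of rows loaded gives a per-row cost that already accounts for PostgreSQL’s tuple header, alignment padding, and index overhead, which can then be multiplied by the expected yearly row counts.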