Size of a MySQL table with 200 million records

I am developing a system for a client and ran into the following situation: he gave me a database of phone numbers, and I have to create a routine to generate new numbers for future queries.

The table will end up with about 200 million records and 3 fields (ID, NÚMERO and CÓDIGO). My fear is whether a table that size will hold up smoothly in MySQL.

The system will run for about 4 days to generate all the data. I started this afternoon and already have 15 million records, taking up almost 600 MB.

The table will not receive multiple simultaneous accesses; it will only be used to generate a report of the records that share the same code.
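
A minimal sketch of what such a table could look like, assuming only the three fields mentioned; the column types and names beyond those are my assumptions, with BIGINT for the number because values such as 11999999999 do not fit in a signed INT (maximum 2,147,483,647):

    -- Hypothetical layout for the 200-million-row table (types/names are assumptions)
    CREATE TABLE telefones (
        id     BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,  -- generous key size, see the int(10) story below
        numero BIGINT UNSIGNED NOT NULL,                  -- e.g. 11999999999; too large for a signed INT
        codigo SMALLINT UNSIGNED NOT NULL,                -- operator/carrier code
        PRIMARY KEY (id)
    ) ENGINE=InnoDB;

The ~600 MB observed for the first 15 million rows extrapolates to roughly 8 GB for the full 200 million, plus indexes, which is well within what MySQL handles.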

  • I thought about it; the problem is the next step: I need to know which operator a given phone number belongs to, so I can tell the customer how many numbers he has from a given operator and a given DDD (area code). I thought that would only matter at query time, and so far so good, there would be no need to build a separate table for it, but the customer wants those counts displayed as described above (see the query sketched after these comments).

  • Yeah, I can drop the ID, but I just remembered I am going to need the DDD.

  • I have worked on a system with a MySQL table that one day needed an emergency change because of the size of the key, which was int(10).

  • That was because its format had been copied from another system. But after that I learned that it is better to err on the side of excess than of shortage. We made it int(18) and the system remains happy to this day.

  • It takes about 5 seconds to fetch a prefix with its DDD, generate 10 thousand numbers for that prefix and write them to the DB; once everything is done I will fill in the carrier code according to the generated number. 5 seconds × 52 thousand prefixes ≈ 260,000 seconds, so by my rough math a little over 3 days (a sketch of this generation step follows these comments).

  • Mark, are your columns indexed? By DDD (area code) and number, in that order? Because if they are not, you are screwed (there is an index sketch after these comments).

  • In fact, on second thought, I will store the number as a plain integer, e.g. 11999999999. These numbers will be updated every 5 months, but I think I will look into a way of changing only those that appear in the portability list, so that everything does not have to be processed again.

  • Nice, @Bacco, that will already be the solution for the next step!!

  • Dude, is there any possibility of using SQL Server Express 2016? It is free too; the only limit is that your database cannot exceed 10 GB.
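
Picking up the comments above about counting numbers per operator and DDD and about indexing, a sketch of what that could look like. It assumes the hypothetical telefones table sketched in the question, with a separate ddd column added (the DDD could equally be derived from the full number); the names are assumptions:

    -- Composite index in the order suggested above: DDD first, then number
    ALTER TABLE telefones
        ADD COLUMN ddd SMALLINT UNSIGNED NOT NULL,        -- only if the DDD is stored separately
        ADD INDEX idx_ddd_numero (ddd, numero);

    -- The report the customer asked for: how many numbers per DDD and operator code
    SELECT ddd, codigo, COUNT(*) AS quantidade
    FROM telefones
    GROUP BY ddd, codigo
    ORDER BY ddd, codigo;

For that counting report specifically, an index on (ddd, codigo) would let MySQL answer it from the index alone, without touching the full rows.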
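
And for the generation step described above (10 thousand numbers per prefix), a sketch of how one prefix could be inserted in a single statement on MySQL 8.0; the table and column names and the sample prefix are assumptions, and on older versions an auxiliary numbers table or an application-side loop does the same job:

    -- Raise the recursion limit (default 1000) so the CTE can emit 10,000 rows
    SET SESSION cte_max_recursion_depth = 20000;

    -- Generate the 10,000 numbers of one prefix (e.g. DDD 11, prefix 99999) in one shot
    INSERT INTO telefones (numero, codigo)
    WITH RECURSIVE seq (n) AS (
        SELECT 0
        UNION ALL
        SELECT n + 1 FROM seq WHERE n < 9999
    )
    SELECT 11999990000 + n, 0 FROM seq;                   -- codigo filled in afterwards, as described above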


1 answer

According to this answer on the English-language SO:

The physical size of the DB does not matter. The number of rows does not matter.

In my experience, the biggest problem you will have is not the size but the number of queries the database can handle at a given time.

My DB has more than 10 GB and, with a moderate number of connections, it has no problem keeping up with the requests.

In this other answer on the English-language SO, the user says he has had tables with 200-300 million records and sizes above 1 TB without problems. It is a matter of knowing how to do it.

Long story short: no, you will have no problem. But I strongly recommend that you read the documentation and those answers carefully, work with indexes and, if possible, talk to an experienced DBA.
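
If it helps, two stock queries for keeping an eye on this, assuming the telefones table from the question: information_schema reports approximate data and index sizes, and EXPLAIN shows whether a query is actually using an index:

    -- Approximate data and index size of the table, in MB
    SELECT table_name,
           ROUND(data_length  / 1024 / 1024) AS data_mb,
           ROUND(index_length / 1024 / 1024) AS index_mb,
           table_rows
    FROM information_schema.tables
    WHERE table_schema = DATABASE() AND table_name = 'telefones';

    -- Check that the report query uses an index rather than a full table scan
    EXPLAIN
    SELECT ddd, codigo, COUNT(*) FROM telefones GROUP BY ddd, codigo;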

  • Thanks Leofontes, now I am calmer. As for the requests, I will not have many problems, because it is a closed system and few people will run the query, but I certainly have to think about a future migration of the system, and therefore build a lean structure now so as not to have problems later!!
