Updating a large amount of data in MySQL

I have a MyISAM table with more than 200,000 records.

From time to time I need to do a general update on its content. I create a new table and import ALL the records into this 'second' table via phpMyAdmin. So far so good. After that I run a script that renames the tables:

$sql = "RENAME TABLE primeira TO outra, segunda TO primeira;";

That's where the problem is: sometimes it locks everything and MySQL has to be restarted.

I thought about using ALTER TABLE instead, but it seems to behave the same way.
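For reference, the rebuild-and-swap cycle described above can be sketched as follows (table names follow the question; this assumes `segunda` is populated with the corrected data before the swap). Note that `RENAME TABLE` itself is atomic, but it must wait for every query that currently holds a lock on `primeira`, so a long-running SELECT can make it appear to hang:

```sql
-- Build the replacement table with the same structure as the live one.
CREATE TABLE segunda LIKE primeira;

-- ... populate segunda with the corrected data here
--     (phpMyAdmin import, INSERT ... SELECT, UPDATEs, etc.) ...

-- Atomic swap: both renames happen as one operation, but the statement
-- waits for any query still reading `primeira` to finish first.
RENAME TABLE primeira TO outra, segunda TO primeira;

-- Once the swap is confirmed, the old data can be dropped.
DROP TABLE outra;
```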

  • What is the general update you need to do? Perhaps the best approach is to study your requirement and look for an alternative. It is not normal to have to update 200,000 records at once.

  • A lot of things, from correcting spelling mistakes to creating new matches; doing this online would take a lot of time. My problem is with the rename.

  • You can correct spelling errors using a trigger with a function, for example. What kind of matches? Can you give an example?

  • At the user level, the data is read-only. I add about 200 entries daily, but every 4 months I need to make a general change: change fields, split a color classification into two or three others, things like that.

  • I still think the best way would be to find an approach that does not require changing so much data at once, but as an alternative you can create a view, copy the data to another table, change what you want (or apply the changes while copying), and then only change the base table the view references.

  • Thank you Sorack, I will look into it. Can someone help me with the rename issue?

  • I edited your question to remove the greetings, as we usually keep questions as clean as possible to focus on the programming problem. If you would like to visit a part of the site that is not aimed at asking questions, check out the [chat]. If you have questions about how the site works, its rules and procedures, visit the [meta] :)

  • Has any answer helped solve the problem and could it help other users with similar questions? If so, be sure to mark it as accepted by clicking the check mark on its left side (below the up/down vote indicator).


2 answers


It seems to me that the best strategy would be to study exactly which changes you perform, so they can be optimized without the need to change 200,000 records at once. But with the information provided, perhaps the best approach is to create a view, copy the data into a second table with the necessary changes applied, and then switch the view's base table. That way you can query the data through the view without interruption:

DROP VIEW IF EXISTS view_tabela;
CREATE VIEW view_tabela AS SELECT * FROM tabela_1;

-- ... insert and update into tabela_2 ...

DROP VIEW IF EXISTS view_tabela;
CREATE VIEW view_tabela AS SELECT * FROM tabela_2;
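A small refinement of the snippet above: MySQL's `CREATE OR REPLACE VIEW` repoints the view in a single statement, so there is no window between the `DROP` and the `CREATE` in which `view_tabela` does not exist:

```sql
-- Repoint the view at the updated table in one statement;
-- readers never see a moment where view_tabela is missing.
CREATE OR REPLACE VIEW view_tabela AS SELECT * FROM tabela_2;
```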
  • I'll study this, thank you.


Hello!

I believe you can solve this problem as follows:

1 - Create the new table
2 - Import the data from TABELA1 (old) into the newly created table:

INSERT INTO TABELA2 (COLUMN1, COLUMN2, ...) SELECT COLUMN1, COLUMN2, ... FROM TABELA1;
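The copy step can also apply the corrections in flight, in the same statement. A sketch (the column names and the `REPLACE()` correction are illustrative, not from the question):

```sql
-- Copy while transforming: fix values on their way into the new table.
-- REPLACE() here is just an example of a correction applied in flight.
INSERT INTO TABELA2 (COLUMN1, COLUMN2)
SELECT REPLACE(COLUMN1, 'teh', 'the'), COLUMN2
FROM TABELA1;
```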

I hope I’ve been of some help.

Regards,

:)

  • It helped, thank you. The point is that there are many changes in the new table. I would have two options: 1 - empty TABELA1 before 'pulling' the data back from TABELA2, but then users would see an empty database for a few minutes, or 2 - rename the tables...

  • In this case, I believe it would be feasible to create the new table (same structure as the old one) and transfer the data to it. Right after, run TRUNCATE on the old table (it removes all records and resets the AUTO_INCREMENT counter to 1) and then make the modifications. But before doing this in production, run a simulation in a test environment or in a separate MySQL test database. I believe that solves it :)
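A minimal sketch of the backup-truncate-reload cycle suggested in the comment above (table names are illustrative, and TRUNCATE cannot be rolled back, so try it on a test database first):

```sql
-- Keep a backup copy before destroying anything.
CREATE TABLE tabela_backup LIKE TABELA1;
INSERT INTO tabela_backup SELECT * FROM TABELA1;

-- TRUNCATE removes every row and resets AUTO_INCREMENT to 1.
TRUNCATE TABLE TABELA1;

-- Reload the modified data from the working copy.
INSERT INTO TABELA1 SELECT * FROM TABELA2;
```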
