Avoid duplicate records in a particular MySQL column


-1

I have an auction system where several users can bid, but I have a problem when bids occur at the same time: they are inserted into the database with the same value in the bidding_price column, e.g.:

0.02
0.02
0.02

I set up a SELECT that returns all bids with equal values (duplicates) for a given auction:

SELECT bidding_price, COUNT(*)
    FROM bid_account
    WHERE auction_id = 1335
    GROUP BY bidding_price
    HAVING COUNT(*) > 1

In this case it returns the duplicates for the auction with ID 1335.
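A small runnable sketch of that duplicate-finding query, with SQLite standing in for MySQL and a few hand-made rows for auction 1335 (the table and column names follow the question):

```python
import sqlite3

# Illustration only: SQLite stands in for MySQL here.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bid_account (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    auction_id INTEGER, bidding_price REAL)""")
conn.executemany(
    "INSERT INTO bid_account (auction_id, bidding_price) VALUES (?, ?)",
    [(1335, 0.02), (1335, 0.02), (1335, 0.03), (1336, 0.02)])

# Same query as above: prices that appear more than once in one auction.
dups = conn.execute("""
    SELECT bidding_price, COUNT(*)
    FROM bid_account
    WHERE auction_id = 1335
    GROUP BY bidding_price
    HAVING COUNT(*) > 1""").fetchall()
print(dups)  # [(0.02, 2)] -- 0.02 appears twice in auction 1335
```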

The table bid_account is structured as follows:

id            | int(255)    - index - primary key - AUTO_INCREMENT
auction_id    | int(11)     - auction ID
bidding_price | float(20,2) - bid amount, e.g. 0.02

I went as far as writing a DELETE that removes the duplicates, but it would be unfair for a bid to be deleted because of a system error. Here it is:

DELETE ba
    FROM bid_account ba JOIN
         (SELECT ba2.auction_id, ba2.bidding_price, MAX(ba2.id) as max_id
          FROM bid_account ba2
          WHERE ba2.auction_id = ?
          GROUP BY ba2.auction_id, ba2.bidding_price
         ) ba2
         ON ba2.auction_id = ba.auction_id AND
            ba2.bidding_price = ba.bidding_price AND
            ba2.max_id > ba.id
WHERE ba.auction_id = ?;

The ? placeholders are there because the value is supplied as a parameter.
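The dedup rule of that DELETE (within one auction, keep only the row with the highest id for each duplicated price) can be sketched like this, with SQLite standing in for MySQL and the single-auction case written as a simpler NOT IN:

```python
import sqlite3

# Illustration only: SQLite stands in for MySQL; table/column names
# follow the question (bid_account, auction_id, bidding_price).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bid_account (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    auction_id INTEGER, bidding_price REAL)""")

# Simulate three simultaneous 0.02 bids on auction 1335, plus one 0.03.
conn.executemany(
    "INSERT INTO bid_account (auction_id, bidding_price) VALUES (?, ?)",
    [(1335, 0.02), (1335, 0.02), (1335, 0.02), (1335, 0.03)])

# Keep only the highest id per price within the auction; delete the rest.
conn.execute("""
    DELETE FROM bid_account
    WHERE auction_id = ?
      AND id NOT IN (SELECT MAX(id) FROM bid_account
                     WHERE auction_id = ?
                     GROUP BY bidding_price)""", (1335, 1335))
conn.commit()

rows = conn.execute(
    "SELECT bidding_price FROM bid_account WHERE auction_id = ? "
    "ORDER BY bidding_price", (1335,)).fetchall()
print(rows)  # only one 0.02 row survives, plus the 0.03 bid
```

As the question notes, though, deleting bids after the fact punishes users for a system error; the accepted answer's UNIQUE index prevents the duplicates instead.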

Does anyone have any idea how I can solve this by preventing the same value from being inserted into bidding_price for a given auction, identified by the auction_id column?

The table contains many equal values in bidding_price; what cannot occur is equal values within the same auction, that is, for the same auction_id.

I thought of creating an UPDATE that first checks for duplicates and, if any exist, adds 1 cent to each duplicate. But I suspect that would cause trouble when three bids arrive at the same time.

This SQL command will be executed in Node and run from a cron job every second, so the check runs constantly: if a duplicate bid appears now, the next second it will be identified and corrected.

The bidding system works together with a 15-second countdown timer; whenever someone bids, the timer resets to its initial 15 s.

  • 1

    Just create a UNIQUE index and check whether the INSERT failed. This idea of a cron every second is clumsy; better to work with code that supports parallelism and events/signals.

  • 1

    Composite UNIQUE index plus transactions, handling the duplicate-key error.

  • Unfortunately a problem came up. It would work, except that each bid creates a new row containing the value and the auction ID. If I put UNIQUE on the two columns, how will the next row with the same auction ID be inserted? And there are cases where different auctions have the same value, so UNIQUE wouldn't work on either column alone. Or is there something that can be done to get around this situation?

3 answers

3


All you have to do is use UNIQUE, like this:

ALTER TABLE tabela ADD UNIQUE INDEX(`bidding_price`, `auction_id`);

That way the pair (bidding_price, auction_id) is unique: if {0, 0} exists there can be no other {0, 0}, but {0, 1} and {1, 0} are still allowed. So within each auction_id, each bidding_price can occur only once.

You can test this on Sqlfiddle.
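A minimal sketch of how the composite UNIQUE index behaves, using SQLite in place of MySQL (the constraint semantics are the same for this example; in MySQL the failed insert raises a duplicate-key error instead of sqlite3.IntegrityError):

```python
import sqlite3

# Illustration only: SQLite stands in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bid_account (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    auction_id INTEGER NOT NULL,
    bidding_price REAL NOT NULL,
    UNIQUE (bidding_price, auction_id))""")

ins = "INSERT INTO bid_account (auction_id, bidding_price) VALUES (?, ?)"
conn.execute(ins, (1335, 0.02))  # first bid: fine
conn.execute(ins, (1336, 0.02))  # same price, other auction: fine
conn.execute(ins, (1335, 0.03))  # other price, same auction: fine

try:
    conn.execute(ins, (1335, 0.02))  # duplicate pair: rejected
except sqlite3.IntegrityError:
    print("duplicate bid rejected")

count = conn.execute("SELECT COUNT(*) FROM bid_account").fetchone()[0]
print(count)  # 3 -- the duplicate pair was never stored
```

Catching that error in the application is also how you alert the client that the bid value is already taken.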

  • @Inkeliz... And how do I make it return an alert to the client when a bid with the same value already exists? And if two bids happen at exactly the same instant, what happens — neither gets recorded?

  • Got it working here, thanks! I didn't think something so simple would solve the problem. Congratulations!

  • Unfortunately a problem came up. It would work, except that each bid creates a new row containing the value and the auction ID. If I put UNIQUE on both columns, how will the next row with the same auction ID be inserted? And there are cases where different auctions have the same value, so UNIQUE wouldn't work on either column alone. Or is there something that can be done to get around this situation?

  • The UNIQUE is applied to the pair, not to each column separately, as I explained in the answer. The UNIQUE will prevent an insertion where both bidding_price and auction_id are the same. So I can bid 1 on A, 1 on B, and also 2 on A, but I can't place a new bid of 1 on A, because that duplicates the pair. This is explained in https://dev.mysql.com/doc/refman/5.7/en/create-table.html; you can test it yourself by inserting VALUES (1,1), (1,0), (0,1) — all will work — but a new (1,1) will fail...

  • You can test this at http://www.sqlfiddle.com/#!9/96ea19/1.

  • Wow, nice one, it really worked! One more question: how would I set these tables up this way in the CREATE? Would it be like this? PRIMARY KEY (id), UNIQUE KEY bidding_price (bidding_price, auction_id)


-1

You can check in SQL whether a bid with the same value already exists and insert only when it does not. CASE cannot contain an INSERT statement, but the same check-then-insert idea can be written as a single conditional statement:

INSERT INTO bid_account (auction_id, bidding_price)
SELECT ?, ?
FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM bid_account
                  WHERE auction_id = ? AND bidding_price = ?);
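A runnable sketch of this check-before-insert idea, written as INSERT ... SELECT ... WHERE NOT EXISTS (CASE cannot contain an INSERT); SQLite stands in for MySQL and allows SELECT without FROM. Note that without a UNIQUE index this still races: two concurrent connections can both pass the NOT EXISTS check.

```python
import sqlite3

# Illustration only: SQLite stands in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE bid_account (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    auction_id INTEGER, bidding_price REAL)""")

# Insert the bid only if the (auction_id, bidding_price) pair is new.
sql = """INSERT INTO bid_account (auction_id, bidding_price)
         SELECT ?, ?
         WHERE NOT EXISTS (SELECT 1 FROM bid_account
                           WHERE auction_id = ? AND bidding_price = ?)"""
conn.execute(sql, (1335, 0.02, 1335, 0.02))  # inserts the bid
conn.execute(sql, (1335, 0.02, 1335, 0.02))  # duplicate: inserts nothing

count = conn.execute("SELECT COUNT(*) FROM bid_account").fetchone()[0]
print(count)  # 1
```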
  • The problem is that the bids are inserted at the same time; there is no way to check beforehand whether a record already exists.

  • The problem then is the chosen platform; maybe Node is not ideal for what you want to do. Take a look at this video — it is possible to reduce the thread pool size to 1, but the whole point of Node is to do more than one thing at a time.

-3

What matters is who commits first, and that never happens at exactly the same time — there are always milliseconds between them. What you need is to increase MySQL's insert cache to create a queue. After that, each new request should reload the client page, or at least push the updated data with a timestamp in a JSON file and dynamically update only the price part of the div (or any other tag). You can do this with a REST API or Ajax.
