efficient way to transfer 18MB of data

Discussion in 'MySQL' started by rvijay17, Feb 8, 2006.

  1. rvijay17

    rvijay17 New Member

    Feb 8, 2006

    I have an application where I get information about products every day.
    The file's size is about 18MB. Every day, products may be added, updated
    or deleted.

    Which is the more efficient way to update the database:
    1. delete all rows and then insert the data again, or
    2. programmatically check which products have changed and update only those?

    I have to use PHP. Each line in the file describes one product and the
    fields are tab-separated. Any of the fields may change, so I would have
    to check each and every field, which brings up another problem: the data
    has to be transferred in and out of the database server.

    So I wanted to know which of these methods is more efficient.
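    For reference, reading such a tab-separated feed into typed rows is straightforward. A minimal sketch in Python (the poster would use PHP; the column layout of id, name, price is an assumption for illustration):

```python
import csv
import io

# Tiny stand-in for the daily feed: one product per line, tab-separated.
# The (id, name, price) layout is an assumed example, not the real schema.
feed_text = "1\twidget\t9.99\n2\tgadget\t19.99\n"

rows = []
for fields in csv.reader(io.StringIO(feed_text), delimiter="\t"):
    pid, name, price = fields
    rows.append((int(pid), name, float(price)))  # convert to typed values
```

    With a real file, `io.StringIO(feed_text)` would be replaced by an open file handle.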

  2. coderzone

    coderzone Super Moderator

    Jul 25, 2004
    This debate could go on forever: is it better to delete all the rows and insert the new data, or to find which rows changed and update only those? Here are some pros and cons of each method; after that, it is up to you to decide which trade-offs you can live with.

    1. Deleting all the rows.
    1. No programmatic checking of which data has changed, and no SELECT statements.
    2. All the data is momentarily deleted, so the database is in an inconsistent state for a short time; if something goes wrong between the delete and the re-insert, a lot of data can be lost.
    3. This looks like it can be done much faster, but since we are deleting all the contents of the table and re-inserting them, it can still take a lot of time.
    4. Since we delete all the data, we may also need to drop (or truncate) the table to keep the auto-increment counters intact; otherwise we just keep increasing the index values.
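    The instability risk in point 2 can be reduced by wrapping the delete and re-insert in a single transaction, so readers see either the old data or the new data, never an empty table. A minimal sketch in Python with SQLite standing in for MySQL (the poster would use PHP/MySQL; the `products` table and its columns are assumptions):

```python
import sqlite3

def reload_all(conn, rows):
    """Replace the whole products table with `rows` atomically.

    rows: iterable of (product_id, name, price) tuples parsed from
    the tab-separated feed file.
    """
    with conn:  # one transaction: commits on success, rolls back on error
        conn.execute("DELETE FROM products")
        conn.executemany(
            "INSERT INTO products (product_id, name, price) VALUES (?, ?, ?)",
            rows,
        )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE products (product_id INTEGER PRIMARY KEY, name TEXT, price REAL)"
)
reload_all(conn, [(1, "widget", 9.99), (2, "gadget", 19.99)])
# Next day's feed: product 1 changed price, product 2 removed, product 3 added.
reload_all(conn, [(1, "widget", 8.99), (3, "gizmo", 4.99)])
```

    If any insert fails, the transaction rolls back and the old data survives, which addresses the data-loss concern above.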

    2. Updating selected rows.
    1. Fewer update/delete operations on the database, which helps it stay consistent.
    2. More code to detect which rows have been updated, which means more chances for bugs, although thorough testing can mitigate this.
    3. Each and every row must be retrieved and compared with the file's content, which adds overhead.
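    The compare-and-update logic described above amounts to a diff on the product key: load the feed and the current table into maps keyed by product id, then compute which rows to insert, update, or delete. A hedged sketch in Python (the poster would do this in PHP; the field tuples are assumptions):

```python
def diff_products(feed, db):
    """Compute which rows to insert, update, or delete.

    feed, db: dicts mapping product_id -> tuple of field values
    (e.g. (name, price)); `feed` comes from the tab-separated file,
    `db` from the current table contents.
    """
    inserts = {pid: row for pid, row in feed.items() if pid not in db}
    deletes = set(db) - set(feed)  # products gone from the feed
    updates = {pid: row for pid, row in feed.items()
               if pid in db and db[pid] != row}  # any field changed
    return inserts, updates, deletes

db = {1: ("widget", 9.99), 2: ("gadget", 19.99)}
feed = {1: ("widget", 8.99), 3: ("gizmo", 4.99)}
inserts, updates, deletes = diff_products(feed, db)
```

    Comparing whole tuples with `!=` covers the "any one of the fields may change" requirement without checking each field individually; only the rows in `updates` then need UPDATE statements.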

    As the table grows, it is almost always better to go with option 2, so the conclusion I come to is 2. Updating selected rows. It might still be that in your situation 1. Deleting all the rows is the better one, so you are in the best position to decide.

    I hope this answers your question, or at least points you in the right direction.
    Last edited: Feb 8, 2006
