So, I rarely ask questions; I usually find the answer myself. But I'm just tired today and maybe my fellow dev ranters can help here.

Is there a way to write a bash script that automatically runs a pre-defined mysql query on local DB and throws that up into production DB?

it's just a simple select and then insert.

Been looking at SELECT ... INTO OUTFILE, LOAD DATA INFILE, etc.

I could do it manually, but I'm going to have to do this more than a few times over the next few weeks and would love to just alias it and be done with it

doesn't seem possible without actually logging into mysql cli each time...on both sides, which sucks

this isn't export/import a table, it's export/import the result of a query
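For reference, the whole round trip can be sketched in bash: run the SELECT on the local DB in batch mode, rewrite each tab-separated row as an INSERT, and pipe that into the production server. Everything below (hosts, credentials, the `widgets` table and its columns) is a placeholder, not a drop-in script:

```shell
#!/usr/bin/env bash
# Sketch: export a query's rows from a local DB and replay them as INSERTs
# on production. Hosts, credentials, table and column names are placeholders.
set -euo pipefail

# Turn tab-separated rows (id<TAB>name) into INSERT statements.
tsv_to_inserts() {
  awk -F'\t' -v q="'" '{
    gsub(q, q q, $2)   # double up embedded quote chars (SQL-style escaping)
    printf "INSERT INTO widgets (id, name) VALUES (%s, %s%s%s);\n", $1, q, $2, q
  }'
}

sync_to_prod() {
  # --batch --skip-column-names = plain tab-separated rows, no header line
  mysql --batch --skip-column-names -u localuser -plocalpass mydb \
    -e "SELECT id, name FROM widgets WHERE created_at >= CURDATE()" |
  tsv_to_inserts |
  mysql -h prod.example.com -u produser -pprodpass mydb
}

# Uncomment to actually run it:
# sync_to_prod
```

Generating explicit INSERT statements (rather than moving a file between hosts) sidesteps the OUTFILE/INFILE permission headaches, at the cost of one statement per row.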


  • 3
    I don’t know if it’s possible in bash. I’d probably do it in python
  • 1
    @not-sure yeah.... I guess, lol

  • 3
    @rant1ng assuming you’re using MySQL, you can call it as below

    mysql -u <user> -p -e "select * from schema.table"

    Run whatever you like so long as the user account you use has the appropriate perms.

You can capture the result in a variable in a bash script, or, if you have too many values for a single variable, just dump the result into a text file. But I suspect you'll be fine with a standard var.

    This is all entirely possible with bash. It’s been a while since I’ve done this but I have written a great many scripts for db maintenance tasks using this approach.
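A minimal sketch of that approach (credentials and the query are placeholders; --batch --skip-column-names keeps the output plain so it's easy to parse):

```shell
#!/usr/bin/env bash
# Sketch: capture a query result in a bash variable and loop over the rows.
# User, password, and the query itself are placeholders.
run_query() {
  local user=$1 pass=$2 sql=$3
  # --batch --skip-column-names = tab-separated rows with no header line
  mysql --batch --skip-column-names -u "$user" -p"$pass" -e "$sql"
}

# Usage (placeholder credentials/query):
# rows=$(run_query myuser mypass "SELECT id, name FROM mydb.widgets")
# while IFS=$'\t' read -r id name; do
#   echo "row: $id / $name"
# done <<< "$rows"
```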
  • 1
    @oudalally man I hope so.. I'll give this a try later and let you know
  • 4
    Yes all that is definitely possible in Bash. I've written update queries in bash and replace queries etc.

    You can run queries from command line or script no problem as stated above.

One caveat: you'll either need a .my.cnf with the password in it, or you'll have to put the password in the script. If you use just -p, mysql will prompt you for the password each time, so use -p<password> (no space) to pass it non-interactively.
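The ~/.my.cnf route keeps the password off the command line entirely (where -p<password> would show up in ps output and shell history). A sketch with placeholder values; chmod the file to 600:

```ini
[client]
user=myuser
password=mypassword
host=localhost
```

With that in place, plain `mysql mydb -e "..."` connects without prompting.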
  • 0
    @oudalally @bitwise

    Just like I suspected, this task took like a whole day.

    I was able to export the query into a tab delimited file, but I couldn't find a way to import that same data as inserted new rows, so I gave up

tried to create a console command that would do it, got some weird error where, like, half of my data disappears or doesn't get imported

so I'm giving up on that method and going back to importing from a tab-delimited file, with the first row being the column labels

    what's the command for that?

    I tried for literally almost 2 days now to do this myself, lol time to throw in the towel
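For reference, the command being asked about here is probably LOAD DATA LOCAL INFILE with IGNORE 1 LINES to skip the header row. A sketch, with placeholder path, table, and column list (local_infile has to be enabled on both client and server):

```shell
#!/usr/bin/env bash
# Sketch: import a tab-delimited file whose first row is column labels.
# Path, table, and column names are placeholders.
TSV=/tmp/export.tsv

SQL="LOAD DATA LOCAL INFILE '$TSV'
INTO TABLE widgets
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name)"

# Uncomment to actually run it (needs local_infile=1 on client and server):
# mysql --local-infile=1 -u produser -p mydb -e "$SQL"
```

The column list at the end maps the file's fields onto the table's columns, which matters if the header order doesn't match the table definition.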
  • 1
I thought I added the comment to the wrong thread. Swear I read something asking about a delimited file.

In PHP all you have to do is array_combine the first row of the csv file with each data row, like so

    $values = array();
    $row = 0;
    if (($fp = fopen($abspath, 'r')) !== FALSE) {
        while (($data = fgetcsv($fp, 4096, ",")) !== FALSE) {
            $row++;
            if ($row == 1) { $keys = $data; } else { $values[] = $data; }
        }
        fclose($fp);
    }

    $filtered = array();
    for ($i = 0; $i < count($values); $i++) {
        $filtered[$i] = array_combine($keys, $values[$i]);
    }

    Then $filtered will be a multi-dim array with the first row as keys, so to get at the data all you need to do is

    foreach ($filtered as $data) {
        echo $data['id']; // assuming id was a column in the csv
    }
  • 1

Importing csv directly into mysql is a pain in the ass, but it can be done.

    Just have to be exacting.

    There are plenty of examples on google.

    Since I use PHP most, I gave the PHP way of reading the csv; then you can use PDO to insert into the db.
  • 1
I think I've only imported from csv directly into mysql once and it was a true pain in the ass. Let me try to find what I used as a guide.
  • 1
@Bitwise I just created a php script that converted it to SQL, then imported that on the production side

So much time wasted, lol. I can't believe there's not an easier way; either you have to dive into the MySQL conf, or you hit lots of dead ends... Shoulda been a lot easier. And the PHP or Python way I tried in the beginning should have worked, just one of those things I guess