1 vote

My test script makes a Perl DBI connection to a MySQL database and, given a list of tables, extracts one (1) record per table.

For every table I list, I also want to print that one record out to its own file. For example, if I have a list of 100 tables, I should expect 100 unique files with one record each.

So far the code works, but I am interested in creating a subroutine, call it create_file, for the pieces of the code marked with the #Create file comments.

I am not familiar with writing subroutines and need help implementing that, if possible. I am also not sure how I would handle the part where the data is built up ($data = '').

Can someone show me a good way to do this? Thanks for your help.

code:

# Get list of tables
my @tblist = qx(mysql -u foo-bar -ppassw0rd --database $dbsrc -h $node --port 3306 -ss -e "show tables");

# Data output
foreach my $tblist (@tblist)
{
    my $data = '';
    chomp $tblist;

    #Create file
    my $out_file = "/home/$node-$tblist.$dt.dat";
    open (my $out_fh, '>', $out_file) or die "cannot create $out_file: $!";

    my $dbh = DBI->connect("DBI:mysql:database=$dbsrc;host=$node;port=3306", 'foo-bar', 'passw0rd');
    my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
    $sth->execute();
    while (my ($id, $nd, $ab, $cd, $ef, $gh, $hi) = $sth->fetchrow_array()) {
        $data = $data . "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
    }
    $sth->finish;
    $dbh->disconnect;

    #Create file
    print $out_fh $data;
    close $out_fh or die "Failed to close file: $!";
}

2 Answers

2 votes
use strict;
use warnings;
use DBI;

my $dt = "2011-02-25";
my $dbsrc = "...";
my $node = "...";

# Get list of tables
my @tblist = qx(mysql -u foo-bar -ppassw0rd --database $dbsrc -h $node --port 3306 -ss -e "show tables");

# Connect once and reuse the handle for every table
my $dbh = DBI->connect("DBI:mysql:database=$dbsrc;host=$node;port=3306", 'foo-bar', 'passw0rd');
foreach my $tblist (@tblist)
{
    # This breaks - chomp returns the number of characters removed, not the chomped string
    #extract_table($dbh, chomp($tblist));
    chomp $tblist;
    extract_table($dbh, $tblist);
}
$dbh->disconnect;

sub extract_table
{
    my ($dbh, $tblist) = @_;

    my $out_file = "/home/$node-$tblist.$dt.dat";
    open (my $out_fh, '>', $out_file) or die "cannot create $out_file: $!";

    my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
    $sth->execute();
    while (my ($id, $nd, $ab, $cd, $ef, $gh, $hi) = $sth->fetchrow_array()) {
        print $out_fh "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
    }
    $sth->finish;

    close $out_fh or die "Failed to close file: $!";
}

Unless you really need a separate connection for each statement you execute, keep the database connection open and reuse the handle between operations.

2 votes

Instead of creating your own create_file, you could use write_file from File::Slurp. It abstracts away the open/close/die/print boilerplate.
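
For example, the extract_table subroutine from the first answer might look roughly like this with write_file. This is a sketch, assuming $node and $dt are in scope as in that answer, and that write_file's default error mode (which croaks on failure) is acceptable:

use File::Slurp qw(write_file);

sub extract_table
{
    my ($dbh, $tblist) = @_;

    my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
    $sth->execute();

    # Build the record string in memory, then write it out in one call
    my $data = '';
    while (my ($id, $nd, $ab, $cd, $ef, $gh, $hi) = $sth->fetchrow_array()) {
        $data .= "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
    }
    $sth->finish;

    # write_file replaces the explicit open/print/close/die sequence
    write_file("/home/$node-$tblist.$dt.dat", $data);
}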