[Bioperl-l] parallel processing with perl

Peter Wilkinson pwilkinson at videotron.ca
Fri Jul 16 09:44:34 EDT 2004


This is something you can do by creating a process for each query and 
capturing the output from each child process. Start by having a look at the 
fork() function. Be warned that process management is an art, and you may 
end up spending some time learning how to manage multiple processes.
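
For example, something roughly like the sketch below (untested; the mysql 
command line, batch file names, and temp-file paths are just placeholders 
copied from the script quoted further down) forks one child per batch file, 
has each child write its one-line result to a temporary file, and lets the 
parent wait for all of them before summing the results:

#!/usr/bin/perl
use strict;
use warnings;

# Untested sketch: the batch file names and mysql options are
# placeholders taken from the quoted script.
my @batchfiles = qw(batchfile1.sql batchfile2.sql batchfile3.sql batchfile4.sql);
my %pid_to_file;                      # map each child pid to its output file

for my $i (0 .. $#batchfiles) {
    my $outfile = "/tmp/query_result.$i";   # hypothetical temp-file path
    my $pid     = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {                  # child: run one query, save the result, exit
        my $result = `mysql -h localhost -u xx -pyy filter < $batchfiles[$i] | tail -1`;
        open my $out, '>', $outfile or die "cannot write $outfile: $!";
        print $out $result;
        close $out;
        exit 0;
    }
    $pid_to_file{$pid} = $outfile;    # parent: remember which file belongs to this child
}

# Parent waits for every child, then collects and sums the results.
my $total = 0;
while ((my $pid = wait()) != -1) {
    open my $in, '<', $pid_to_file{$pid} or die "cannot read $pid_to_file{$pid}: $!";
    chomp(my $value = <$in>);
    close $in;
    $total += $value;
}
print "total = $total\n";

If you would rather not handle the fork()/wait() bookkeeping yourself, the 
Parallel::ForkManager module on CPAN wraps this pattern.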

Peter


At 04:27 AM 7/16/2004, gowthaman ramasamy wrote:
>Hello list,
>Please ignore this question if it sounds unrelated to BIOPERL.
>
>I have a lengthy Perl script running on a 4-processor machine. At one
>point I have to execute four mysql queries from four different batch
>files (via shell). Currently I run them one after the other. Can I
>somehow fire them off simultaneously so that they occupy all 4
>processors and get the job done more quickly?
>
>portion of script follows ...
>#!/usr/bin/perl
>.............
>...........
>$var1=`mysql -h localhost -u xx -pyy filter < batchfile1.sql |tail -1`;
>$var2=`mysql -h localhost -u xx -pyy filter < batchfile2.sql |tail -1`;
>$var3=`mysql -h localhost -u xx -pyy filter < batchfile3.sql |tail -1`;
>$var4=`mysql -h localhost -u xx -pyy filter < batchfile4.sql |tail -1`;
>
>$total=$var1+$var2+$var3+$var4;
>
>NOTE: I don't want to use Perl-DBI.
>Many thanks in advance
>
>
>
>--
>Ra. Gowthaman,
>Graduate Student,
>Bioinformatics Lab,
>Malaria Research Group,
>ICGEB , New Delhi.
>INDIA
>
>Phone: 91-9811261804
>        91-11-26173184; 91-11-26189360 #extn 314
>
>_______________________________________________
>Bioperl-l mailing list
>Bioperl-l at portal.open-bio.org
>http://portal.open-bio.org/mailman/listinfo/bioperl-l


