[Bioperl-l] Remote Blast error - 500 Too many open files
s_waechter at gmx.net
Sun Dec 5 16:39:33 EST 2004
I get the same error that Affan Qureshi reported a couple of weeks ago.
I wrote a script that sends a few hundred sequences, in batches of one
hundred, to the NCBI BLAST server using the RemoteBlast module. After
a while the script breaks with an error like the one below.
It seems to me that the RemoteBlast module stores the HTML output from
NCBI under these cryptic filenames in the tmp directory. First the page with
the RID is stored, followed by the complete result page from NCBI. All
these tmp files are deleted when the script finishes. The problem
seems to be that the files accumulate while the script is
running. I assume there is a limit on how many files can be
stored in a unix/linux directory. Another question: why are these
files still open? At the moment I haven't found a solution for this.
Does anyone know how to handle this problem?
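For what it's worth, "Too many open files" is the text for errno EMFILE, the per-process limit on open file descriptors (see `ulimit -n`), rather than a limit on how many files a directory can hold; temp files whose handles are never closed keep counting against that limit even after they are unlinked. A small generic Python sketch (not BioPerl code) that reproduces the error by lowering the soft limit:

```python
import errno
import resource
import tempfile

# Lower the soft limit on open file descriptors so the error is
# easy to trigger (the hard limit is left unchanged).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

handles = []
caught = None
try:
    # Keep opening temp files without closing them, as a script
    # that leaks one handle per BLAST batch would.
    while True:
        handles.append(tempfile.TemporaryFile())
except OSError as e:
    caught = e.errno
finally:
    for h in handles:
        h.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))

print(errno.errorcode[caught])  # prints "EMFILE"
```

So a long-running script that leaks one handle per batch will eventually hit this limit no matter how large the tmp directory is.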
Is there a way to tell the RemoteBlast module to delete these
tmp files as soon as they are no longer required?
Thanks for your help
Affan Qureshi wrote:
>I tried a remote blastx search today around 1:00pm and got this error
>message after a long wait. Also it seemed that the NCBI web interface was
>taking too long for BLAST searches.
><HEAD><TITLE>An Error Occurred</TITLE></HEAD>
><H1>An Error Occurred</H1>
>500 Cannot write to '/tmp/hLFqVvHO2D': Too many open files
>Is this a remote server error or am I doing something wrong? Anyone else
>got this error?
>Bioperl-l mailing list
>Bioperl-l at portal.open-bio.org