mortoray/shelljob

Running multiple commands with one-line input


Hello

I want to run commands sequentially.

Example 1: p1 and p2 run in parallel, so this does not help me.
```python
from shelljob import proc
import sys

g = proc.Group()
p1 = g.run( [ 'ls', '-al', '/usr/local' ] )
p2 = g.run( [ 'echo', '123' ] )

while g.is_pending():
    lines = g.readlines()
    for proc, line in lines:
        sys.stdout.write( "{}:{}".format( proc.pid, line ) )
```
Example 2: I tried to chain commands in p1 by injecting a ';' character, but it didn't work.

```python
p1 = [ 'ls', '-al', '/usr/local;' 'echo 123' ]
```

How can I solve it?

You can use the Monitor class instead, which is a higher-level wrapper around Group. It creates a mini job layer that runs a certain number of commands in parallel -- set max_simul = 1 on construction to run only one at a time.

You derive from Monitor to use it, then implement the job_output function to print to the screen, if you want.

The run function takes a list of commands and runs them all, waiting for them to complete.

If you could share an example code block of Monitor class usage when you are available, I would be glad, @mortoray.

Look at how FileMonitor is implemented.

https://github.com/mortoray/shelljob/blob/master/src/shelljob/job.py#L181

It creates a monitor which outputs the results of several commands to log files. The test_job.py test uses that class.
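For illustration, here is a minimal sketch of a derived Monitor that prints to the screen instead of writing log files. The PrintMonitor name is hypothetical, and the exact job_output signature and the max_simul constructor keyword are assumptions based on this thread and on FileMonitor, not verified against the library:

```python
from shelljob import job

# Hypothetical subclass for illustration: prints each output line
# instead of writing it to a log file as FileMonitor does.
class PrintMonitor(job.Monitor):
    def job_output(self, job, line):
        # assumed signature; `line` may be bytes depending on the version
        print(line)

# max_simul = 1 runs the commands one at a time, in order
m = PrintMonitor(max_simul=1)
m.run([
    ['ls', '-al', '/usr/local'],
    ['echo', '123'],
])
```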

Unfortunately I couldn't implement a similar approach in my code. I am creating a stream with this code:

```python
from flask import Flask, Response, request
from shelljob import proc
import os

app = Flask(__name__)

@app.route('/stream')
def stream():
    cwd = os.getcwd()
    repourl = request.args['repourl']
    # run the command inside the repo directory, then restore the cwd
    os.chdir(repourl)
    mycmd = request.args['commands'].split(' ')

    g = proc.Group()
    p = g.run(mycmd)

    def read_process():
        while g.is_pending():
            lines = g.readlines()
            for proc, line in lines:
                yield "data: " + str(line) + "\n\n"

    os.chdir(cwd)
    return Response(read_process(), mimetype='text/event-stream')
```

You're better off here creating a new Group for each command and capturing its output in sequence. There's no benefit to having them all in the same Group; that's meant only for parallel execution, which you don't want. That is, inside read_process, iterate through the commands: create a group, read from it, and repeat for each command.
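As a minimal sketch -- assuming the commands arrive as a list of argv lists, with the request parsing left out:

```python
def read_process(commands):
    # e.g. commands = [['ls', '-al', '/usr/local'], ['echo', '123']]
    for cmd in commands:
        # a fresh Group per command gives sequential execution;
        # readlines() drains the output of the currently running process
        g = proc.Group()
        g.run(cmd)
        while g.is_pending():
            for p, line in g.readlines():
                yield "data: " + str(line) + "\n\n"
```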

I've done this on a previous project, and I believe it's where Group evolved from -- the capturing was the important part, not the parallel execution.