Python to emulate remote tail -f?

攒了一身酷 2021-01-04 20:28

We have several application servers, and a central monitoring server.

We are currently running ssh with "tail -f" from the monitoring server to stream several text log files in real time.
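
Roughly what we do now, as a minimal sketch (the host names and log path are made up, and the merging loop is deliberately naive):

    import subprocess

    HOSTS = ["app1.example.com", "app2.example.com"]  # placeholder servers
    LOGFILE = "/var/log/app/current.log"              # placeholder log file

    # one "ssh host tail -f" pipe per application server
    procs = [subprocess.Popen(["ssh", host, "tail", "-f", LOGFILE],
                              stdout=subprocess.PIPE, universal_newlines=True)
             for host in HOSTS]

    # naive merge: poll each pipe in turn (this blocks on a quiet host;
    # the real setup needs select/threads)
    while True:
        for host, proc in zip(HOSTS, procs):
            line = proc.stdout.readline()
            if line:
                print("%s: %s" % (host, line.rstrip()))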

6 Answers
  • 2021-01-04 20:46

    The paramiko module supports connecting via SSH from Python.

    http://www.lag.net/paramiko/

    The pysftp project has some examples of using it, and its execute-command method may be what you're looking for. It will create a file-like object for the command you execute; I can't say whether it gives you live data, though.

    http://code.google.com/p/pysftp/
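
    For example, a minimal sketch using paramiko directly (not pysftp); the host, username, and log path are placeholders, and it assumes key-based authentication is already set up:

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("app1.example.com", username="monitor")

    # request a pty so the remote tail gets hung up when the channel closes
    stdin, stdout, stderr = client.exec_command(
        "tail -f /var/log/app/current.log", get_pty=True)
    for line in stdout:  # exec_command's stdout can be read line by line
        print(line.rstrip())

    client.close()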

  • 2021-01-04 20:46

    I know this doesn't answer your question, but...

    Maybe you could try using screen. If your session drops, you can always reattach and the tail will still be running. It also supports multiple users, so two users can view the same tail command.

    http://en.wikipedia.org/wiki/GNU_Screen

    Create a session with the name "log":

    screen -S log
    

    Detach:

    [CTRL]+A D
    

    Reattach:

    screen -r log
    

    List the sessions when you can't remember the name:

    screen -list
    

    To get rid of the session, just type exit while in it.

  • 2021-01-04 20:46

    I think the screen idea is the best one, but if you don't want to ssh manually and would rather have a Python script do it, here is a simple, pythonic XML-RPC way of getting the info. It will only update when something has been appended to the file in question.

    This is the client file. You tell it which file you want to read from and what computer it's on.

    #!/usr/bin/python
    # This should be run on the computer you want to output the files
    # You must pass a filename and a location
    # filename must be the full path from the root directory, or relative path
    # from the directory the server is running
    # location must be in the form of http://location:port (i.e. http://localhost:8000)
    
    import xmlrpclib, time, sys, os
    
    def tail(filename, location):
       # connect to server
       s = xmlrpclib.ServerProxy(location)
    
       # get starting length of file
       curSeek = s.GetSize(filename)
    
       # constantly check
       while 1:
          time.sleep(1) # make sure to sleep
    
          # get a new length of file and check for changes
          prevSeek = curSeek
    
          # sometimes this fails if the file is being written to,
          # so we'll wait another second for it to finish
          try:
             curSeek = s.GetSize(filename)
          except:
             pass
    
          # if file length has changed print it
          if prevSeek != curSeek:
             print s.tail(filename, prevSeek),
    
    
    def main():
       # check that we got a file passed to us
       if len(sys.argv) != 3 or not os.path.isfile(sys.argv[1]):
          print 'Must give a valid filename and a server location.'
          return
    
       # run tail function
       tail(sys.argv[1], sys.argv[2])
    
    main()
    

    This is the server; you run it on each computer that has a file you want to watch. It's nothing fancy. You can daemonize it if you want. You just run it, and your client should connect to it as long as you tell the client where it is and the right ports are open.

    #!/usr/bin/python
    # This runs on the computer(s) you want to read the file from
    # Make sure to change out the HOST and PORT variables
    HOST = 'localhost'
    PORT = 8000
    
    from SimpleXMLRPCServer import SimpleXMLRPCServer
    from SimpleXMLRPCServer import SimpleXMLRPCRequestHandler
    
    import time, os
    
    def GetSize(filename):
       # get file size
       return os.stat(filename).st_size
    
    def tail(filename, seek):
       #Set the filename and open the file
       f = open(filename,'r')
    
       #Find the size of the file and move to the end
       f.seek(seek)
       return f.read()
    
    def CreateServer():
       # Create server
       server = SimpleXMLRPCServer((HOST, PORT),
                                   requestHandler=SimpleXMLRPCRequestHandler)
    
       # register functions
       server.register_function(tail, 'tail')
       server.register_function(GetSize, 'GetSize')
    
       # Run the server's main loop
       server.serve_forever()
    
    # start server
    CreateServer()
    

    Ideally you run the server once, then from the client run "python client.py sample.log http://somehost:8000" and it should start going. Hope that helps.

  • 2021-01-04 20:47

    I wrote a function that does that. It uses paramiko to repeatedly run "cat -n ... | tail" on the remote host, printing only the new lines each time, and it uses an environment variable in the remote ~/.profile as a flag that stops the loop:

    import paramiko
    import time
    import json
    
    DEFAULT_MACHINE_USERNAME="USERNAME"
    DEFAULT_KEY_PATH="DEFAULT_KEY_PATH"
    
    def ssh_connect(machine, username=DEFAULT_MACHINE_USERNAME,
                    key_filename=DEFAULT_KEY_PATH):
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(hostname=machine, username=username, key_filename=key_filename)
        return ssh
    
    def tail_remote_file(hostname, filepath, key_path=DEFAULT_KEY_PATH,
                         close_env_variable="CLOSE_TAIL_F", env_file='~/.profile'):
        ssh = ssh_connect(hostname, key_filename=key_path)
    
        def set_env_variable(to_value):
            # flip the stop flag by rewriting its "export ..." line in env_file
            to_value_str = "true" if to_value else "false"
            from_value_str = "false" if to_value else "true"
            ssh.exec_command('sed -i \'s/export %s=%s/export %s=%s/g\' %s' %
                             (close_env_variable, from_value_str,
                              close_env_variable, to_value_str, env_file))
            time.sleep(1)
    
        def get_env_variable():
            # read the current value of the stop flag from the remote shell
            command = "source .profile; echo $%s" % close_env_variable
            stdin, stdout_i, stderr = ssh.exec_command(command)
            print(command)
            out = stdout_i.read().replace('\n', '')
            return out
    
        def get_last_line_number(lines_i, line_num):
            return int(lines_i[-1].split('\t')[0]) + 1 if lines_i else line_num
    
        def execute_command(line_num):
            # number the file's lines with cat -n and return those from line_num on
            command = "cat -n %s | tail --lines=+%d" % (filepath, line_num)
            stdin, stdout_i, stderr = ssh.exec_command(command)
            stderr = stderr.read()
            if stderr:
                print(stderr)
            return stdout_i.readlines()
    
        stdout = get_env_variable()
        if not stdout:
            ssh.exec_command("echo 'export %s=false' >> %s" %
                             (close_env_variable, env_file))
        else:
            ssh.exec_command(
                'sed -i \'s/export %s=true/export %s=false/g\' %s' %
                (close_env_variable, close_env_variable, env_file))
        set_env_variable(False)
    
        lines = execute_command(0)
        last_line_num = get_last_line_number(lines, 0)
    
        while not json.loads(get_env_variable()):
            for l in lines:
                print('\t'.join(t.replace('\n', '') for t in l.split('\t')[1:]))
            last_line_num = get_last_line_number(lines, last_line_num)
            lines = execute_command(last_line_num)
            time.sleep(1)
    
        ssh.close()
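
    A hypothetical call, assuming the host and log file exist and key-based auth is configured:

    tail_remote_file('app-server-1', '/var/log/app/current.log')

    # To stop it from another session, flip the flag it keeps on the remote
    # host, e.g.:
    #   sed -i 's/export CLOSE_TAIL_F=false/export CLOSE_TAIL_F=true/g' ~/.profile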
    
  • 2021-01-04 20:55

    I've written a library that allows you to do just this: check out the "remote" feature of PimpedSubprocess (on GitHub) or PimpedSubprocess (on PyPI).

  • 2021-01-04 21:07

    I've posted a question about something like this, with code (paramiko):

    tail -f over ssh with Paramiko has an increasing delay
