2023-10-02

Handling of long-living Python objects in a Docker container, remote control of objects

I am developing a design which works on top of a Docker network topology. It emulates network elements talking to each other using UDP packets. The big controller script is executed on the host. It brings up the network of containers from the plain Python base image; no other tools are added, so pip is out of the game. The main script on the host brings up a topology using docker network create and docker run. It comes up and ping is working.
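For reference, the bring-up described above might look roughly like this (network name, image tag, container names, and the shared folder path are placeholders, not the actual controller script):

```shell
# create an isolated bridge network for the emulated elements
docker network create emu-net

# start two nodes from the plain python base image, sharing the
# host's config folder into each container; `sleep infinity`
# keeps the containers alive for later `docker exec` calls
docker run -d --name node1 --network emu-net \
    -v "$(pwd)/shared:/shared" python:3.11 sleep infinity
docker run -d --name node2 --network emu-net \
    -v "$(pwd)/shared:/shared" python:3.11 sleep infinity
```

On the shared bridge network the containers can resolve each other by name (node1, node2), which is what makes the UDP emulation between them possible.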

Thanks to folder sharing, configuration data is already copied to the nodes and they can access it, so it is time to make something work in the containers.

The hello.py already works fine, so here comes our component_1, which in this flow is the command line interpreter.
I will call it execute_me_remotely.py, or execute.py for short.

I can run this file in the container with:

docker exec mycontainer python execute.py

execute.py contains a code like this:

import sys


def parse_and_route_command(passed_params):
    print("Node Control Call:\n")
    parameter_list = []
    for i, param in enumerate(passed_params):
        print(f"arg_{i}: {param}")
        parameter_list.append(param)

    # NOTE: the index depends on the invocation; with
    # `docker exec mycontainer python execute.py create`
    # the command is sys.argv[1]
    command_string = parameter_list[1]
    if command_string == "create":
        # create an object which should live for a long time
        element_core = object()
    if command_string == "start":
        # perform a start() call on the object above
        element_core.start()
    if command_string == "stop":
        element_core.stop()
    if command_string == "diag":
        element_core.diag()


if __name__ == '__main__':
    parse_and_route_command(sys.argv)
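The catch with the routine above: each docker exec starts a fresh Python process, so the element_core created by "create" no longer exists when "start" runs. A small local experiment (no Docker needed, the miniature script is my own) makes that visible:

```python
import subprocess
import sys
import textwrap

# A miniature execute.py: "create" stores state in a module-level
# variable, "start" tries to use it afterwards.
SCRIPT = textwrap.dedent("""
    import sys
    element_core = None
    if sys.argv[1] == "create":
        element_core = object()
        print("created:", element_core is not None)
    if sys.argv[1] == "start":
        # element_core was created by a *previous* interpreter run,
        # not this one, so it is still None here
        print("have object:", element_core is not None)
""")

def run(arg):
    # each call spawns a fresh interpreter, just like `docker exec`
    result = subprocess.run([sys.executable, "-c", SCRIPT, arg],
                            capture_output=True, text=True)
    return result.stdout.strip()

print(run("create"))  # created: True
print(run("start"))   # have object: False
```

This is exactly why some shared state outside the process (the JSON file below) is needed.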

The host control script calls it like:

docker exec mycontainer python execute.py create

# Some time passes

docker exec mycontainer python execute.py start

element_core is an object which hosts a message loop and acts as an (emulated) server application. element_core holds the device logic, and I need a way to control it through docker exec calls into the container.
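element_core itself is not shown in the post; a minimal sketch of what such an object might look like (the method names match the routing above, everything else is my assumption):

```python
import threading
import time

class ElementCore:
    """Hypothetical device-logic object hosting a message loop."""

    def __init__(self, config: dict):
        self.config = config      # built from the shared JSON data
        self._running = False
        self._thread = None
        self.processed = 0        # example diagnostic counter

    def _loop(self):
        while self._running:
            # here the real object would receive/send UDP packets
            self.processed += 1
            time.sleep(0.01)

    def start(self):
        if not self._running:
            self._running = True
            self._thread = threading.Thread(target=self._loop, daemon=True)
            self._thread.start()

    def stop(self):
        self._running = False
        if self._thread:
            self._thread.join()

    def diag(self):
        return {"running": self._running, "processed": self.processed}

core = ElementCore({"node": "node1"})
core.start()
time.sleep(0.05)
core.stop()
print(core.diag())
```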

So far it works well, but I am a bit confused about how to control it via docker exec execute.py. I must create an object and let it live in the container so that I can later change its state, e.g. start or stop the loop it runs.

Update: I had a good comment from david-maze referring to a database to recreate the object. Yes, I already use a database to build up this object, and it holds other data as well. It is a memory which, via json, we can convert to a dictionary and back in a single call. I can write into this file from the execute.py script which is called over the network. The received command is interpreted as a JSON key-value pair, and the long-living element_core object reads and resets these fields in the JSON file.

Update: I am converging on a solution using the JSON file which is passed to this container anyway; element_core creation uses it to build itself up. Calls to this script set a "command_received": f"{value_from_parameter}" item in the JSON dict. The JSON file can be reloaded at negligible cost and used as a shared-memory dictionary in the container. element_core has to read and reset this item in the JSON file in its loop. I will move forward with this; thanks for the inputs, I have learned a lot. If you think I should consider something or have better ideas, let me know.
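The JSON-file handshake described above could be sketched like this (the file path and helper names are placeholders; in reality the file already holds the node configuration):

```python
import json
from pathlib import Path

CONFIG = Path("/tmp/node_config.json")   # the shared per-node JSON file

def send_command(value):
    """Called from execute.py on each `docker exec`:
    set the command item in the shared JSON dict."""
    data = json.loads(CONFIG.read_text()) if CONFIG.exists() else {}
    data["command_received"] = f"{value}"
    CONFIG.write_text(json.dumps(data))

def poll_command():
    """Called inside element_core's loop: read the command,
    then reset the item so it is handled only once."""
    data = json.loads(CONFIG.read_text()) if CONFIG.exists() else {}
    command = data.pop("command_received", None)
    if command is not None:
        CONFIG.write_text(json.dumps(data))   # reset the field
    return command

send_command("start")
print(poll_command())   # start
print(poll_command())   # None, already consumed
```

Note that this read-modify-write polling is not atomic; if both sides ever write at the same moment, writing to a temp file and renaming it over the original would make the handshake safer.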


