Tommy Persson authored

lrs_doc

Documentation for ROS2 version of LRS (Linköping Robotic System).

LRS is a collection of programs used for research at Linköping University. This page gives an introduction to Task Specification Trees (TST) and shows how to integrate a robot using ROS2 in the LRS system so that it can execute a given TST.

For now this is mostly shown through example code in Python using rclpy, together with instructions for how to run the example system.

1. Background

1.1 Task Specification Tree (TST)

1.1.1 Introduction

Task Specification Trees (TSTs) provide a way of specifying tasks and missions to be executed by a team of collaborating agents. Each TST consists of nodes that declaratively specify tasks to perform, where:

  • Inner tree nodes specify standardized control structures such as sequences (S), concurrency (C), conditionals (IF) and loops (WHILE), which are directly supported by the framework. For example, a sequence node ensures that its children are executed in a strict sequence, while a concurrency node allows the independent tasks represented by its children to be executed concurrently, if there are sufficient platforms and resources to do so.

  • Leaf nodes specify potentially domain-specific tasks to be executed, such as flying to a location or scanning an area for people. Each node is associated with a set of named parameters, such as the area that should be scanned.

These specifications are completely declarative: They specify a task to be performed, together with a set of parameters, and the associated documentation specifies how the task and its parameters should be interpreted. A TST does not in itself provide any information about how to actually perform a task on a particular platform, because this is necessarily platform-specific.

The following is an example TST shown in a graphical representation, showing the general tree structure and the type of each node but omitting many details such as node parameters and constraints:

For each node the execution agent (also called execution unit or execution namespace) is specified. We use ROS2 namespaces for this: if a node specifies "execunit" as "/ex0", then the corresponding task should be executed on the agent with ROS2 namespace "/ex0". An agent can also be specified by a symbol like "A" in the image above, and we can then specify constraints on "A". Notice also that a tree can specify different agents for different nodes. In that case the tree is distributed to all agents that participate in the execution of the tasks in the tree, and each task is executed on the right agent.

During execution of a tree we distribute information about the execution so that tasks are executed in the right order.

Here is documentation (generated from the specifications) of the different node types available:

1.1.2 JSON Specification

JSON is used to specify a TST. Here is an example of a sequence node with one child which is a fly-to:

{
    "children": [
        {
            "children": [],
            "common_params": {
                "execunit": "/ex0",
                "use_lock": false
            },
            "name": "fly-to",
            "params": {
                "commanded-speed": 7.0,
                "do-not-yaw-flag": false,
                "p": {
                    "altitude": 39.8,
                    "latitude": 57.76080010943341,
                    "longitude": 16.6828402386254,
                    "rostype": "GeoPoint"
                }
            }
        }
    ],
    "common_params": {
        "execunit": "/ex0",
        "use_lock": false
    },
    "name": "seq",
    "params": {}
}

The intended meaning of the different parts is:

  • name: The name of the node type being specified.
  • common_params: Parameters common to all node types.
  • params: Parameters that are specific to this node type.
  • tst-params: Same meaning as params.
  • task-params: Same meaning as params.
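As an illustration, a tree like the JSON example above can be constructed programmatically before being handed over for execution. This is a plain-Python sketch; the make_node helper is hypothetical and not part of LRS:

```python
import json

def make_node(name, execunit, params=None, children=None):
    # Mirror the JSON structure shown above: every node carries a name,
    # common_params with the execution unit, node-specific params and children.
    return {
        "name": name,
        "common_params": {"execunit": execunit, "use_lock": False},
        "params": params or {},
        "children": children or [],
    }

fly_to = make_node("fly-to", "/ex0", params={
    "commanded-speed": 7.0,
    "do-not-yaw-flag": False,
    "p": {"altitude": 39.8, "latitude": 57.76080010943341,
          "longitude": 16.6828402386254, "rostype": "GeoPoint"},
})
tree = make_node("seq", "/ex0", children=[fly_to])
print(json.dumps(tree, sort_keys=True, indent=4))
```

Printing with sort_keys and indent reproduces the same layout as the JSON example above.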

We define the TST node types in the system using JSON files. These files were used to generate the documentation above.

For examples see: https://gitlab.liu.se/lrs2/lrs_exec/-/tree/main/config

Here is how the node type move-to is defined:

{
    "type": "tst node",    
    "desc": "This node moves the agent to a specific position. It is allowed to prepare for moving inside the node, for example doing take-off before flying. Also the specified altitude is the minimal altitude, the flying should find postiions higher if the position is occupied with obstacles.",
    "key": "move-to",
    "status": "released",
    "tags": [
        "vehicle"
    ],
    "tst-params": [
    ],
    "task-params": [
        {
            "desc": "The position to move to",
            "key": "waypoint",
            "type": "geopoint"
        },
        {
            "desc": "Qualitative speed level.  Possible values are 'fast', 'standard' and 'slow', and the meaning of these parameters is platform-dependent.",
            "key": "speed",
            "type": "string"
        }
    ],
    "node-type": "terminal",
    "required-params-for-execution": [],
    "node-resources": []
    
}
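Given such a node-type specification, a node's parameters can be sanity-checked before execution, for example to catch misspelled parameter names. A minimal sketch; unknown_params is a hypothetical helper, not part of LRS:

```python
def unknown_params(spec, params):
    # Return parameter names that are not declared in the node-type
    # specification (neither as tst-params nor as task-params).
    declared = {p["key"] for p in spec["tst-params"] + spec["task-params"]}
    return sorted(set(params) - declared)

move_to_spec = {
    "key": "move-to",
    "tst-params": [],
    "task-params": [
        {"key": "waypoint", "type": "geopoint"},
        {"key": "speed", "type": "string"},
    ],
}

# "sped" is a typo for "speed" and is reported as unknown.
print(unknown_params(move_to_spec, {"waypoint": {}, "sped": "fast"}))
```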

1.1.3 Example: Sequence of three fly-to

Example with three different agents (/ex0, /ex1, /ex2) each executing a fly-to. Note that the inner node is conc, so the three fly-to tasks are executed concurrently inside the outer sequence:

{
    "children": [
        {
            "children": [
                {
                    "children": [],
                    "common_params": {
                        "execunit": "/ex0",
                        "use_lock": false
                    },
                    "name": "fly-to",
                    "params": {
                        "commanded-speed": 7.0,
                        "do-not-yaw-flag": false,
                        "p": {
                            "altitude": 39.8,
                            "latitude": 57.76080010943341,
                            "longitude": 16.6828402386254,
                            "rostype": "GeoPoint"
                        }
                    }
                },
                {
                    "children": [],
                    "common_params": {
                        "execunit": "/ex1",
                        "use_lock": false
                    },
                    "name": "fly-to",
                    "params": {
                        "commanded-speed": 7.0,
                        "do-not-yaw-flag": false,
                        "p": {
                            "altitude": 39.8,
                            "latitude": 57.76080010943341,
                            "longitude": 16.6828402386254,
                            "rostype": "GeoPoint"
                        }
                    }
                },
                {
                    "children": [],
                    "common_params": {
                        "execunit": "/ex2",
                        "use_lock": false
                    },
                    "name": "fly-to",
                    "params": {
                        "commanded-speed": 7.0,
                        "do-not-yaw-flag": false,
                        "p": {
                            "altitude": 39.8,
                            "latitude": 57.76080010943341,
                            "longitude": 16.6828402386254,
                            "rostype": "GeoPoint"
                        }
                    }
                }
            ],
            "common_params": {
                "execunit": "/ex0",
                "use_lock": false
            },
            "name": "conc",
            "params": {}
        }
    ],
    "common_params": {
        "execunit": "/ex0",
        "use_lock": false
    },
    "name": "seq",
    "params": {}
}
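Since different nodes can name different execution units, it is useful to know which agents a tree has to be distributed to. A plain-Python sketch that walks the JSON tree above and collects the namespaces:

```python
def exec_units(node):
    # Recursively collect the ROS2 namespaces of all agents that
    # participate in executing the tree.
    units = {node["common_params"]["execunit"]}
    for child in node["children"]:
        units |= exec_units(child)
    return units

def leaf(name, unit):
    # Minimal leaf node in the JSON structure used above.
    return {"name": name, "children": [],
            "common_params": {"execunit": unit, "use_lock": False}}

conc = {"name": "conc",
        "common_params": {"execunit": "/ex0", "use_lock": False},
        "children": [leaf("fly-to", "/ex0"),
                     leaf("fly-to", "/ex1"),
                     leaf("fly-to", "/ex2")]}

print(sorted(exec_units(conc)))  # the agents the tree is distributed to
```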

1.2 Executors

For a robot/agent, executors must be implemented for all node types that specify robot-specific tasks. For example, node types like fly-to, move-to, look-at-position and so on are robot/agent-specific. Common types like seq (S), conc (C), wait, send-signal and so on are implemented in the system (in lrs_exec).

Here is how the noop executor is implemented in Python:

import time
import rclpy
from rclpy.executors import MultiThreadedExecutor

from lrs_exec.ticked_executor import TickedExecutor
from lrs_exec.ticked_executor_factory import TickedExecutorFactory

class NoopExecutor(TickedExecutor):
    def __init__(self, node, id):
        super().__init__(node, id)
        
    def do_work(self):
        return

class NoopExecutorFactory(TickedExecutorFactory):
    def __init__(self):
        super().__init__("no-op")

    def get_executor(self, node, id):
        return self.add_executor(NoopExecutor(node, id))

def main(args=None):
    rclpy.init(args=args)
    
    executor = MultiThreadedExecutor(num_threads=8)    
    
    node = NoopExecutorFactory()
    
    executor.add_node(node)    
    print(f"Spinning {__name__}")
    try:
        executor.spin()
    except KeyboardInterrupt:
        pass

    executor.shutdown()
    rclpy.shutdown()

if __name__ == '__main__':
    main()

An executor factory can be started in different ways. For starting each factory from a launch file, the main function is needed.

1.3 Node Expansion

An executor can be of a type where the node is expanded before execution. The expansion must assign execution units to all new nodes created in the expansion, and it must be possible to perform the expansion before execution of the whole TST tree starts. Also, if the node to be expanded is on unit X, then all non-terminal nodes in the expansion need to have X as their execution unit.

There is a tree execution server on every agent that takes a service call with the whole JSON tree to expand. The first step in this execution is to expand the terminal nodes that should be executed on the agent running the execution tree. The actual expansion is done in the ExecutorFactory class for a node type.

Example of how to implement an expandable executor for in-air-goal:


import json

from lrs_exec.executor_factory import ExecutorFactory

from lrs_util.jsonutil import *

class InAirGoalExecutorFactory(ExecutorFactory):
    def __init__(self):
        super().__init__("in-air-goal", expandable=True)
        
    def json_expand(self, node_json):
        res = ""
        if not self.expandable:
            return res
        try:
            jobj = json.loads(node_json)
            name = get_name(jobj)
                    
            self.get_logger().info(f"Before expansion: {json.dumps(jobj, sort_keys=True, indent=4, separators=(',', ': '))}")
            exec_unit = get_exec_unit(jobj)
            self.get_logger().info(f'expand_in_air_goal EXECUTION UNIT: {exec_unit}')
        
            expobj = json_basic_test_if(exec_unit)
            expobj["children"].append(json_basic_in_air_test(exec_unit))
            expobj["children"].append(json_basic_noop(exec_unit))
            expobj["children"].append(json_basic_take_off(exec_unit))
            expobj["children"].append(json_basic_noop(exec_unit))
            res = json.dumps(expobj, sort_keys=True, indent=4, separators=(',', ': '))
        except Exception as exc:
            self.get_logger().error(f'EXCEPTION start: {type(exc)} - {exc}')            
        return res

1.5 Time

It is also possible to specify time information in a node. The following time parameters are available:

  • stime_lb : Start time lower bound
  • stime_ub : Start time upper bound
  • etime_lb : End time lower bound
  • etime_ub : End time upper bound
  • duration_lb : Duration lower bound
  • duration_ub : Duration upper bound

There are also some other parameters available for handling time:

  • wait_for_stime : Boolean. If true, execution waits for stime_lb before the node is started.
  • wait_for_etime : Boolean. If true, execution waits for etime_lb before the node is finished.
  • tree_start_time : If set, the specified times are relative to tree_start_time. Otherwise the specified times are absolute.
  • node_start_time : The actual time the node execution was started.

The default is that wait_for_stime and wait_for_etime are true unless explicitly set to false. If wait_for_stime is false, then we will fail if we try to start a node before stime_lb. If wait_for_etime is false, then we will fail if the node finishes before etime_lb.

If we cannot start before stime_ub the node fails. If the execution finishes after etime_ub the node also fails.

The parameters above can be set in the common_params block. If a parameter is not set, lb times are assumed to be 0 and ub times are assumed to be "infinity", i.e. the largest allowed time. In practice the largest allowed time, assuming integer time, is usually set to 3600 seconds, but that is just to make the constraint solving more efficient.
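Since this functionality is not yet fully implemented, the following is only a sketch of how the start-time gating described above could behave, under the stated defaults (wait_for_stime true unless set to false):

```python
def start_decision(now, stime_lb, stime_ub, wait_for_stime=True):
    # Decide whether a node may start at time 'now' given its start bounds.
    # Returns ("wait", seconds), ("start", 0.0) or ("fail", 0.0).
    if now > stime_ub:
        return ("fail", 0.0)                 # too late to start: fail the node
    if now < stime_lb:
        if wait_for_stime:
            return ("wait", stime_lb - now)  # wait until the lower bound
        return ("fail", 0.0)                 # early start not allowed
    return ("start", 0.0)
```

With stime_lb=10 and stime_ub=20, calling at now=5 yields ("wait", 5.0), at now=15 yields ("start", 0.0), and at now=25 yields ("fail", 0.0).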

Issues:

  • Do we need to be able to set an explicit end time, or is it enough to set etime_lb = etime_ub = ETIME?

The functionality above is not yet fully implemented.

1.6 Constraints

We can add a set of constraints to each node. If we use timed execution we can for example add a constraint like:

  • etime < 600 : with the intended meaning that the end time for the node should be less than 600 seconds after the tree start time.

The following constraints are automatically added for all nodes if times are used.

  • stime_lb < stime_ub
  • etime_lb < etime_ub
  • duration_lb < duration_ub
  • stime_ub + duration_ub < etime_ub
  • stime_lb + duration_lb < etime_lb

If the automatically generated constraints have a solution, we can execute the tree. We can also add additional constraints, such as etime_ub < 600, and check that a solution still exists. The constraints can also be used to compute specific values for parameters, which can then be added to the tree.
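The automatically generated constraints above can be checked directly for a given set of concrete bounds. A pure-Python sketch (a full solver would instead search for values satisfying them):

```python
def bounds_consistent(stime_lb, stime_ub, etime_lb, etime_ub,
                      duration_lb, duration_ub):
    # The five automatically generated constraints listed above.
    return (stime_lb < stime_ub
            and etime_lb < etime_ub
            and duration_lb < duration_ub
            and stime_ub + duration_ub < etime_ub
            and stime_lb + duration_lb < etime_lb)

print(bounds_consistent(0, 10, 6, 31, 5, 20))  # consistent bounds
print(bounds_consistent(0, 10, 6, 25, 5, 20))  # 10 + 20 >= 25: inconsistent
```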

To be extended...

Not yet fully implemented.

1.7 Signals

To be written...

1.8 Communication

See: https://gitlab.liu.se/lrs2/lrs_doc/-/blob/main/communication.md

2. Test Examples

2.1 Preparations

Install ROS2 Humble (it will probably also work in Galactic). See:

https://docs.ros.org/en/humble/Installation/Ubuntu-Install-Debians.html

In your ROS2 workspace you need to install the following packages:

git clone https://github.com/teamspatzenhirn/spatz_interfaces.git
git clone https://github.com/teamspatzenhirn/rviz_birdeye_display.git
git clone https://gitlab.liu.se/lrs2/lrs_msgs_common.git
git clone https://gitlab.liu.se/lrs2/lrs_msgs_tst.git
git clone https://gitlab.liu.se/lrs2/lrs_srvs_exec.git
git clone https://gitlab.liu.se/lrs2/lrs_srvs_tst.git
git clone https://gitlab.liu.se/lrs2/lrs_srvs_ra.git
git clone https://gitlab.liu.se/lrs2/lrs_srvs_wdb.git
git clone https://gitlab.liu.se/lrs2/lrs_exec.git
git clone https://gitlab.liu.se/lrs2/lrs_util.git
git clone https://gitlab.liu.se/lrs2/lrs_wdb.git
git clone https://gitlab.liu.se/lrs2/lrs_turtle.git
git clone https://gitlab.liu.se/lrs2/lrs_ardupilot.git

Make sure that the following is installed:

sudo apt install ros-humble-turtlesim ros-humble-geographic-msgs ros-humble-gazebo-msgs python3-colcon-ros python3-colcon-bash

2.2 Simple Example

2.2.1 Start Core System

Do the following:

ros2 launch lrs_exec exec.launch.py ns:=/ex0

This will start the following programs/nodes:

  • A resource node that keeps track of locking.
  • TST Exec Factory: a factory that creates and stores the TST trees used for execution.
  • Basic Executors: starts executor factories for seq, conc, wait, noop, test_if, supp and send_signal.
  • Tree executor: the program that handles the execution of a TST tree.
  • A world database and coordinate conversion functionality.

2.2.2 Start TST display server (ROS2)

The created TST trees and how they are executed can be visualized as described below. Only the latest executing tree can be visualized. The color meaning is:

  • green: we are executing.
  • red: execution failed for some reason.
  • blue: execution was aborted by sending an $abort signal to the executor.
  • yellow: execution succeeded.
  • gray: we finished a child node under a supp node.

Start the TST image generator

ros2 launch lrs_exec tst_streamer.launch.py ns:=/ex0

This will generate images on the topic: /ex0/tststream/image_raw/compressed

Image viewer:

ros2 run rqt_image_view rqt_image_view

2.2.3 Start Example TST Execution

ros2 run lrs_exec tst_command --ros-args -r __ns:=/ex0 -p command:=test-seq-1

This will construct a tree in the TST factory and start executing it. The tree as JSON is:

{
    "children": [
        {
            "children": [],
            "common_params": {
                "execunit": "/ex0",
                "use_lock": false
            },
            "name": "wait",
            "params": {
                "duration": 10
            }
        },
        {
            "children": [],
            "common_params": {
                "execunit": "/ex0",
                "use_lock": false
            },
            "name": "wait",
            "params": {
                "duration": 10
            }
        },
        {
            "children": [],
            "common_params": {
                "execunit": "/ex0",
                "use_lock": false
            },
            "name": "wait",
            "params": {
                "duration": 10
            }
        }
    ],
    "common_params": {
        "execunit": "/ex0",
        "use_lock": false
    },
    "name": "seq",
    "params": {}
}
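Since a seq node runs its children one after another, the nominal duration of this tree is the sum of the three waits, i.e. 30 seconds. A sketch that computes nominal durations by walking the JSON; only wait durations are known here, other node types are platform-dependent and counted as 0:

```python
def nominal_duration(node):
    # seq: children run back to back; conc: children run in parallel.
    if node["name"] == "seq":
        return sum(nominal_duration(c) for c in node["children"])
    if node["name"] == "conc":
        return max((nominal_duration(c) for c in node["children"]), default=0)
    if node["name"] == "wait":
        return node["params"].get("duration", 0)
    return 0  # other node types: duration is platform-dependent

def wait(seconds):
    # Minimal wait node with just the fields this sketch needs.
    return {"name": "wait", "children": [], "params": {"duration": seconds}}

tree = {"name": "seq", "params": {},
        "children": [wait(10), wait(10), wait(10)]}
print(nominal_duration(tree))  # 30
```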

Look at the example program for how to create and execute TST trees.

2.3 Turtle example ROS2 and Task Specification Trees

Start the TST streamer and viewer as above.

2.3.1 One turtle

2.3.1.1 Start Turtle Sim

ros2 run turtlesim turtlesim_node --ros-args -r __ns:=/ex0

or start a Python implementation without graphics and with an unlimited area to move in:

ros2 launch lrs_turtle pyturtle.launch.py ns:=/ex0

2.3.1.2 Start TST Executors for /ex0 turtlesim and rest of the system

ros2 launch lrs_turtle turtle.launch.py ns:=/ex0

This implements an executor for move-to. Notice that the valid area runs from bottom left (0.5, 0.5) to top right (10.0, 10.0). Trying to move to or across that border will make the controller stop working for future commands. For now, use teleport to recover from this.
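Since moving across the border leaves the controller in a bad state, it can be worth clamping goals on the commanding side before issuing a move-to. A small defensive sketch, not part of lrs_turtle:

```python
VALID_MIN = (0.5, 0.5)    # bottom left of the valid turtlesim area
VALID_MAX = (10.0, 10.0)  # top right of the valid turtlesim area

def clamp_goal(x, y):
    # Keep a commanded move-to goal inside the valid area.
    cx = min(max(x, VALID_MIN[0]), VALID_MAX[0])
    cy = min(max(y, VALID_MIN[1]), VALID_MAX[1])
    return cx, cy

print(clamp_goal(12.0, -1.0))  # (10.0, 0.5)
```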

2.3.1.3 Test with move-to

ros2 run lrs_exec tst_command --ros-args -r __ns:=/ex0 -p command:=move -p x:=7.0 -p y:=8.0 -p sspeed:="standard" -p unit:=/ex0
ros2 run lrs_exec tst_command --ros-args -r __ns:=/ex0 -p command:=test-turtle-1 -p unit:=/ex0

If stuck teleport:

ros2 service call /ex0/turtle1/teleport_absolute turtlesim/srv/TeleportAbsolute "{x: 1.0, y: 1.0, theta: 0.0}"

Currently only move-to and move-path are implemented for the turtlesim.

2.3.2 Two turtles

Implementation fix needed...

2.4 Ardupilot

For how to test the ardupilot functionality with a copter, see: https://gitlab.liu.se/lrs2/lrs_ardupilot/-/blob/main/README.md

Additional Information

Many additional properties of TSTs are described here: