Home
Welcome to the JESSIE (Just Express Specifications, Synthesize, and Interact) wiki!
Here we provide an overview of the system, describe how to get started, and explain the structure of our files.
If you are interested in learning more, or if you use this system in your work, please cite and refer to [1].
For more details on LTLstack and its theoretical basis, please refer to [2].
[2] K. W. Wong and H. Kress-Gazit. From High-level Task Specification to Robot Operating System (ROS) Implementation. In 2017 First IEEE International Conference on Robotic Computing (IRC), pages 188-195. IEEE, 2017.
JESSIE is a robot system that enables clinicians to create customizable programs for social robots. It pairs a tangible programming language with control synthesis methods so that clinicians, who may not have prior programming experience, can customize therapies for individuals with mild cognitive impairment (MCI). Robot behaviors are abstracted into high-level actions that are easier to understand, especially for those without experience programming robots.
JESSIE comprises three main components:
- ROS nodes
- LTLstack for controller synthesis
- Our card-based tangible specification interface
First, the user specifies the robot's activities and behaviors with our tangible specification interface. Then, a specification file is created which includes the desired sensor and actuator nodes, the robot's initial conditions, event ordering, and sensor-reaction maps. Finally, LTLstack synthesizes a controller to execute the associated ROS nodes. (See figure below).
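The specification file generated in this step is a structured slugs file. As a rough sketch (the proposition names here are illustrative, not the actual card vocabulary), the ingredients listed above map onto the file's sections roughly as follows:

```
[INPUT]
# Sensor propositions published by sensor nodes
greetingComplete
playMusicComplete
scoreHigh

[OUTPUT]
# Activity propositions consumed by actuator nodes
greeting
playMusic

[SYS_INIT]
# The robot's initial conditions
!playMusic

[SYS_TRANS]
# Event ordering and sensor-reaction maps
!greetingComplete -> !playMusic'

[SYS_LIVENESS]
# The goal the controller must eventually reach
playMusicComplete
```

The [SYS_TRANS] and [SYS_LIVENESS] sections are discussed in more detail further down this page.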
We provide an implementation of this system for the Kuri and TurtleBot 2 robots; it should be readily adaptable to other ROS platforms.
JESSIE can be implemented on any robot that uses ROS. Updated system requirements and dependencies for installing LTLstack can be found on the LTLstack wiki page.
Installation instructions for LTLstack and its dependency slugs are explained in the attached LTLstack Installation file.
The wiki for LTLstack and included LTLstack documentation file both contain a tutorial overview for starting out with LTLstack.
Our tangible language consists of two kinds of cards: activities (blue) and sensors (orange). Activity cards represent the behaviors and actions that Kuri can perform, and sensor cards represent the stimuli that Kuri should respond to.
Each card depicts its title and a simple icon on the front, and a description of the card on the back.
To create a program, programmers may place the activity cards in any order, going from top-to-bottom, left-to-right. Sensor cards can be placed anywhere, as they run as a subroutine in parallel with the main activities. Programmers simply need to place the desired activity card below the sensor module, such that the arrow is pointing to it. Then, the sensor nodes will allow the robot to react to the associated stimuli throughout the program.
Each card also depicts a fiducial marker for automatic generation of specification files.
In the following example of a program made by a clinician, the robot starts off with a Greeting, Plays music, Gives word game instructions, then Plays the word game. If the person with MCI ends the game with a high score, the robot will congratulate them. The equivalent implementation in LTL is displayed to the right, and the associated slugs file (p6_anon.slugs) can be found under the mci_ltl folder under the Code_Materials directory.
A full list of cards and descriptions can be found in the Activity_Modules_Overview file.
We provide a script (qrcode_reader.py) to automatically translate from the tangible interface to LTL specifications. The script runs under Python 3 and uses the argparse module from the standard library, along with the pyzbar and cv2 (OpenCV) libraries.
Once the program has been laid out, make sure the QR tag of each card is visible. Take a picture of the desired program and upload it to the robot (or wherever the qrcode_reader.py script is stored).
Then, programmers can run the script with the command
python3 qrcode_reader.py directory_to_program_image [output_directory]
Note that the output_directory argument is optional. If unspecified, the specification will be saved to a new file called mci_ltl.slugs in the same directory as the Python script.
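Once the QR tags have been decoded, the translation from card order to LTL is mechanical. The sketch below is illustrative only (it is not the actual qrcode_reader.py; the function name and card list are hypothetical), showing how a top-to-bottom card order could become the [SYS_TRANS] ordering constraints shown later on this page:

```python
def ordering_constraints(activities):
    """Turn an ordered list of activity names into structured slugs
    [SYS_TRANS] ordering constraints: each activity may not start
    until the previous one has signaled completion."""
    lines = []
    for prev, nxt in zip(activities, activities[1:]):
        lines.append("!{}Complete -> !{}'".format(prev, nxt))
    return lines

# Card order from the example clinician program above
cards = ["greeting", "playMusic", "wordGameInstructions", "wordGame"]
for line in ordering_constraints(cards):
    print(line)  # prints the three ordering constraints from the example
```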
Programmers can extend the possible robot behaviors (activities and sensor modules) by implementing new ROS nodes. A proposition node can be written by following the tutorial on the ROS website, or by following step 5 in the LTLstack documentation file. Note that these nodes must be added to the YAML file, as described in the LTLstack wiki page.
Additionally, ROS nodes for propositions follow a particular pattern. Examples of complete nodes that we used for JESSIE can be found in the proposition_nodes folder under the Code_Materials directory. Below, we explain each part of the file, using the Whistle node (whistle.py), which we implemented in Python 2.
The ltlStack class contains a callback method, which is called whenever the node receives data on the associated ROS topic, as well as a data member that stores the most recent value received.
class ltlStack(object):
    def __init__(self):
        self.request_data = False

    def callback(self, data):
        self.request_data = data.data
First, initialize the ROS node and set up publishers and subscribers to the ROS topics.
if __name__ == '__main__':
    rospy.init_node('whistle')
    rate = rospy.Rate(10)
    ltl = ltlStack()       # LTLstack object for the controller
    complete = ltlStack()  # LTLstack object for the associated complete node

    # Subscribe to ROS topics
    rospy.Subscriber('/mci_ltl/outputs/whistle', Bool, callback=ltl.callback)
    rospy.Subscriber('/mci_ltl/inputs/whistleComplete', Bool, callback=complete.callback)

    # Publish to ROS topics
    anim_pub = rospy.Publisher('/command', Command, queue_size=1)
    done_pub = rospy.Publisher('/mci_ltl/whistleSignalComplete', Bool, queue_size=1)
    rospy.sleep(0.5)
The desired robot behavior is specified inside the while loop and is executed when the LTLstack-generated controller sends a message requesting it.
    while not rospy.is_shutdown():
        if ltl.request_data:
            rospy.loginfo("About to start whistling")
            publish_animation(anim_pub, "reset_head")
            rospy.sleep(0.5)
            publish_animation(anim_pub, "wolf_whistle")
            rospy.sleep(0.5)

            # Signal that the event is complete
            while not complete.request_data:
                done_pub.publish(True)
                rospy.sleep(3)
        rate.sleep()
To create a new LTLstack package, follow steps 6 and 7 in the LTLstack documentation file. To run it, follow step 8. Programmers can write slugs specification files to specify the order of execution of activity nodes as well as map sensor nodes to activities. For more information on writing slugs files, see the attached Slugs_Extensible_GR(1)_Synthesis file. Full examples of slugs files can be found in the mci_ltl folder under the Code_Materials directory. Below, we explain the parts of the structured slugs file that the programmer can change to alter the robot's behavior.
In order to change the order of activity nodes as well as the robot's reactions, programmers can add propositions under the [SYS_TRANS] section. (A primed proposition, e.g. playMusic', refers to that proposition's value in the next time step.) Below is the aforementioned example written out in structured slugs.
# Ordering constraints
!greetingComplete -> !playMusic'
!playMusicComplete -> !wordGameInstructions'
!wordGameInstructionsComplete -> !wordGame'
# Reaction constraints
(scoreHigh' & !congratulateComplete') -> congratulate'
Finally, specify the final node to run under the [SYS_LIVENESS] section.
# Finish the word game
wordGameComplete