RobotSim

An OOP-based Simulation Package for the Lego NXT/EV3 Robot

(JavaDoc)

Robot applications are usually run in software before being tested on the robot itself. This saves not only time, but also cost. The software simulator should be designed so that no or only few modifications of the simulation program are necessary to run it on the real machine. The standard Lego NXT/EV3 robot is a mobile robot for the educational environment using a Lego brick microcontroller mounted on a chassis with two motors. It can be equipped with several sensors, like touch sensors, light sensors, etc. The microcontroller can be programmed in different languages, among them Java and Python. The wrapper libraries NxtJLib and EV3JLib are available and are fully object oriented.

In the classroom a robot simulator may be useful not only for demonstrating robot application development but also when there are not enough NXT/EV3 bricks available for all students. The simulation package RobotSim presented here is based on the JGameGrid game development framework, and the source code of the package is made public to be studied and modified by the users. The design closely follows the NxtJLib package for the NXT brick in remote control mode and the EV3JLib/EV3JLibA packages for the EV3 in remote control and autonomous mode.

The core class LegoRobot has a very clean and simple public interface but does most of the work using a private inner class Nxt that extends Actor. As in NxtJLib/EV3JLib, touch and light sensors are derived from the abstract class Part, which also extends Actor. The robot's playground is a GameGrid instance with one-pixel resolution, kept as a private instance variable of LegoRobot. This design hides most of the underlying JGameGrid package, and for basic simulations the student does not need any knowledge of JGameGrid. To fully understand the possibilities of RobotSim, take a look at the applications displayed below. The first one is the most elementary application: it simply creates a robot with two motors and moves it by rotating the motors forward and backward.

# SimEx01.py
# Motors

from simrobot import *                         
  
robot = LegoRobot()
motA = Motor(MotorPort.A)
motB = Motor(MotorPort.B)
robot.addPart(motA)
robot.addPart(motB)

motA.forward()
motB.forward()
Tools.delay(2000)

motA.stop()
Tools.delay(2000)

motB.stop()
Tools.delay(2000)

motA.backward()
motB.forward()
Tools.delay(2000)

motB.backward()
Tools.delay(2000)

robot.exit()
Select SimEx01.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

In the next example we use a Gear instance instead of two motors. This simplifies the program considerably. The movement commands forward(), left(), leftArc(), etc. exist in two versions: one returns immediately while the movement continues, the other takes a time parameter (in milliseconds), blocks until the time interval has elapsed, and then stops the movement.

# SimEx02.py
# Gear
from simrobot import *                         
  
robot = LegoRobot()
gear = Gear()
robot.addPart(gear)

gear.forward()
Tools.delay(2000)
gear.left(2000)
gear.forward(2000)
gear.leftArc(0.2, 2000)
gear.forward(2000)
gear.leftArc(-0.2, 2000)
robot.exit()
Select SimEx02.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

Using a Logo-like turtle robot from the class TurtleRobot simplifies the program even more. No time parameter is needed to produce a given rotation angle. Keep in mind that due to rounding errors the angle has some inaccuracy that shows up in a long-lasting movement.

# SimEx03.py
# TurtleRobot
from simrobot import *                         

robot = TurtleRobot()
for i in range(4):
  robot.forward(100)
  robot.left(90)
robot.exit()
Select SimEx03.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

Now we add sensors to the robot. These are modeled by subclasses of the Part class and must be "added" to the robot instance using the addPart() method. The sensor constructor takes a SensorPort reference that indicates where the sensor cable is plugged in. In RobotSim it also defines where the sensor is shown: SensorPort.S1 is shown at the robot's front right side, SensorPort.S2 at the left side and SensorPort.S3 in the middle.

Sensors may be polled (touch sensors report pressed/released, light sensors the intensity level) or they may trigger an event (touch sensors when they are pressed or released, light sensors when the intensity crosses a predefined trigger level from dark to bright or from bright to dark).
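
The two styles can be sketched as follows (a minimal sketch only, assuming a touch sensor on SensorPort.S3 and a gear that are added to a robot as in the complete examples SimEx04 and SimEx05 below):

# Sketch only: polling style
ts = TouchSensor(SensorPort.S3)
robot.addPart(ts)
while not robot.isEscapeHit():
  if ts.isPressed():
    gear.backward(1200)   # react to the pressed state
  Tools.delay(50)         # do not poll at full speed

# Sketch only: event style, callback registered with a named parameter
def onPressed(port):
  gear.backward(1200)     # called when the sensor is pressed

ts = TouchSensor(SensorPort.S3, pressed = onPressed)
robot.addPart(ts)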

Since in reality constructing the obstacles is not part of the application, in RobotSim the setup of the robot environment is performed with static methods of the RobotContext class at the beginning of the program; these lines may easily be deleted or commented out for the real robot.

(Using static instances in Jython is somewhat dangerous, because initialization is done when the class is loaded. This happens when the TigerJython IDE starts. If you get unexpected results after you change your code, restart the IDE.)

In the next example four obstacles are shown (bars defining a rectangular area). The robot, equipped with a touch sensor, moves back and turns if the touch sensor is pressed. The standard JGameGrid navigation bar is shown; it allows the developer to pause the simulation or execute it in single steps. Increasing the simulation speed may cause trouble, because the obstacles may be missed.

# SimEx04.py
# One touch sensor, polling
from simrobot import *                         

RobotContext.showNavigationBar()
RobotContext.useObstacle("sprites/bar0.gif", 250, 200)
RobotContext.useObstacle("sprites/bar1.gif", 400, 250)
RobotContext.useObstacle("sprites/bar2.gif", 250, 400)
RobotContext.useObstacle("sprites/bar3.gif", 100, 250)

robot = LegoRobot()
gear = Gear()
ts = TouchSensor(SensorPort.S3)
robot.addPart(gear)
robot.addPart(ts)
gear.setSpeed(30)
gear.forward()
while not robot.isEscapeHit():
  if ts.isPressed():
    gear.backward(1200)
    gear.left(750)
    gear.forward()
robot.exit()
Select SimEx04.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

When the real robot touches an obstacle somewhere other than at its touch sensor, an unpredictable movement results, so we cannot simulate what happens. We consider this an erroneous application that should be avoided. In RobotSim, collisions between the robot image and an obstacle produce an event and an error message is displayed in the window's title bar. This happens in the last example when the backward() time parameter is reduced to 600 ms.

The following equivalent program uses events reported by the onPressed() callback, which is easily registered with Jython's named constructor parameters.

# SimEx05.py
# One touch sensor, event driven
from simrobot import *                         

RobotContext.init()
RobotContext.showNavigationBar()
RobotContext.useObstacle("sprites/bar0.gif", 250, 200)
RobotContext.useObstacle("sprites/bar1.gif", 400, 250)
RobotContext.useObstacle("sprites/bar2.gif", 250, 400)
RobotContext.useObstacle("sprites/bar3.gif", 100, 250)

def pressed(port):
  gear.backward(1200)
  gear.left(750)
  gear.forward()

robot = LegoRobot()
gear = Gear()
ts = TouchSensor(SensorPort.S3, pressed = pressed)
robot.addPart(gear)
robot.addPart(ts)
gear.setSpeed(30)
gear.forward()
Select SimEx05.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

In the well-known channel robot application, the robot has two touch sensors and must find its way through a channel. Depending on the shape of the channel, it may be blocked or even forced to turn around. It is a tricky and interesting task to find an algorithm that works with different channels. The RobotSim package includes a few predefined obstacles that may be displayed using RobotContext.useObstacle().

It is bad design to use endless loops without the possibility to break out. Since the GameGrid method isDisposed() returns True when we hit the title bar's close button, it can be used to terminate the while loop. When polling in a loop, it is highly recommended to slow down the poll period using Tools.delay(); otherwise you waste a lot of CPU time and a slower computer may even get blocked.
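
Such a throttled polling loop may be sketched as follows (a sketch only, assuming a touch sensor ts and a gear have already been added to the robot):

# Sketch only: throttled polling loop
while not robot.isEscapeHit():
  if ts.isPressed():
    gear.backward(500)
    gear.left(400)
    gear.forward()
  Tools.delay(10)   # yield CPU time between polls
robot.exit()

Below we simulate a channel robot using a very simple algorithm that does not work in every case.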

# SimEx06.py
# Two touch sensors, complex track
from simrobot import *                         

RobotContext.setLocation(10, 10)
RobotContext.setStartDirection(5)
RobotContext.setStartPosition(100, 240)
RobotContext.useObstacle(RobotContext.channel)
   
robot = LegoRobot()
gear = Gear()
robot.addPart(gear)
ts1 = TouchSensor(SensorPort.S1) # right sensor
ts2 = TouchSensor(SensorPort.S2) # left sensor
robot.addPart(ts1)
robot.addPart(ts2)
gear.forward()
while not robot.isEscapeHit():
  t1 = ts1.isPressed()
  t2 = ts2.isPressed()
  if t1 and t2:
    gear.backward(500)
    gear.left(400)
    gear.forward()
  elif t1:
    gear.backward(500)
    gear.left(400)
    gear.forward()
  elif t2:
    gear.backward(500)
    gear.right(100)
    gear.forward()
robot.exit()
Select SimEx06.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

The next example introduces light sensors, which measure the intensity of light falling on a small spot (2-3 mm in diameter) of a photodiode. In the class LightSensor the spot is just one pixel wide and the sensor reports the intensity of the background pixel at this point (the brightness in the HSB color model, see the relationship between RGB and HSB, scaled to 0..1000). A light sensor may be polled to get the current light level. As seen in the following example, the callback methods onBright() and onDark() may be registered using named parameters in the LightSensor constructor. As with TouchSensor, the SensorPort reference passed to the constructor indicates where the sensor cable is plugged in and defines where the sensor is shown: SensorPort.S1 at the robot's front right side, SensorPort.S2 at the left side and SensorPort.S3 in the middle.

In many standard robot examples the light sensor is mounted to detect the light reflected from the floor, on which dark or bright areas are drawn. In RobotSim the GGBackground of the underlying GameGrid is used, so some basic knowledge of how to draw a background image is necessary. In order to separate this setup code from the rest of the program, which should compile and run on the real robot, the code is put into the method init(gg). To display the background, init(robot.getGameGrid()) must be called. By deleting or commenting out this line, the code also runs on the real robot.

In the following example the robot should follow the border of a black area. The program is event driven and the callback methods make the robot move on a right or left arc.

# SimEx07.py
# One light sensor, event driven
from simrobot import *                         

def init(gg):
  bg = gg.getBg()
  bg.setPaintColor(makeColor("black"))
  bg.fillArc(Point(250, 250), 50, 0, 360)
  bg.fillArc(Point(250, 350), 100, 0, 360)

def onBright(port, level):
  gear.rightArc(0.15)

def onDark(port, level):
  gear.leftArc(0.15)

robot = LegoRobot()
init(robot.getGameGrid())
gear = Gear()
robot.addPart(gear)
ls = LightSensor(SensorPort.S3, bright = onBright, 
                                dark = onDark)
robot.addPart(ls)
gear.setSpeed(30)
gear.forward()
while not robot.isEscapeHit():
   pass
robot.exit()
Select SimEx07.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

In the next example we simulate the classic "track follower", where the robot must move along a black track drawn on the floor. Because two sensors are used, it is better to poll the sensors than to use events. Again the polling loop is terminated when the window is disposed by hitting the title bar's close button. The algorithm used here may fail in the cross-over area of the two circles.

# SimEx08.py
# Two light sensors, polling
from simrobot import *                         

RobotContext.setStartDirection(10)
 
def init(gg):
  bg = gg.getBg()
  bg.setPaintColor(Color.black)
  bg.fillArc(Point(250, 150), 120, 0, 360)
  bg.setPaintColor(Color.white)
  bg.fillArc(Point(250, 150), 60, 0, 360)
  bg.setPaintColor(Color.black)
  bg.fillArc(Point(250, 350), 120, 0, 360)
  bg.setPaintColor(Color.white)
  bg.fillArc(Point(250, 350), 60, 0, 360)

robot = LegoRobot()
init(robot.getGameGrid())
gear = Gear()
robot.addPart(gear)
ls1 = LightSensor(SensorPort.S1)
ls2 = LightSensor(SensorPort.S2)
robot.addPart(ls1)
robot.addPart(ls2)
gear.forward()

v = 500
r = 0.3
while not robot.isEscapeHit():
  v1 = ls1.getValue()
  v2 = ls2.getValue()
  if (v1 < v and v2 < v):
    gear.forward()
  if (v1 < v and v2 > v):
    gear.rightArc(r)
  if (v1 > v and v2 < v):
    gear.leftArc(r)
  if (v1 > v and v2 > v):
    gear.backward()
robot.exit()
Select SimEx08.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

RobotSim also simulates the sound sensor by using the computer's standard audio input line. Modern notebooks have an integrated microphone, but when the simulation runs on a desktop computer, an external microphone is necessary. Keep in mind that the audio options (device, level) have to be set properly. In the following example, the robot waits until a sound is emitted (whistle, clap), then it moves forward. If it hits the wall, it reverses. At the next sound event, the robot stops.

# SimEx09.py
# One sound sensor, one touch sensor, event driven
from simrobot import *                         

RobotContext.useObstacle("sprites/bar0.gif", 250, 50);

def onPressed(port):
  gear.backward()

def onLoud(port, level):
  if gear.isMoving():
      gear.stop()
  else:
      gear.forward()

robot = LegoRobot()
gear = Gear()
robot.addPart(gear)
ss = SoundSensor(SensorPort.S1, loud = onLoud)
ts = TouchSensor(SensorPort.S3, pressed = onPressed) 
robot.addPart(ss)
robot.addPart(ts)
Select SimEx09.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

The light sensor may be used for color detection because different colors reflect different light intensities. Some prior tests are necessary to choose colors whose brightness values can be distinguished; a calibration sketch is shown below.
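
Such a test may simply poll the light sensor while the robot drives slowly over the colored areas and print the measured values (a sketch only, assuming the road background of the example below has been set with RobotContext.useBackground()):

# Sketch only: print light values for color calibration
robot = LegoRobot()
gear = Gear()
ls = LightSensor(SensorPort.S3)
robot.addPart(gear)
robot.addPart(ls)
gear.setSpeed(10)
gear.forward()
while not robot.isEscapeHit():
  print ls.getValue()   # note the typical value over each color
  Tools.delay(200)
robot.exit()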

# SimEx10.py
# One light sensor, one touch sensor, road follower
from simrobot import *                         

RobotContext.setStartPosition(40, 460)
RobotContext.setStartDirection(-90)
RobotContext.useBackground("sprites/roboroad.gif")
RobotContext.useObstacle("sprites/chocolate.gif", 400, 50)
 
robot = LegoRobot()
gear = Gear()
ls = LightSensor(SensorPort.S3)
ts = TouchSensor(SensorPort.S1)
robot.addPart(gear)
robot.addPart(ls)
robot.addPart(ts)
ls.activate(True)

while not robot.isEscapeHit():
  v = ls.getValue()
  if v < 100:  # black
    gear.forward()
  if v > 600 and v < 700:  # green
    gear.leftArc(0.1)
  if v > 800:  # yellow
    gear.rightArc(0.1)
  if ts.isPressed():
    gear.stop()
    break
robot.exit()
Select SimEx10.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

We simulate the ultrasonic sensor for distance measurements in two dimensions. We assume that the sensor determines the distance to the nearest reflecting target within a two-dimensional cone (beam area) with its apex at the sensor location, the axis perpendicular to the sensor face and a fixed opening angle (beam width, aperture).

 
[Figure: Ultrasonic beam definition]
[Figure: Target]

Targets are polygons that are detected by using a mesh of triangles drawn from an inner point (the center). The Target class is derived from the Actor class and displays the attributed sprite image, but this image is not used for the target detection. The implemented distance detection algorithm uses the mesh and returns the distance to the nearest mesh triangle within the beam area. The number of targets is not limited and targets may be created or modified at runtime.
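
The assumed detection model can be illustrated with a small geometric sketch that checks which mesh points lie inside the beam cone and returns the smallest distance (a simplified illustration only, working with mesh points instead of triangles; it is not the package's actual implementation):

# Sketch only: distance to the nearest mesh point inside the beam cone
import math

def beamDistance(sensorPos, beamDir, aperture, meshPoints):
  # sensorPos: (x, y); beamDir: axis direction in degrees;
  # aperture: full opening angle in degrees; meshPoints: list of (x, y)
  nearest = -1
  for (x, y) in meshPoints:
    dx = x - sensorPos[0]
    dy = y - sensorPos[1]
    angle = math.degrees(math.atan2(dy, dx))
    diff = abs((angle - beamDir + 180) % 360 - 180)  # angle between point and beam axis
    if diff <= aperture / 2.0:  # inside the cone?
      d = math.hypot(dx, dy)
      if nearest < 0 or d < nearest:
        nearest = d
  return nearest  # -1 if no mesh point lies inside the beam area

# A vertical bar about 100 pixels in front of a sensor looking along the x axis
print beamDistance((0, 0), 0, 30, [(100, 10), (100, 0), (100, -10)])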

Normally a fixed target layout is defined using the RobotContext class. This makes the code for the robot movement almost identical to the code for the real robot, either in direct or autonomous mode. In the first example the robot searches for a target with a radar-like movement. When a target is found, the robot moves towards it and stops some distance away.

# SimEx14.py
# Poll ultrasonic sensor
from simrobot import *                         

mesh_hbar = [[200, 10], [-200, 10], 
   [-200, -10], [200, -10]]
mesh_vbar = [[10, 200], [-10, 200], 
   [-10, -200], [10, -200]]
RobotContext.useTarget("sprites/bar0.gif", 
                     mesh_hbar, 250, 100)
RobotContext.useTarget("sprites/bar0.gif", 
                     mesh_hbar, 250, 400)
RobotContext.useTarget("sprites/bar1.gif", 
                     mesh_vbar, 100, 250)
RobotContext.useTarget("sprites/bar1.gif", 
                     mesh_vbar, 400, 250)

robot = LegoRobot()
gear = Gear()
robot.addPart(gear)
us = UltrasonicSensor(SensorPort.S1)
robot.addPart(us)
us.setBeamAreaColor(Color.green)
us.setProximityCircleColor(Color.lightGray)
  
gear.setSpeed(15)
gear.forward()
    
while not robot.isEscapeHit():
  distance = us.getDistance()
  if distance > 0 and distance < 50:
    gear.backward(2000)
    gear.left(1000)
    gear.forward()
robot.exit()
Select SimEx14.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.

More than one ultrasonic sensor may be used, but the proximity circle can only be shown for one of them. The SensorPort parameter determines the beam axis: SensorPort.S1 points forward, SensorPort.S2 to the left and SensorPort.S3 backward. In the following example the second sensor us2, attached to SensorPort.S2, writes the distance values to the console.

# SimEx14a.py
# Two ultrasonic sensors
from simrobot import *                         

mesh_hbar = [[200, 10], [-200, 10], 
   [-200, -10], [200, -10]]
mesh_vbar = [[10, 200], [-10, 200], 
   [-10, -200], [10, -200]]
RobotContext.useTarget("sprites/bar0.gif", 
   mesh_hbar, 250, 100)
RobotContext.useTarget("sprites/bar0.gif", 
   mesh_hbar, 250, 400)
RobotContext.useTarget("sprites/bar1.gif", 
   mesh_vbar, 100, 250)
RobotContext.useTarget("sprites/bar1.gif", 
   mesh_vbar, 400, 250)

robot = LegoRobot()
gear = Gear()
robot.addPart(gear)
us1 = UltrasonicSensor(SensorPort.S1)
robot.addPart(us1)
us1.setBeamAreaColor(Color.green)
us1.setProximityCircleColor(Color.lightGray)
us2 = UltrasonicSensor(SensorPort.S2)
robot.addPart(us2)
gear.setSpeed(15)
gear.forward()
    
while not robot.isEscapeHit():
  distance1 = us1.getDistance()
  distance2 = us2.getDistance()
  print distance2
  if distance1 > 0 and distance1 < 50:
    gear.backward(2000)
    gear.left(1000)
    gear.forward()
robot.exit()
Select SimEx14a.py (Ctrl+C to copy, Ctrl+V to paste)
Execute the program locally using WebStart.