Friday, October 8, 2010

Lab session 6: Robot race on the Alishan track

Date: 7 October 2010
Duration of activity: 17 hours!
Group members participating: Anders Dyhrberg, Michaël Ludmann and Guillaume Depoyant.



I. Objectives and summary

I.1 Objectives

There is only one objective to accomplish during this lab session: build and program a robot able to run as fast as possible on the Alishan track (see picture below).


The robot starts on the lower level, in a green square (the green is missing in this picture), and has to reach the upper level before coming back to where it started.

This session is part of a competition, and we had to record the time (in milliseconds) taken by the robot to fulfill this task. We were free to use as many LEGO elements as we wanted, as long as the whole robot fit inside the start area.

I.2 Summary

We first agreed on how to build the robot and decided on our strategy to solve the problem.
We used three light sensors to track the line (although we actually needed only two) and a tilt sensor to detect slopes and platforms.

II. Building the robot

Basically, we used the same robot that we built for all the other experiments. We changed the wheels to get a better grip on the floor and, the tough part, we had to design a mounting for the sensors, knowing that the inclination of the path could change the sampled values.

Our first idea was to create an elbow joint between the robot and the sensor block. On paper this approach seems best, because whatever the inclination, the sensors would keep the same angle to the floor. Instead, we tried something quite different: we attached the sensor block to the robot with a rigid, parallelepiped-shaped link. This way the connection was a little more reliable, but the sampled values could still change with the inclination (we assumed the inclination would not vary that much on the path we had to take).



III. Strategy

Instead of writing a mere script that hard-codes the path the robot has to follow, our priority was to program a more satisfying robot that reacts to what it really "sees", helping it a little by letting it expect certain patterns. That meant we first focused on having our robot work correctly using its sensors, before trying to speed it up (even if we always kept speed in mind).

Because we knew what the challenge was, because the task is not that complicated and the path to follow is straightforward, and because we wanted each of us to be able to program something different at the same time, we split the code into a sequence of tasks.

Besides, to take an approach different from our usual line follower, we wanted to use a trick of our own for the turns on the horizontal platforms. The track is known to be tricky at those points, because the path splits into a "Y" shape; with light sensors alone, the robot can easily follow the wrong branch and fall off the track.

This means that, at the beginning, the robot knows it will first climb a slope and have to follow a line. Then it will encounter a platform, detect it, and turn right, before catching the line on the next slope and driving up to the next platform, and so on. Once it arrives on the fourth platform, that is, the upper one, it has to make a half turn and go back down the track. Once the last (bottom) platform is reached, the robot stops and displays the result.

Basically, the main part of our program [1] looks like this:


prg.setC(); // calibration of the light sensors

while (!Button.LEFT.isPressed()) {}
Thread.sleep(1000);
prg.setTiltOffset(); // the tilt sensor now knows what a platform is

Stopwatch stopwatch = new Stopwatch();

prg.moveForwardTacho(200); // move forward to reach the slope

LCD.clear();
prg.trackLineSegment(); // track the line until the next platform

th.Turn(Direction.RIGHT, lL, lM, lR); // turn right until the next slope is reached
prg.trackLineSegment();
th.Turn(Direction.LEFT, lL, lM, lR);
prg.trackLineSegment();
prg.followLineTacho(400); // follow the line on the upper platform for a limited number of rotations
th.Turn180(lL, lM, lR); // make a half turn
prg.trackLineSegment();
th.Turn(Direction.RIGHT, lL, lM, lR);
prg.trackLineSegment();
th.Turn(Direction.LEFT, lL, lM, lR);
prg.trackLineSegment();
prg.moveForwardTacho(300); // move forward on the lower platform to make sure the robot is fully contained in the green zone

The robot won't start the next task in the sequence before making sure it has to do so. For instance, it will not start a right turn before being on a horizontal platform, which is checked by the preceding line-follower task.

IV. Implementation

IV.1 Following the lines

IV.1.a Calibrating the sensors

Since we use more than one sensor, we cannot assume they all have exactly the same characteristics and return the same value when scanning a given surface. This means we have to handle a different threshold for each sensor. Moreover, the read values vary with ambient light (even if we tried to physically shield this "noise") and with battery level. On top of that, calibrating three sensors mounted so close together on the same robot is sometimes challenging. We therefore need a robot able to calibrate itself.

Calibrating means saving the minimal and maximal values seen by each sensor on our track. So, during one loop, we take care of only one sensor and repeatedly scan across the line, looking for the highest and lowest values. We also made the routine able to find by itself what is the black line and what is the white surface, by having it look for a significant drop in the read values. This routine is run many times, so we can get a good average for our thresholds, since there is always some variation in the readings.
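
To illustrate, here is a minimal sketch of the min/max tracking at the heart of one calibration pass. The names calibrateSensor and SCAN_SAMPLES, and the returned array, are our illustrative assumptions, not the exact code; the real routine is in [1].

// Hedged sketch: records the extreme values one light sensor sees
// while the robot sweeps across the line (direction handling simplified).
int[] calibrateSensor(LightSensor sensor) throws InterruptedException
{
    int min = 1023; // lowest reading seen so far (the black line)
    int max = 0;    // highest reading seen so far (the white surface)
    for (int i = 0; i < SCAN_SAMPLES; i++) {
        int v = sensor.readNormalizedValue();
        if (v < min) min = v;
        if (v > max) max = v;
        scanLineMotorControl(-1); // keep sweeping while sampling
    }
    // threshold halfway between line and surface for this sensor
    return new int[] { min, max, (min + max) / 2 };
}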

The method also uses pulse-width modulation [2] (PWM) to control the motors: instead of powering them continuously, it switches them on for one tick out of three, which gives a slow, steady sweep. We had heard about this technique during one of the lectures, and this was a good opportunity to try it. It worked great, and the robot behaved very well during the calibration scan.

void scanLineMotorControl(int iDir) throws InterruptedException
{
    // PWM (pulse-width modulation): the motors are powered on one call
    // out of three and braked on the other two.
    if (iSLMC == 0) {
        if (iDir == -1) {
            // sweep to the left
            LEFT.setPower(POWER_SCAN);
            LEFT.backward();
            RIGHT.setPower(POWER_SCAN);
            RIGHT.forward();
        } else {
            // sweep to the right
            LEFT.setPower(POWER_SCAN);
            LEFT.forward();
            RIGHT.setPower(POWER_SCAN);
            RIGHT.backward();
        }
        iSLMC = 2;
    } else {
        // brake
        offAll();
        iSLMC--;
    }
    Thread.sleep(10);
}
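
For context, this is roughly how such a routine is driven from the scanning loop; lineFound() is a hypothetical placeholder for the actual line-detection logic:

// Hypothetical usage of the PWM step above: the motors receive power
// on one call out of three, i.e. roughly a 1/3 duty cycle at 10 ms per tick.
while (!lineFound()) {
    scanLineMotorControl(-1); // sweep left in small, controlled steps
}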

IV.1.b Tracking a line

Line tracking was done using only two of the three sensors we had (the rightmost and the leftmost). We coded a basic algorithm which computes what we called the position of the robot, by taking the difference between the values read by the right and left sensors. After adding a constant to this difference, we get a value that is zero when we are on the line, negative when we are to the left of the line, and positive when we are to its right. Using this information, we steer the motors left or right.

The middle sensor was then used to drive straight ahead when the position value is close to zero. This algorithm was enough to follow a straight line at a fair speed. We did, however, have some difficulties catching the line again after a turn, because the robot tended to overshoot the line (it sees it, but does not have time to correct itself enough). Careful handling in the turn code solved this problem; a sketch of the steering logic is given below.
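
Here is a minimal sketch of that proportional steering. OFFSET, DEAD_BAND, BASE_POWER, GAIN and blackThreshold are illustrative constants of our own; lL, lM and lR are the left, middle and right light sensors from the code above.

// Hedged sketch of one steering step; power clamping omitted for brevity.
int position = lR.readValue() - lL.readValue() + OFFSET; // ~0 when centred

if (Math.abs(position) <= DEAD_BAND && lM.readValue() < blackThreshold) {
    // middle sensor still sees the black line: drive straight, full power
    LEFT.setPower(BASE_POWER);
    RIGHT.setPower(BASE_POWER);
} else {
    // position > 0: right of the line, slow the left side to steer left
    // position < 0: left of the line, slow the right side to steer right
    LEFT.setPower(BASE_POWER - GAIN * position);
    RIGHT.setPower(BASE_POWER + GAIN * position);
}
LEFT.forward();
RIGHT.forward();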


IV.2 Detecting platforms

IV.2.a Tilt sensor

We decided to use an acceleration/tilt sensor [3] to detect platforms and therefore know when to turn. A tilt sensor measures tilt along three axes (gravity is indeed perceived as acceleration). To detect a change in the robot's tilt, we repeatedly read the x value, x being the axis along which the robot moves when going forward. Since we wanted to know as soon as possible when we reach a platform, we needed to put the sensor at the front of the robot, on a part that moves when the slope changes. That is why we put it on the light-sensor block.


You can see that the position of the tilt sensor differs enough to tell us whether the robot is on a slope or not.

IV.2.b Slopes and platforms

The problem we had to cope with while using this sensor is that we sometimes got random values that are clearly wrong, because they are far from the usual ones (e.g. our normal values on the x-axis range from about 15 to 35, whereas we could sometimes read something like 1020). This is often due to the sharp movements made while following a line, which the sensor perceives as bumps, and therefore as accelerations and tilts. So we couldn't afford to simply wait for the value to exceed a fixed threshold in order to detect a slope.

Here is what we ended up with. When we start the robot on the lowest platform (the green zone), we wait for one second (making sure nobody is touching the robot), read the tilt value, and store it as the reference value (offset) for a platform (all four platforms are horizontal and therefore have roughly the same tilt). Then we look for the next platform, but only after having been on a slope first (we keep a status structure up to date for that).
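
A minimal sketch of what this initial capture in setTiltOffset() might look like. Averaging over N samples is our assumption for this example; getXTilt() is the call we actually use.

// Hedged sketch: average a few readings at rest to get the platform
// reference value. N and tiltOffset are illustrative names.
Thread.sleep(1000); // make sure nobody is touching the robot
int sum = 0;
final int N = 10;
for (int i = 0; i < N; i++) {
    sum += tilt.getXTilt();
    Thread.sleep(10);
}
tiltOffset = sum / N;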

While sampling the values returned by the tilt sensor, we throw away faulty values (those too far above or below our offset) and add the others to a queue. Once the queue reaches a size of 15 elements, we begin to average the values inside. Each time we add a new element, we delete the oldest one in the queue, so as to keep only recent values in memory (we want to react to the recent past). So, to detect a flat surface, we compare the offset value with the newly computed average. If the difference is below a defined threshold, we can be sure we are on the sought platform (and it is not a random value due to some bump, because the last 15 values are all close on average; moreover, remember that really faulty values never even enter the queue). Conversely, if the difference is above another threshold, that means we are on a slope, and we can start looking for a flat surface.




tiltv = tilt.getXTilt();
if (tiltv <= tiltOffset + 20 && tiltv >= tiltOffset - 20) {
    // plausible reading: keep it
    queue.push(tiltv);
    discarded = false;
} else {
    // obviously faulty value (bump): it never enters the queue
    discarded = true;
}
if (queue.size() >= 15) {
    aveTilt = getAverage(queue);
    if (m_state == state.lookingForInclination) {
        if (Math.abs(aveTilt - tiltOffset) >= 6) {
            // the average drifted away from the offset: we are on a slope
            m_state = state.lookingForFlat;
            il.log("*lookingForFlat*");
        }
    } else {
        if (Math.abs(aveTilt - tiltOffset) <= 2) {
            // the average is back near the offset: platform reached
            Car.stop();
            il.log("=Flat Found=");
            break;
        }
    }
    queue.pop(); // drop the oldest sample to keep a 15-element window
}
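
The getAverage() helper is not shown above; here is a minimal sketch of it, assuming the queue is the Vector-backed lejos.util.Queue holding Integer samples (not the exact code from [1]):

// Hedged sketch of getAverage(): sums the stored samples and divides
// by the window size (only called once the queue holds 15 elements).
private int getAverage(Queue queue)
{
    int sum = 0;
    for (int i = 0; i < queue.size(); i++) {
        sum += ((Integer) queue.elementAt(i)).intValue();
    }
    return sum / queue.size();
}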

We needed to tune our values, especially the size of the queue, to get a robot that reacts both quickly and accurately enough.

IV.3 Turning right / left
(Section to be written by Anders.)

IV.4 Turning back
(Section to be written by Anders.)

IV.5 Role of the Bluetooth logger / debugging
(Section to be written by Anders.)


V. Results and proofs


After hours of calibration, tuning, trial and error, code rewriting and testing, here are finally our results for the robot race:


First run: 38.719 seconds


Another video: 39.123 seconds

We also managed to get a time of 38.442 seconds, but we weren't filming that run.


VI. References
[1] Remember that you can check out all our source code on our Google code webpage. For this competition, just browse the lesson 6 directory.
[3] Tilt sensor documentation on HiTechnic
