The program was written for the LEGO EV3 Brick, the base of every Mindstorms EV3 robot. Additionally, a third-party camera is used to identify objects and their colors.
The camera used for this program is the **Pixy2** for LEGO Mindstorms. To work with the camera, you have to teach it the object you want to sort. For this, you need PixyMon on your PC. Additionally, you need to set up the Pixy2 and "install" the module `pixycamev3`. You can download the needed file from the PixyCam GitHub, but it is also included in our repository.
If you want to get detailed instructions on either of these points, look in the [Quick Start Guide](https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:pixy_lego_quick_start) on the PixyCam Website.
The robot drives forward when the wheels turn counter-clockwise. If you use a robot with a clockwise direction of rotation, you need to change the parameters given in the arguments to `'normal'`. (See Additional Parameters in [Initialization](#initialization))
To track the distance between the robot and the cube, a LEGO Ultrasonic Sensor is used. To identify the outer line of the sorting area, one LEGO Color Sensor is used. For keeping track of the rotation on the back side of the robot, a LEGO gyro sensor is mounted.
The sorted objects are simple cubes made of paper. You can find a tutorial for folding the cube on [YouTube](https://youtu.be/VjooTcZRwTE?si=HaiStBDw1cQu3K7o). To ensure the ultrasonic sensor recognizes the cube, it is 5.5 cm in height and length. The two colors should be clearly distinct from each other.
The purpose of the program is to sort different objects. In detail, this means that the robot should push all orange cubes out of the sorting area and leave all green cubes inside.
The main part of the program is the class `Object_Sorter`. Because the robot receives many inputs and does not need to check everything at the same time, the class contains five different state methods. The robot switches automatically between the states when certain events occur. You can identify the current state by looking at the LEDs on the EV3 brick.
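The state switching described above can be sketched as a simple dispatch table. This is a minimal, hardware-free illustration: the class name follows the document, but the handler bodies are stubs standing in for the real state methods.

```python
# Minimal sketch of the five-state dispatch in Object_Sorter.
# Handler bodies are stubs; the real methods read sensors and drive motors.

class Object_Sorter:
    def __init__(self):
        self.state = "search"  # the robot starts by looking for a cube
        self.handlers = {
            "search": self.search,
            "follow": self.follow,
            "sort": self.sort,
            "avoid": self.avoid,
            "edge": self.edge,
        }

    def step(self):
        """Run one iteration of the current state's handler."""
        self.handlers[self.state]()

    # Stub handlers: each one just sets the next state to show the flow.
    def search(self): self.state = "follow"
    def follow(self): self.state = "sort"
    def sort(self): self.state = "edge"
    def avoid(self): self.state = "sort"
    def edge(self): self.state = "search"

sorter = Object_Sorter()
sorter.step()            # search -> follow
sorter.step()            # follow -> sort
print(sorter.state)      # -> sort
```

The dispatch-table approach keeps each state's logic in its own method, which is why only one set of sensor checks runs at a time.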
**Robot has a cube in its arms**: When the ultrasonic sensor detects an object directly in front of the robot, there is already a cube in its arms, and it switches to the state "[Sort](#method-sort)".
**Robot spun without finding something**: When the robot has turned two full rotations (720°) without detecting one of the four things above, it drives a bit forward. If it reaches the outline, it changes to the state "[Edge](#method-edge)". Otherwise, it starts the state "[Search](#method-search)" again at the new spot.
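The "two full rotations" check can be written as a small comparison against the gyro reading. This is a hardware-free sketch: it assumes the gyro angle accumulates as the robot turns, and the 720° value is taken from the text.

```python
# Sketch of the "searched a full 720 degrees" check using the gyro angle.
# Assumes the gyro sensor reports a cumulative angle in degrees.

FULL_SEARCH_ANGLE = 720  # two complete rotations, as described in the text

def search_exhausted(start_angle, current_angle):
    """True once the robot has spun 720 degrees since the search began."""
    return abs(current_angle - start_angle) >= FULL_SEARCH_ANGLE

print(search_exhausted(0, 350))   # -> False (still inside the first rotation)
print(search_exhausted(10, 735))  # -> True  (725 degrees turned)
```

Recording the start angle when entering the search state, rather than resetting the gyro, avoids disturbing other code that reads the same sensor.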
To make that happen, the method checks the current position of the cube and corrects the driving direction if necessary. Because there are several reasons why the camera does not consistently recognize the cube, the robot has a routine to find the cube again (driving backwards, turning left). If the routine does not work, the program changes back to the state "[Search](#method-search)".
Otherwise, the ultrasonic sensor detects the cube in the arms, and the program changes to state "[Sort](#method-sort)", or the robot moves over the line, and it switches to state "[Edge](#method-edge)".
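The re-find routine (drive backwards, turn left, give up after a few tries) can be sketched without hardware. The attempt count and helper names below are hypothetical; the real routine drives motors where this sketch only comments.

```python
# Illustrative re-find routine: back off and turn left a few times, then
# fall back to "search" if the camera still does not see the cube.
# max_attempts is an invented value, not the program's actual number.

def refind_cube(camera_sees_cube, max_attempts=3):
    """Return the next state: 'follow' if the cube reappears, else 'search'."""
    for attempt in range(max_attempts):
        # (real code would drive backwards and turn left here)
        if camera_sees_cube(attempt):
            return "follow"
    return "search"

# Simulated camera that spots the cube on the second attempt:
print(refind_cube(lambda attempt: attempt == 1))  # -> follow
print(refind_cube(lambda attempt: False))         # -> search
```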
The task of the method is to drive the cube to the outline of the sorting area and avoid the green cubes on the way.
If the camera detects a green cube, the program switches to the state "[Avoid](#method-avoid)"; if it doesn't, the robot continues straight ahead until the outline is reached and the program changes to the state "[Edge](#method-edge)".
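The sort-state decision can be sketched hardware-free. The comparison direction for the reflection value is an assumption (it depends on whether the line is brighter or darker than the floor), and all names here are hypothetical.

```python
# Illustrative decision for the "Sort" state: detour around a detected green
# cube, stop pushing when the outline is reached, otherwise keep driving.
# reflection >= threshold assumes a line brighter than the floor; flip the
# comparison for a dark line on a bright floor.

def sort_next_state(sees_green_cube, reflection, line_threshold):
    """Return the next state while pushing an orange cube."""
    if sees_green_cube:
        return "avoid"   # detour around the green cube first
    if reflection >= line_threshold:
        return "edge"    # outline reached: the cube is pushed out
    return "sort"        # keep driving straight ahead

print(sort_next_state(True, 10, 30))   # -> avoid
print(sort_next_state(False, 50, 30))  # -> edge
print(sort_next_state(False, 10, 30))  # -> sort
```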
The detected green cube should be avoided by the robot. To do this, the camera looks for the green cube again and reads its position. If it is out of the robot's path, the driving direction does not change, but if the cube is in the way, the robot corrects its driving direction. After that, the program switches back to the previous state ("[Search](#method-search)" or "[Sort](#method-sort)").
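The "in the way or not" decision can be sketched from the blob's horizontal position in the camera frame. The 316-pixel frame width matches the Pixy2's horizontal resolution, but the margin value and function names are assumptions for illustration.

```python
# Sketch of the avoidance decision from the green cube's x position in the
# Pixy2 frame. The margin is an invented value, not the program's real one.

FRAME_WIDTH = 316          # Pixy2 horizontal resolution in pixels
CENTER = FRAME_WIDTH // 2
IN_PATH_MARGIN = 60        # blob within this band counts as "in the way"

def avoid_correction(blob_x):
    """Steering sign: 0 = keep course, +1 = veer right, -1 = veer left."""
    offset = blob_x - CENTER
    if abs(offset) > IN_PATH_MARGIN:
        return 0           # cube is out of the robot's path
    # Steer away from the cube: cube left of center -> veer right, and
    # cube right of center -> veer left.
    return 1 if offset < 0 else -1

print(avoid_correction(30))   # -> 0  (far left, not in the path)
print(avoid_correction(120))  # -> 1  (slightly left of center: veer right)
print(avoid_correction(200))  # -> -1 (slightly right of center: veer left)
```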
The robot has reached the outline. It drives backward and turns 100° to the right. The program switches back to the state "[Search](#method-search)" afterward.
### Method: Run
This method is only used to start the object sorter's operations and keep them running in an endless while-loop.
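A minimal sketch of such a run loop, with a step limit added purely so the example terminates (the real method loops forever). Each handler here returns the next state; the handler names follow the document, but their bodies are stand-ins.

```python
# Sketch of an endless dispatch loop; max_steps exists only for this demo.

def run(handlers, state, max_steps):
    for _ in range(max_steps):
        state = handlers[state]()  # each handler returns the next state
    return state

# Stub handlers tracing one possible path through the states:
handlers = {
    "search": lambda: "follow",
    "follow": lambda: "sort",
    "sort": lambda: "edge",
    "edge": lambda: "search",
}
print(run(handlers, "search", 3))  # -> edge
```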
During programming, I was challenged by the problem that the outer line is sometimes not detected. After trying different approaches to fix the bug, I suspect that the processor of the EV3 is slowed down by the camera and, for this reason, does not detect the line when the sensor is above it.
Another reason can be that the arms project shadows or the line doesn't differ enough from the floor. Make sure that the given values and the physical structure of the robot make sense for your circumstances.
You can change all the input values such as the minimal line reflection and the driving rates with a dictionary of the arguments given to the class. (See [Initialization](#initialization))
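A hypothetical sketch of such a parameter dictionary, showing how overrides could merge into defaults at construction time. Every key name and value below is invented for the example, not the program's real configuration.

```python
# Invented defaults illustrating the override mechanism; the real key names
# and values live in the Object_Sorter initialization.

DEFAULTS = {
    "min_line_reflection": 30,      # color-sensor reading treated as "line"
    "drive_rate": 40,               # straight-driving speed (percent)
    "turn_rate": 20,                # turning speed (percent)
    "wheel_direction": "inversed",  # 'normal' for clockwise-driving robots
}

def make_config(**overrides):
    """Merge user overrides into the defaults."""
    cfg = dict(DEFAULTS)
    cfg.update(overrides)
    return cfg

cfg = make_config(min_line_reflection=25, wheel_direction="normal")
print(cfg["min_line_reflection"], cfg["wheel_direction"])  # -> 25 normal
```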
If the camera does not detect the cubes, there can be multiple reasons. Sometimes, due to lighting changes, the camera does not identify the different colors correctly. You can go back to PixyMon and see if the cubes are detected at all. If not, you have to teach the object / cube to the camera again, preferably in the environment where the sorting happens. There is also a [section](https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:some_tips_on_generating_color_signatures_2) on the PixyCam website about improving detection accuracy that you should read.
The ultrasonic sensor should be mounted as close to the floor as possible to make sure the cube gets detected. You should build the arms in such a way that they are not in the sensor's field of vision. But they should hold the cube as centered as possible in front of the robot. Make sure that you have followed all the construction advice.