3.2 Papers

3.2.3 Paper III

3D Vision Guided Robotic Charging Station for Electric and Plug-in Hybrid Vehicles

During a research exchange at TU Graz in Austria, a project was started to develop an autonomous robotic electric vehicle charging station. The project targeted several issues that become more pressing as the number of electric vehicles (EVs) grows. One is that fully charged vehicles are often left at charging stations, occupying the chargers while other drivers wait in the queue. In addition, increasingly powerful fast chargers allow users to charge an EV for a long trip in under an hour; however, delivering 120 kW or more requires very thick, durable and heavy power cables. For example, a 200 kW charging cable can weigh up to 2.26 kg/m. At longer cable lengths this becomes difficult for people to handle, but would not be an issue for a robot [41]. The final problem stems from the many different standards for charging sockets. It is quite common for EV owners to arrive at a quick charging station only to find that the offered plug does not match the charging port of their car. In such circumstances, they either cannot charge or have to carry bulky adapters to be able to plug in at most charging stations.

The project started by creating a robotic electric vehicle charging station concept. The hardware setup, robot operation, vehicle and charging port detection, the whole process workflow and the necessary communication channels between the devices, together with the graphical user interface for the driver, had to be considered.

The concept of the charging station can be seen in Figure 3.7(a). After the initial design of the charging station, an actual BMW i3 vehicle was acquired for testing purposes and the charging station was constructed as shown in Figure 3.7(b). The following work focused on the algorithms for detecting the charging port and on the motion planning for the robot plugging the charger in and out.

(a) 3D design of the robotic electric vehicle charging station concept.

(b) The actual robotic electric vehicle charging station used for testing with a BMW i3. Image source: [42].

Figure 3.7. EV charging station project, from concept to reality.

In order to allow the robot to precisely plug the charging cable into the vehicle, the exact location and orientation of the charging port have to be determined. After testing RGB-D cameras, it was noticed that the material the charging ports are made of absorbs IR light, making it very difficult to obtain any useful depth information with these cameras. Moreover, the system has to be functional outdoors, where sunlight introduces infrared disturbance. The camera setup was therefore shifted to stereo cameras. Such a configuration is more robust to changing illumination conditions; however, extracting depth information is more challenging.

To identify the charging port type and to provide many distinct keypoints for the triangulation between the two cameras, shape-based templates were created for each of the plug types. Using the template matching algorithms included in the Halcon Machine Vision software, a precise overlay of the generated template onto the image of the charging port was achieved in both camera images, as shown in Figure 3.8(a). Model matching for Type 1 and Type 2 charging ports, as well as for the connector plug (Type 2), worked well under various illumination conditions and at angles of up to 45° relative to the viewing angle of the camera. The detection distance was up to 2.5 meters, which matched the reach of the UR10 robot. The matching confidence score for a proper alignment was over 95%. The recognition time on the full camera image varied between 300 ms and 800 ms. By narrowing down the search area, for example by identifying darker-than-normal regions in the image, the recognition time can be reduced to under 150 ms.
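The search-area narrowing mentioned above can be sketched as a simple intensity threshold followed by a bounding box. This is a minimal illustration, not the system's actual implementation; the function name, threshold and margin values are assumptions:

```python
import numpy as np

def dark_region_roi(gray, threshold=60, margin=20):
    """Return an (x0, y0, x1, y1) bounding box around pixels darker than
    `threshold`, padded by `margin` pixels, or None if none are found."""
    ys, xs = np.nonzero(gray < threshold)
    if xs.size == 0:
        return None
    h, w = gray.shape
    x0 = max(int(xs.min()) - margin, 0)
    y0 = max(int(ys.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin, w - 1)
    y1 = min(int(ys.max()) + margin, h - 1)
    return x0, y0, x1, y1

# Synthetic example: a dark rectangle (the charging port) on a bright background.
img = np.full((480, 640), 200, dtype=np.uint8)
img[100:180, 300:400] = 30
roi = dark_region_roi(img)
```

Restricting template matching to such a region shrinks the search space, which is what makes the reported sub-150 ms recognition times plausible.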

(a) Results of the template matching. A high variety of angles and lighting conditions were tested. Viewing angles of up to 45° resulted in successful detection, with accuracy dropping beyond that. Row 1: Type 2 connector plug.

Row 2: Type 1 socket. Row 3: Type 2 socket.

(b) Hand-Eye calibration results. Visualisation of the coordinate frames assigned to the vision sensor, the end-effector of the robot and the end point of the connector plug. The resulting point cloud visualising the charging plug is overlaid onto the visualisation of the robot model.

Figure 3.8. Using template matching for charging port and plug detection.

Given the accurate detection in both stereo cameras, keypoints aligned with the pins of the charging port or plug were used to calculate the depth information using triangulation. This resulted in a precise pose estimate of the charging port, comprising the 3D coordinates of the centre of the plug as well as the orientation as roll, pitch and yaw angles.
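Triangulation from two calibrated views is typically done with the linear (DLT) method; a minimal sketch is shown below. The projection matrices and pixel coordinates here are a toy stereo rig, not the actual camera calibration from the project:

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one keypoint seen by two calibrated
    cameras. P1, P2 are 3x4 projection matrices; pt1, pt2 are (u, v) pixels."""
    u1, v1 = pt1
    u2, v2 = pt2
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]                  # de-homogenise

# Toy stereo rig: identical intrinsics, second camera shifted along X (baseline).
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
X_true = np.array([0.05, 0.02, 1.5])     # point 1.5 m in front of the rig
p1 = P1 @ np.append(X_true, 1); p1 = p1[:2] / p1[2]
p2 = P2 @ np.append(X_true, 1); p2 = p2[:2] / p2[2]
X_est = triangulate(P1, P2, p1, p2)
```

Repeating this for several keypoints on the plug pins yields the 3D points from which the port's centre and roll/pitch/yaw orientation can be derived.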

An accurate detection of the charging plug also made it possible to perform a marker-less Hand-Eye calibration using the centre point of the connector plug as the reference point. Instead of using markers such as a checkerboard, the structure of the plug itself provides the keypoints while the robot moves it around, achieving the required accuracy of under 1.5 mm. This becomes useful if the robot has interchangeable connector plugs of different types, so that the system can re-calibrate fully automatically. Results of a successful Hand-Eye calibration based on the connector plug structure are visualised in Figure 3.8(b).
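The paper does not detail the calibration mathematics, but with a single reference point observed at many robot poses the problem reduces to fitting a rigid transform between the point positions known in the robot base frame (from forward kinematics) and the same positions measured in the camera frame. One standard way to do this is the Kabsch algorithm, sketched here with synthetic data as an assumed, illustrative approach:

```python
import numpy as np

def fit_rigid_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B
    (Kabsch algorithm). A and B are Nx3 arrays of corresponding points."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)            # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Plug-tip positions measured in the camera frame (e.g. via triangulation)...
rng = np.random.default_rng(0)
cam_pts = rng.uniform(-0.3, 0.3, size=(10, 3)) + np.array([0.0, 0.0, 1.0])
# ...and the same positions in the robot base frame, related by the unknown
# camera-to-base transform we want to recover.
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
t_true = np.array([0.5, -0.2, 0.1])
base_pts = cam_pts @ R_true.T + t_true
R_est, t_est = fit_rigid_transform(cam_pts, base_pts)
```

In practice the measured points carry noise, so more poses than the minimum are collected and the least-squares fit averages the error down to the required sub-1.5 mm level.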

Once the system is calibrated and the pose of the charging port of the vehicle is detected, the three-step robot plug-in movement is initiated, as seen in Figure 3.9.

Firstly, the robot moves the plug at high velocity to the approach position, which lies within a 0.1 meter radius of the charging port. The second step is to reduce the velocity to 10% of the maximum robot joint speed and move to the final alignment position. In this pose, the connector plug and the charging port are fully aligned along their Z-axes and just a few millimetres away from the contact point. The last step is to move at only 2% of the maximum speed along the Z-axis and perform the plug-in motion. During this move, the forces and torques exerted on the end-effector of the robot are monitored. In case the forces exceed a given threshold, the system is halted to prevent any damage.
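The three-step sequence can be summarised in code. The `Robot` class below is a stand-in interface, not the real UR10 driver, and the force threshold value is an assumption; only the speed fractions and the 0.1 m approach offset follow the text:

```python
# Hypothetical sketch of the three-step plug-in sequence described above.
from dataclasses import dataclass, field

FORCE_LIMIT_N = 50.0   # assumed safety threshold, not taken from the paper

@dataclass
class Robot:
    log: list = field(default_factory=list)

    def move_to(self, pose, speed_fraction):
        # Stand-in for a real motion command; records what was requested.
        self.log.append((pose, speed_fraction))

    def tcp_force(self):
        # Stand-in wrench magnitude reading at the end-effector, in newtons.
        return 12.0

def plug_in(robot, approach_pose, align_pose, insert_pose):
    # Step 1: fast move to the approach position (~0.1 m from the port).
    robot.move_to(approach_pose, speed_fraction=1.0)
    # Step 2: slow to 10% of maximum joint speed, move to full alignment.
    robot.move_to(align_pose, speed_fraction=0.10)
    # Step 3: insert along the port Z-axis at 2%, monitoring the wrench.
    if robot.tcp_force() > FORCE_LIMIT_N:
        raise RuntimeError("force limit exceeded - halting")
    robot.move_to(insert_pose, speed_fraction=0.02)
    return robot.log

r = Robot()
log = plug_in(r, "approach", "align", "insert")
```

A real controller would sample the force sensor continuously during the insertion rather than once, but the staged speed reduction and the threshold-triggered halt are the essential safety pattern.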

Figure 3.9. Three-step plug-in procedure plan. Firstly, the robot moves the connector plug to the Approach Position, which lies approximately 0.1 meter away from the charging port. The second move aligns the Z-axes of the charging port and the plug and brings the plug to just a few millimetres from the port. The final movement performs the plugging-in motion along the Z-axis.

Under the assumption that the vehicle remained stationary during the charging process, once the battery is fully charged the unplugging motion is merely the inverse of the plug-in movement: the same trajectory steps are retraced in reverse order and the end-effector returns to the standby position.

The concept was proven to work in 10 experiments, with the plug placed at different rotation angles. The insertion succeeded 8 out of 10 times, failing twice due to rotational misalignment. With small angular offsets of up to 5°, the plug was still inserted successfully and made contact.

The work resulted in an actual working robotic electric vehicle charging station, where our approach proved to work under varying lighting conditions. The project was successfully submitted to the partners and released publicly in September 2018, followed by several mainstream media articles about it [42]. Furthermore, a patent citing this work was published in 2019 by Intel Corporation [43].