360 Degree Perception for Autonomous Vehicles

By TuSimple's editorial team
Jun 09


The limits of human visual perception are rarely more evident than when driving. From your perspective in the driver’s seat, you may feel like you have a clear view of the road ahead, but your vision is limited by the physical barriers of the car itself. The frame of the car obstructs your view to the sides, while the hood keeps you from seeing objects low to the ground directly in front of you. Meanwhile, your vision to the sides and rear is further compromised not only by the car itself but also by your forward-facing position in the seat.

Of course, drivers have adapted to these restrictions and technology has helped to fill in the gaps with an array of mirrors, cameras, and sensors to alert us to possible collisions. Even so, we can only focus on one direction at a time. Furthermore, the gaps in vision are magnified by the size of the vehicle. The rear blind spot for small cars is roughly 15 feet, but can be 25 feet for pickup trucks and SUVs and even longer for tractor-trailers. TuSimple’s fleet of autonomous trucks must deal with these types of limitations as well as the added complication of working with a multitude of trailers.

Just as with human drivers, the accurate perception of surroundings is vital for autonomous vehicles to make safe and efficient navigation decisions. While autonomous driving systems have advantages in the various types of sensors that they can employ and their ability to quickly process sensor data from all directions, there are also obstacles that need to be overcome. TuSimple is constantly innovating and improving solutions and technology to optimize both the perception and interpretation of the surroundings.

Simply covering a vehicle with sensors is neither practical nor effective. Ideally, the sensors should provide a 360-degree view around the truck without overwhelming the vehicle with redundant data that must be filtered out. Our US Patent No. 11,076,109 B2 describes systems for semi-autonomous and autonomous control of vehicles, relating specifically to optimal sensor layouts. These systems could include at least three forward-facing cameras of different focal lengths to cover the front of the tractor, left and right side cameras to cover the sides of the tractor or the trailer, and at least two backward-facing cameras that cover the rear of the tractor. Additional optimizations are made for an array of radar and LiDAR systems supporting both short- and long-range detection capabilities.
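The core layout question — does a given set of cameras leave any angular gap around the vehicle — can be made concrete with a small sketch. The Python below is purely illustrative: the `Camera` fields, the field-of-view values, and the example layout are assumptions for demonstration, not the configuration claimed in the patent.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    heading_deg: float  # mount direction, degrees clockwise from vehicle forward
    hfov_deg: float     # horizontal field of view, degrees

def coverage_gaps(cameras, step_deg=1.0):
    """Sample headings around the vehicle and return any heading
    that no camera's horizontal field of view covers."""
    gaps = []
    heading = 0.0
    while heading < 360.0:
        covered = any(
            abs((heading - cam.heading_deg + 180.0) % 360.0 - 180.0)
            <= cam.hfov_deg / 2.0
            for cam in cameras
        )
        if not covered:
            gaps.append(heading)
        heading += step_deg
    return gaps

# Hypothetical layout for illustration only: one wide forward unit
# (standing in for the multi-focal-length front cameras), two side
# cameras, and one rear unit.
layout = [
    Camera("front", 0.0, 120.0),
    Camera("left", 90.0, 120.0),
    Camera("rear", 180.0, 120.0),
    Camera("right", 270.0, 120.0),
]
```

A layout passes when `coverage_gaps(layout)` returns an empty list; dropping a side camera immediately surfaces the uncovered headings.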

Similarly, simply pointing cameras to the side of a vehicle is not sufficient for adequate object recognition. Object detection and identification systems in autonomous vehicles can be based on categorizing objects within two-dimensional bounding boxes applied to camera images. In the case of lateral perception (cameras on the left and right of the truck), the angled views produce distorted images, and using multiple cameras together compounds the problem. Fortunately, TuSimple has anticipated this issue and created a novel solution. Our US Patent No. 10,685,239 B2 and US Patent No. 11,074,462 B2 describe a system and method for detecting and correctly evaluating the boundaries of objects to the side of a vehicle. The system transforms the incoming images from lateral cameras based on a line parallel to the side of the vehicle, which in turn allows accurate object extraction and appropriate application of bounding boxes. Additionally, the system can combine transformed images from multiple sensors to create a comprehensive view within the corrected perspective.
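To see why a perspective correction helps bounding boxes, consider mapping a detected box through a homography into a side-parallel image plane. This is a generic sketch of that idea, not the patented method; the function names and the matrices used in the example are assumptions.

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of pixel coordinates."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]  # divide out the projective scale

def rectified_bbox(H, box):
    """Map the four corners of an axis-aligned box (x0, y0, x1, y1)
    through H and return the tight axis-aligned box in the rectified
    (side-parallel) image plane."""
    x0, y0, x1, y1 = box
    corners = [[x0, y0], [x1, y0], [x1, y1], [x0, y1]]
    w = warp_points(H, corners)
    return (w[:, 0].min(), w[:, 1].min(), w[:, 0].max(), w[:, 1].max())
```

In the rectified plane, boxes drawn around vehicles alongside the truck stay axis-aligned rather than skewed by the camera's viewing angle, which is what makes downstream object extraction tractable.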

Our US Patent No. 11,023,742 B2 addresses the issues of rear-facing perception systems. This patent describes the benefits of rear-facing perception systems that require little or no installation on the many potential cargo holds or trailers that might be attached to the tractor. Such systems could provide complete rear perception without blind spots; wireless communication between the sensors and the control unit over methods such as Ultra High Frequency (UHF), Low Frequency (LF), and Radio Frequency IDentification (RFID); and simple, automated setup. While some elements, like the sensors themselves, must remain on the trailers, the permanent parts of the system could be located on the tractor itself, eliminating potential issues when swapping equipment between trailers.
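The "automated setup" idea can be sketched as a tractor-side registry that discovers trailer sensors by their wireless tag IDs. This is an assumption-laden illustration only — the class, method names, and behavior below are hypothetical and not taken from the patent.

```python
class RearPerceptionHub:
    """Hypothetical sketch: a tractor-mounted control unit that
    registers trailer sensors automatically by their wireless tag
    IDs, so no per-trailer configuration is needed when swapping
    trailers."""

    def __init__(self):
        self.sensors = {}  # tag_id -> link type ("UHF", "LF", "RFID", ...)

    def on_tag_detected(self, tag_id, link):
        # Idempotent: re-detecting a tag simply refreshes its link info.
        self.sensors[tag_id] = link

    def on_trailer_swap(self):
        # Clearing the registry forces rediscovery of the new
        # trailer's sensors on the next detection cycle.
        self.sensors.clear()
```

The design choice worth noting is that all state lives on the tractor: attaching a different trailer requires no reconfiguration, only a fresh round of tag detections.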

To ensure these systems are properly calibrated, our US Patent No. 10,837,795 B1 discloses a technique for calibrating cameras. This system performs precise calibration by analyzing distances to laser pulse groups and solving the resulting linear equations. These are just a few examples of the innovations TuSimple has achieved in perceiving objects all around both the tractor and trailer. The combination of these and other technologies gives our trucks a complete view of their surroundings, enabling better navigation and improving safety for everyone around our vehicles.
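As a generic illustration of the calibration idea mentioned above — recovering a position from measured distances by solving linear equations — the sketch below uses linearized trilateration. This is a textbook technique offered as an assumption, not the method claimed in the patent.

```python
import numpy as np

def locate_sensor(anchors, dists):
    """Estimate a sensor's position from measured distances to known
    reference points (e.g. laser pulse group positions). Subtracting
    the range equation for the first anchor from the others cancels
    the quadratic term, leaving a linear system that is solved in a
    least-squares sense."""
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    p0, d0 = anchors[0], dists[0]
    # For each remaining anchor i: 2 (p_i - p_0) . x
    #   = |p_i|^2 - |p_0|^2 + d_0^2 - d_i^2
    A = 2.0 * (anchors[1:] - p0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         + d0 ** 2 - dists[1:] ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With four or more non-coplanar reference points the system is well determined, and extra measurements simply tighten the least-squares estimate against noise.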

In the first quarter of 2022, 21 new patents were awarded to TuSimple, giving us a total of 408 patents globally. These latest patents cover a variety of fields including mapping, simulation, and automated assistance systems. TuSimple is consistently innovating autonomous driving technologies, offering practical solutions to numerous complex issues facing the industry. As always, our ultimate goal is to ensure the highest quality freight network while continuing to lead the industry in safety, reliability, efficiency, and sustainability.
