Cognitive boss on AV safety: ‘It’s about human life, not just big money’
What is Cognitive Technologies’ strategy?
Cognitive Technologies’ strategy for the future is based on maximising the technological effect of all the company’s developments. Our main competitive advantage is the development of artificial intelligence (AI)-based systems for ground transport. The flagship system, C-Pilot (Cognitive Pilot), now matches SAE Level 4. According to recent technical results, we are among the top three world leaders in this sphere. The ‘killer’ feature of our system is operation off-road and in snow, rain, fog and other harsh weather conditions.
How long have you been doing this?
For the last seven years. We now hold over 17 inventions and patents around the globe in sectors such as agriculture (harvesters, sprayers, tractors), rail transport (trains, locomotives and urban electric trams) and cars (city vehicles, trucks and mining dump trucks). Tremendous effort has gone into this, and now we can confidently state: ‘Yes, our system works in any weather and better than a human driver.’ Our strategic approach is based on data coming from video cameras and radars – not from lidar, which is used by most other developers around the world. I know it’s not the easiest way, but this is our choice. I believe you can’t dominate the world in this sphere by choosing easy ways.
Can you explain more about Cognitive’s ADAS work?
Our priority in this area is the dominance of our C-Pilot artificial brain in the world. We have already started working in this direction and have secured several large contracts with international manufacturers and Tier 1 suppliers. For example, for Hyundai Mobis we recently developed a unique software module to facilitate safe autonomous driving. The computer vision-based module can recognise various classes of objects moving along the road, such as cars, buses, motorcyclists, cyclists and pedestrians. The core technology is deep learning neural networks, which give the system the required recognition accuracy. Soon this module will be integrated into the control system of Hyundai luxury vehicles, making them even smarter and safer.
What about self-driving?
We are also working on the creation of a fleet of self-driving taxis for one big metropolitan city. All of the projects mentioned assume the industrial use of Level 4 automation. But all these areas will still undergo some very serious transformations due to the lack of elaborated legislation and certification for robocars in most countries. Therefore, for the next 15 years this kind of work will carry commercial risks.
Which territories are you targeting?
In the automotive sector we are interested in the markets of Korea, China, Germany and the US. Here we primarily offer Level 4 ADAS and the new-generation 4D imaging radar as an alternative to lidar. The radar detects objects at a distance of 300m across an azimuth span greater than 100 degrees. It’s the size of just two iPhones and costs $130. Even in the worst possible weather its object-detection accuracy is over 97.7%. In the sphere of railway transport the markets most interesting to us are Russia, China, Germany, Israel and Indonesia. In these countries we are developing autonomous control systems for locomotives and urban trams. For example, in Moscow we’ve already started a joint project with PC Transport Systems on the development of a fully autonomous urban electric tram. The autonomous tram can detect other vehicles and trams, traffic lights, pedestrians, tram/bus stops and switches on the tracks. The tram’s AI vision computer system is supplied with visuals from 10-20 cameras positioned around the tram, and data from as many as 10 radar sensors.
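To make the radar’s quoted coverage concrete, the figures above (300m range, a greater-than-100-degree azimuth span) can be read as a simple geometric field-of-view filter. The sketch below is purely illustrative: the function name, coordinate convention and the interpretation of the span as ±50 degrees are assumptions for illustration, not Cognitive Technologies’ actual radar interface.

```python
import math

# Illustrative sketch only: treats the quoted specs (300 m range,
# >100-degree azimuth span, read here as +/-50 degrees about the
# boresight) as a geometric filter. Names are hypothetical.

MAX_RANGE_M = 300.0       # quoted detection range
AZIMUTH_SPAN_DEG = 100.0  # quoted azimuth coverage

def in_field_of_view(x_m: float, y_m: float) -> bool:
    """Return True if a point (x forward, y left, in metres)
    lies within the radar's range and azimuth limits."""
    distance = math.hypot(x_m, y_m)
    if distance > MAX_RANGE_M:
        return False
    azimuth = math.degrees(math.atan2(y_m, x_m))  # 0 deg = straight ahead
    return abs(azimuth) <= AZIMUTH_SPAN_DEG / 2

print(in_field_of_view(250.0, 0.0))    # straight ahead, within range: True
print(in_field_of_view(100.0, 150.0))  # ~56 deg off-axis, outside span: False
```

A point 250m straight ahead passes both checks, while one 56 degrees off-axis fails the azimuth test even though it is well within range.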
How do you see AVs developing?
It seems to me that the world has now shaken off the hype foam from this fascinating topic of self-driving cars and has finally taken up metrics seriously. After several fatal crashes on the roads, people have finally realised that the denominator is human life, and not just big money. Companies started looking at accuracy and security. After so much empty talk about imminent Level 5 automation and magical cheap lidars that will work in any weather, people now focus on real practice and real projects. Of course, we all still need to solve a huge number of moral, social and legal problems. But the main contours are already visible: Level 4 ADS must see and understand the scene with an accuracy of 99.99% – and not less. A lot of interesting stuff is happening in the segment of hardware for AVs. For the operation of neural networks, on one hand, we need very compact and affordable hardware; on the other hand, these automotive processors should be very powerful. With great interest we are now testing the latest NVIDIA products. I think the leader in this zone for the coming years has already been determined.
So how soon will Russia see big uptake of C/AVs?
Russia is now very actively engaged in the smart agriculture market. In 2019 alone, autonomous combine harvesters reached the first five regions of the country. And it was in Russia that we signed the largest contract in the world, with Rusagro Group, for equipping 800 combine harvesters with autonomous driving systems. So robotisation and competition in the industry are advancing rapidly. With such extensive land use, this will definitely deliver a significant economic effect. We have also begun robotising the Russian Railways locomotive fleet (over 12,000 machines). On the motor roads only a few experiments are being conducted: the legislation is not yet ready for these kinds of tests. But interest from the Russian government in C/AVs has become very high. I hope that soon enough Russia will catch up with the US and China in both legislative support and investment in the field of AVs.
Is there much crossover between Yandex and Cognitive?
Yandex is involved in the self-driving taxi project. Cognitive Technologies produces ADS for all types of ground transport. So, just as we have no overlap with Uber, for example, I don’t see any overlap with Yandex. It’s playing on a different field. We’d be happy to support Yandex if they decide to do something really ‘industrial’.
Finally, what has The Beatles’ Abbey Road got to do with all this?
Well, this is just a joke from our developers. Although our team is very young (the average age is 27), our guys are very fond of The Beatles. One of the project managers is a real fan and has a great collection of vinyl records at home. So one day they decided to run the Abbey Road cover through our neural recognition system, and here is the result. The robot accurately detected and recognised all the objects on the cover and helped the Liverpool Fab Four cross the road safely. Let The Beatles live in our hearts forever.