The Future of Transportation - Panel Discussion @ 2019 Robin.ly AI Commercialization Conference

Updated: Jul 11, 2019

Robin.ly 2019 AI Commercialization Conference featured an inspiring panel discussion about the future of transportation with tech leaders from promising autonomous driving startups.


Panelists include:

  • Alex Ren, Panel Moderator (Founder of Robin.ly & TalentSeer, Managing Partner of BoomingStar Ventures)

  • Ruslan Belkin (CTO @ NAUTO; Former VP of Engineering @ Salesforce & Twitter)

  • Louay Eldada (CEO & Co-founder @ Quanergy)

  • Nalin Gupta (Co-founder of Autonomous Driving Division (Auro) @ Ridecell; Forbes 30 Under 30)

  • Tao Wang (Former Co-founder of Drive.ai)


Panel Discussion on The Future of Transportation, Robin.ly AI Commercialization 2019 (Panelists from left to right: Louay Eldada, CEO & Co-Founder @ Quanergy; Nalin Gupta, Co-founder of Autonomous Driving Division (Auro) @ Ridecell; Ruslan Belkin, CTO @ NAUTO; Tao Wang, Former Co-founder @ Drive.ai; Alex Ren, Founder @ Robin.ly & TalentSeer)

Key topics for panel discussion:

  1. The readiness of urban infrastructure for autonomous cars is one of the key criteria determining the success of the industry. How do you see the combination of autonomous cars and future infrastructure?

  2. Elon Musk once said "Anyone relying on LiDAR is doomed." However, Aurora also announced the acquisition of Blackmore for its LiDAR technology. What's your comment on this?

  3. Modeling and recognizing driver behavior based on driving data is critical to improving fleet safety. How can we actually convince drivers to trust the system?

  4. Keeping pedestrians, drivers, and the riding public safe is the primary focus of all future transportation. How can we promote safety without sacrificing opportunities to innovate?

  5. There are AI-backed companies such as Waymo and Zoox; there are manufacturing companies such as GM and Ford; there are ride-service companies such as Uber and Lyft. How do you choose your path to commercialization?

Listen to the podcast on: Apple Podcasts | Spotify | Google Play Music


Robin.ly is a content platform dedicated to helping engineers and researchers develop leadership, entrepreneurship, and AI insights to scale their impact in the new tech era. Subscribe to our newsletter to stay updated on more inspiring talks and exclusive events.



Transcripts

Alex Ren:

I will let each of you introduce yourself within two minutes. When you do, please include what your company is doing and also your ideas about the future of transportation. Okay, I will start with Louay.


Louay Eldada:

Certainly. I'm Louay Eldada, the CEO and Co-founder of Quanergy. We are a Silicon Valley company based here in Sunnyvale. We are known for making solid-state LiDAR, but we do a lot more than that. We make the solid-state LiDAR that goes in the vehicle, and we also make LiDARs that support smart infrastructure and smart cities for smart transport. We also do software that relates to autonomous vehicles, so we work with people throughout the autonomous vehicle ecosystem and get exposed to the landscape from different perspectives. We operate globally, we sell globally, and we have 12 offices globally. So, happy to share our thoughts today. Thank you.


Well, autonomous vehicles are going to happen soon, regardless of differing opinions. The question is whether you have to use this or that sensor, or this or that approach, when you build your software stack. That's just the process, right? As for the outcome, there's no doubt that we will have fully autonomous vehicles. Today, fully autonomous level five vehicles are available in geo-fenced areas. The challenge is, when you are on public roads in a chaotic environment, what hardware and software really do the job, avoid unnecessary accidents, and actually eliminate accidents that would otherwise have happened. So, the timing for fully autonomous vehicles is around 2025. But levels two, three, and four, with level five being full autonomy, are happening starting in two years. Levels three and four are coming, and level two already exists.


Nalin Gupta:

Yeah, thanks. My name is Nalin. I'm a Co-founder of Auro Robotics, which, after being acquired by a company called Ridecell, has essentially become the autonomous driving division within Ridecell. We work on autonomous vehicles; more specifically, we develop the software backbone of autonomous driving technology. I will keep it short for the questions, but speaking about the future, I couldn't be more excited about it, not just in terms of autonomy, but also all the things happening in shared mobility and electric mobility. I think we are at a fantastic time where all these innovations are coming together, so I'm really excited about the whole industry. I won't put a specific time on when we will see autonomous vehicles, because there are so many arguments and opinions about it. But in some cases, we will see autonomous vehicles much sooner than in others.


Alex Ren:

So, you guys mostly work on geo-fenced ride-share services. Is that how you define it?


Nalin Gupta:

That's how we actually started back in 2015. When we started this company, we were focusing on developing autonomous vehicles for campus-like environments, such as a university campus or a retirement community. Since then, as our technology has become more and more mature, we have moved from those campus vehicles to public-road vehicles. Now we are focusing more on large fleets: how we can enable low-hanging fruit like empty-car logistics across massive fleets on public roads.


Ruslan Belkin:

I'm Ruslan Belkin. I'm CTO of Nauto, which actually stands for Network Auto. What we do is make driving safer for commercial fleets by monitoring inside and outside of the vehicle. We are in more than 250 fleets right now across Europe, Japan, and the United States. As for my opinion on the future: the future is very hard to predict, but we are 90% there as far as autonomous vehicles go, and we have 90% to go. In terms of careers specifically, I think it's a great move because it's a secular trend; whatever you get into that has to do with autonomous vehicles, you'll benefit one way or another. In terms of investors and making money, I think it's not going to be as straightforward. You have to pick the right vertical markets. If you consider things like working on tools, working on things that enable autonomous vehicle development, that, in my opinion, is a more fruitful path to monetization, or something like what we do, which is a fairly direct path to monetization, rather than barging straight in and trying to build an autonomous car, although some people will obviously succeed at that as well.


Tao Wang:

I'm Tao Wang, and I gave a presentation just now, but I'll talk about myself a little more. I graduated from Stanford with a master's degree in CS. I actually went into the PhD program but didn't finish, because my colleague and I wanted to do a startup. The startup is called Drive.ai, and it is doing level four self-driving. I left Drive.ai a couple of months ago, and I'm exploring new things to do right now. In terms of the future of self-driving, I agree with the other guests that self-driving will eventually become reality, but I think it's going to take a while. Right now, the AI is not sufficient to support level four or level five, ubiquitous self-driving everywhere. You might see some niche markets deploying self-driving cars in the next few years, but large-scale deployments are just not there yet. I think we as an industry might need to take some more contrarian approaches to achieve self-driving, both in terms of technology and go-to-market strategy.


Alex Ren:

Okay, thank you. Thank you very much. We're talking about autonomous driving and the future of transportation, and one question is: how important is urban infrastructure? What is the role of the infrastructure? It might be easier in many Asian countries, but probably not in the U.S. How do you see the combination of autonomous driving cars and future infrastructure?


Louay Eldada:

Yeah, happy to talk about this. We actually are deploying smart city infrastructure globally, as we see it as a prerequisite to having fully autonomous vehicles: you need vehicle connectivity. And when you say that, it implies having an IoT infrastructure with 5G connectivity that will allow smart transport vehicles to communicate with each other, so vehicles can see around the corner. You can also control traffic flow when you have visibility into what's happening on the road. So, deploying the hardware and software that support smart cities is necessary before we can see autonomous vehicles on the road. I just came back from China literally a couple of hours ago, and just to show you the role that legislation and regulation can play, there are already self-driving lanes on the roads in China. Smart city infrastructure is being built right now in China, and 5G is being rolled out. So regardless of how highly we think of ourselves in Silicon Valley, I see China as leading the way here.


Nalin Gupta:

Well, what he just said makes a lot of sense: smart infrastructure and V2X communication are definitely going to help. But there are also a lot of other things we can do in terms of infrastructure. For example, one of the reasons for doing self-driving cars is to give the freedom of mobility back to people with disabilities and the elderly, and in that sense, I think there's a lot of work to be done in making infrastructure more accessible for people with disabilities. Even very basic things like pavement markings: if you have to develop an AV that can be deployed across multiple geographies, you need some standardization in how the lane markings are done. For example, there are four-inch and six-inch markings and different categorizations; if we had a minimum reflectivity and a minimum width, that would add a lot of value. There are also some companies working on putting QR codes in traffic signs and using special materials for lane markings that can be read not only by humans but also by smart cameras in otherwise very challenging conditions like rain, fog, or heavy glare. These types of things also need to happen to make autonomous vehicles that can work very nicely alongside human-driven vehicles.


Ruslan Belkin:

It is difficult to disagree with what my colleagues say here. It's going to happen outside the United States. In the United States, we don't invest in infrastructure, and therefore nothing's going to happen in terms of smart cities here. And it could be an opportunity, because the reason you see so many advances coming out of Silicon Valley, including self-driving, is because our infrastructure is so bad. So, there is a silver lining here. But if I had to say where the opportunities are here, it is probably in parking. Self-driving vehicles, not even fully autonomous ones, will be able to park themselves. With parking space at a premium, being able to stack cars in a parking lot without human involvement, and building infrastructure within and around parking structures to support that kind of navigation, I think that's where the first infrastructure opportunities are going to happen, because it will enable private investment.


Tao Wang:

Yeah, I'm going to take a slightly different view. I think infrastructure is important, and it can really help self-driving. But on the other side, infrastructure is expensive, right? There's a reason why we sit here and talk about infrastructure while nothing's happening out there: nobody wants to put up the money. I think we really need to think about which part of the infrastructure gives us the biggest marginal improvement for self-driving. For example, if you pave the road a little better, does that actually help? I think right now self-driving faces issues not so much with the roads as with the other participants on the road. Can you actually make the infrastructure such that other agents on the road are much more predictable than they are now? Say we could rule out jaywalkers in cities with some infrastructure change, I don't know how; then the whole problem of pedestrian detection becomes irrelevant, because nobody is going to just walk in front of the car, it's physically impossible to jaywalk, and you only cross at bridges or designated crosswalks, which can be mapped so the car knows about them in advance. Another example: can you actually put V2X on the traffic lights? Many of us think traffic light detection is solved, but if you really want to solve it 200%, it's not solved today. On top of camera detection, can you have another layer of redundancy that gives you a redundant signal, so that you can at least compare your prediction with what the wireless is telling you? So, I think we really need to think about what's the highest leverage in terms of changing the infrastructure.
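To make the redundancy idea above concrete, here is a minimal, hypothetical sketch of cross-checking a camera-based traffic-light classification against a V2X-broadcast state and falling back to the most conservative interpretation when they disagree. The class names, confidence threshold, and fusion rule are illustrative assumptions, not any panelist's actual system.

```python
# Hypothetical sketch: fuse a camera traffic-light detection with a V2X-reported
# state. Any unresolved disagreement is treated as red (stop), i.e. the most
# conservative interpretation wins. All names and thresholds are illustrative.

from enum import Enum


class LightState(Enum):
    RED = 0
    YELLOW = 1
    GREEN = 2
    UNKNOWN = 3


def fused_light_state(camera_state: LightState,
                      camera_confidence: float,
                      v2x_state: LightState,
                      min_confidence: float = 0.9) -> LightState:
    """Combine camera and V2X readings; on disagreement, behave conservatively."""
    if camera_state == v2x_state and camera_state != LightState.UNKNOWN:
        return camera_state  # both channels agree
    if camera_confidence < min_confidence and v2x_state != LightState.UNKNOWN:
        return v2x_state     # weak detection: trust the infrastructure signal
    return LightState.RED    # unresolved conflict: stop


if __name__ == "__main__":
    # Camera confidently sees green, V2X says red -> the fusion returns RED.
    print(fused_light_state(LightState.GREEN, 0.95, LightState.RED))
```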


Alex Ren:

Okay. So, the next question is about what the famous Elon Musk once said: "Anyone relying on LiDAR is doomed." That's the famous statement. However, Aurora just announced the acquisition of Blackmore for its LiDAR technology. So, what is your comment on these smart sensors? Do you think LiDAR is needed? I will start with Louay.


Louay Eldada:

Yeah, I mean, it's a comment that almost does not deserve a reply. Because what is the logic, or what are the physics, being presented as a reason for that statement? None. Zero. Because there are none. Right? So, Elon Musk is completely off; he was completely wrong with that statement. He made a statement three years ago that I supported, when he said LiDAR is the most capable sensor but it's overkill. By saying it's overkill, you're saying it's more capable than other sensors, which we obviously agree with, but that it's too expensive. Three years ago, that was the case: when you had to pay tens of thousands of dollars for a LiDAR, it didn't make sense. When the LiDAR price is similar to the price of the car, obviously it does not make sense. Today, for instance, we make solid-state LiDAR in quantity based on silicon CMOS, and the price ranges from several hundred dollars down to a few hundred dollars, depending on volume. So, the fundamental premise of his argument, that it's too expensive for the job, is completely wrong today. He needs to stay up to date with the developments and really listen to his own engineers.


Ruslan Belkin:

So, I think it depends on what position you look at it from. If you look at it from Elon's point of view, I think there's merit to what he's saying. If you're doing a production car that is not fully autonomous, you have a driver there and the car has to look good. LiDAR is ugly, right? Especially if you have several of them, it's going to add cost, and you're willing to live with a safety margin that is perhaps not foolproof. That's the business trade-off. You can say, I'm going to try to do it without LiDAR; obviously, I'll be at a disadvantage at night, and maybe some other sensors will appear in the picture. I'm personally excited about the shortwave infrared sensors that are coming. But if you're doing a robotaxi, or a bus, or for example a truck, where looks don't matter, cost doesn't matter as much relative to the value of the vehicle and the safety required, and there may not be a safety driver: would you want to have a school bus without all the sensors possible? Probably not. So, it depends on how much risk you are willing to take. I think Elon is willing to take more risk than other people, and you can see his point of view here. So that's a more nuanced answer, if you will.


Nalin Gupta:

I encourage people to take different approaches; ultimately it will come down to whether you can statistically prove that your self-driving car is 100% or 200% safer than human beings. The main arguments against LiDAR are the cost and, in some cases, the moving parts, and the industry, to be honest, is making immense progress on both fronts. The third type of argument I sometimes hear is that LiDAR is very poor at detecting or interpreting the behavior of a pedestrian standing on a curbside, in comparison to a camera: with a camera, you can detect whether he is just going to fidget with his mobile phone or is going to cross over. In response, no one is saying that LiDAR is the only sensor that is going to be on the car; it's going to be a supplemental sensor in addition to the cameras, so no one is removing cameras from the picture. And I can easily give hundreds of examples where cameras are going to fail, like the sun glaring right into the camera lens or a heavy rain situation. These are scenarios where the camera is definitely going to fail, and that's the reason why even Tesla uses radar. But radars also have very poor capabilities for detecting crossing objects, and they have a very limited vertical field of view, which was the reason for that infamous Tesla accident. So definitely, we eventually need more sensors besides just cameras. Tesla is collecting a massive amount of data, and with computer vision and deep learning they are making immense progress, so maybe they will be able to break that barrier and get their algorithms to a stage where they can do everything with cameras. But we are not there yet, and we don't know when we will be. So, if we have to deploy safe and reliable autonomous vehicles now, or in the next couple of years, I think we need more sensors than just cameras and radars.


Tao Wang:

Yeah, I'm going to make a strong argument here: anyone who only uses cameras will not be able to remove the driver eventually. This is my own take. The reason is, if you have seen my presentation, you have probably seen how deep-learning-based computer vision can fail in very funny ways. With a reasonably sized patch on yourself, you can actually appear invisible to a pedestrian detection algorithm, and this kind of thing scares me a little bit when I think about a car without a driver trying to detect all the human beings around it using this kind of technology. Not that I'm against it; I think it's a very cool and great technology, and it gets us to 99%. But 99% is just not enough for level four self-driving cars. The reason I think Tesla and other OEMs are able to use camera-only systems is that right now the liability is still on the driver: when anything happens in these systems, even on autopilot, the driver is still responsible. I think eventually we need some physical guarantee on the detection of a human or whatever object it is, and right now LiDAR gives us that guarantee on 3D position detection. Maybe one day some other sensors will get there; I know some radar companies are working towards higher spatial resolution, and maybe one day they will be as good as LiDAR. So, I can't say LiDAR will definitely be the solution, but right now I don't see another sensor that can beat LiDAR on its own merits.


Louay Eldada:

If I may add, LiDAR is the only 3D sensor, with the camera being 2D and the radar being 1D. Then you add the fact that with a laser-based system, where you have a minimally divergent laser beam, you maintain resolution and accuracy, and you see in the dark. Look at one scenario, for instance: a car breaks down under a bridge at night. The radar cannot tell the car from the bridge, and the camera sees absolutely nothing in the dark. There you go. So, what do you do? Is that an acceptable situation? Do you want to have your kid in a car that's driving without a driver, run into that situation, and agree to it because statistically there's an advantage? Autonomous vehicles will have to be a lot more responsible than vehicles that have a driver. We cannot just say the fatality rate will drop; that's not acceptable. Because when software engineers, who have all the time in the world to think about how to design their algorithm, decide in advance who they are going to kill in a given situation, you have a big, big moral issue. Whereas when a driver does their best in the situation and maybe still ends up hitting someone, that is more commonly accepted. So, the future has to be much, much safer, not just safer than the present state of things.


Alex Ren:

Okay. Thank you. So, for many L2 or L3 applications, modeling and recognizing driver behavior based on driving data is critical to improving fleet safety. And one of the issues is: how can we actually convince drivers to trust the system? For me, I have some features in my car that provide me some aid, but I'm a little bit hesitant.


Nalin Gupta:

Well, I can give you an example. One of my colleagues was feeling drowsy, and with such an advanced system, the car started beeping when he started falling asleep. That feature literally saved his life. So, for people like him, it will be very difficult to argue why they wouldn't trust such a system.


Ruslan Belkin:

We actually encounter this problem frequently when we introduce our products into commercial fleets. Initially, drivers don't trust it; or rather, they don't want to be watched. They don't want to be monitored in any way, and they only come around when something happens close to them and they say, oh, okay, well, yeah, I actually wasn't feeling good, I was there, and that basically saved me. So, it's simply going to take time. And you're right, understanding driver behavior and the behavior of other actors is going to become increasingly important. You see lots of research going on around driver intent, actor intent, and pedestrian intent, and all these components are going to be very important for understanding what people are going to do around you, especially as we start seeing a mix of autonomous and non-autonomous vehicles.


Alex Ren:

I remember you guys just released a kind of prevention feature recently. Can you explain a little bit about that?


Ruslan Belkin:

It was actually released a while back. What it does is monitor the driver and what is going on inside the cabin: where the driver is looking, whether the driver is drowsy, whether the driver is on a cell phone. It also looks outside: am I tailgating somebody, am I merging onto the freeway, am I in a situation that requires a reminder? It's not just visual; it also basically measures reaction time and other things, whether the margin is enough, given the driver's current condition, to react to the situation, and it alerts in different ways. And that actually does reduce at-fault collisions; we have proof through insurance that it does. That's how we make money, as a matter of fact.
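As a rough illustration of the kind of rule-based coaching described above (drowsiness, phone use, tailgating), here is a minimal sketch. The signal names, thresholds, and alert messages are assumptions for illustration only and do not represent Nauto's actual product logic.

```python
# Illustrative sketch of rule-based in-cab alerting. Signals would come from
# in-cabin and road-facing perception; the thresholds below are made up.

from dataclasses import dataclass


@dataclass
class DriverState:
    eyes_closed_seconds: float   # how long the driver's eyes have been closed
    looking_at_phone: bool       # in-cabin camera inference
    following_time_s: float      # time gap to the vehicle ahead
    speed_mps: float             # vehicle speed in meters per second


def alerts(state: DriverState) -> list[str]:
    """Return the alerts that should be surfaced to the driver right now."""
    out = []
    if state.eyes_closed_seconds > 1.5:
        out.append("drowsiness: audible alarm")
    if state.looking_at_phone and state.speed_mps > 5:
        out.append("distraction: phone use while moving")
    if state.following_time_s < 1.0 and state.speed_mps > 10:
        out.append("tailgating: increase following distance")
    return out


if __name__ == "__main__":
    print(alerts(DriverState(eyes_closed_seconds=2.0, looking_at_phone=False,
                             following_time_s=0.8, speed_mps=25.0)))
```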


Alex Ren:

Okay, yeah. So, the next question: since safety is a primary focus of all future transportation, how do we promote safety without sacrificing innovative opportunities? What is your perspective on this? On one side, you want to try something new, but you also need to consider the safety issue.


Ruslan Belkin:

Look, I think one of you said it: while the liability is on the driver today, you can get away with a lot of things. As liability shifts towards the company that owns the car fleet, or towards the vehicle manufacturer's name, you're going to get into a Boeing 737 situation, right? There are going to be a lot more stringent certification requirements. So you have to work this balance, and that shift will start occurring; you want to be on the right side of it. And I think the opportunity is not only in improving software but also in building software, for example, for certification. There are a lot of hidden opportunities here, and they will present themselves as this transition starts happening.


Louay Eldada:

Yeah, I think the rollout of autonomous vehicles will be gradual, not only in terms of levels two, three, four, and five, but also in terms of how you're actually allowed to roll out fully autonomous vehicles. Even the most innovative companies don't say, we have the most innovative hardware and software solution and we can take the driver out. What's being done today, and I've seen these, is that you have control rooms where, for instance, one not-too-expensive person, instead of a driver, sits in front of multiple screens watching maybe three vehicles on average, and they take over when there's a disengagement. We are very far from getting to a disengagement rate of zero, where the car never says, I have no idea what to do here. That's going to take a very, very long time, possibly decades. So, until then, you can have a single person looking at multiple screens who can take over and make the decision for the car when you still run into a scenario where a human can make a better decision. And this is actually one of the many reasons why self-driving cars will not mean lots of people out of work. There are so many jobs related to rolling out self-driving cars, not only on the innovation side, creating the hardware and software, but also the whole ecosystem of supporting the rollout in a very responsible way.


Nalin Gupta:

I agree with them. And frankly, this is not the first time we are seeing this tug of war between innovation and safety. One gentleman over there mentioned the airline industry in one of the previous Q&As, right? It's the same case: there was a lot of discussion about how you innovate and, at the same time, how you balance out safety. And the example Louay gave about having a closed-loop system, where even though you have a safety driver, you also have another person, whether it's a teleoperator, a remote operator, or a second person sitting in the passenger seat, that's really crucial. If that had been the case, the infamous Uber incident could have been avoided. A lot of those things can be done. GM Super Cruise is a very good example, where they've rolled out an autopilot system, but unlike Tesla, they are doing it in a much more responsible way: they have an IR camera tracking the state of the driver. In that case, you cannot fool the car by just putting your hand on the steering wheel while you are really looking out the window. Those are ways you can do it, where you balance out the safety and the innovation.


Alex Ren:

So, I believe you guys are working with lots of OEM companies to provide this kind of ride service. What is your way to manage risk? Do you work with insurance companies?


Nalin Gupta:

Yeah, so we work with a lot of insurance companies to figure that out. In fact, with the largest insurance company, Munich Re, we were the first company to kick-start an autonomous vehicle insurance program. And we work closely with them to figure out how we can statistically measure the risk of introducing these autonomy features in cars.


Tao Wang:

I think the whole industry needs to answer one question: is the current testing procedure safe? All the self-driving car companies right now, how are we testing the self-driving cars? We are putting our technology onto the road and testing the system to its limit, until it cannot handle any more and the human driver takes over. We call that one disengagement, and every year all the companies report these disengagement numbers to the DMV. I think this is not a very good way of testing, because it's essentially using all the road users as lab rats for technology that's not mature enough. It's analogous to car manufacturers rolling out cars onto public roads without any testing, waiting until they somehow kill someone, and then trying to fix the problem; that's just not acceptable. So, I think the direction the industry should push toward is more standardized tests in closed or private environments, so that you can test a bunch of different scenarios without sacrificing the complexity of those scenarios, but at the same time you don't cause unsafe conditions on public roads. This is something everyone needs to think about.
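For reference, the disengagement reports mentioned above are commonly summarized as autonomous miles driven per disengagement. A minimal sketch of that calculation follows; the numbers in the example are made up for illustration.

```python
# Minimal sketch of the miles-per-disengagement metric derived from the
# autonomous miles and disengagement counts that companies report.

def miles_per_disengagement(autonomous_miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between human takeovers."""
    if disengagements == 0:
        return float("inf")  # no takeovers recorded in this reporting period
    return autonomous_miles / disengagements


if __name__ == "__main__":
    # e.g. 150,000 autonomous miles with 60 reported disengagements
    print(f"{miles_per_disengagement(150_000, 60):,.0f} miles per disengagement")
```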


Ruslan Belkin:

I want to add something to that, because it's close to what we are trying to do as well. Indeed, this disengagement-rate approach is like Facebook's approach to testing, testing in production. There's nothing wrong with that for Facebook. But if you think of a car company, automotive vendors have simulators, with hardware in the loop, that they use today for testing cars, and the simulation software is just not very good. The second problem is that there's not enough realistic data for simulation, so those solutions are just not fully working. So, I think there's an opportunity to acquire a lot of data from driving environments, without taking over control, to feed into simulation software and to make certification much more realistic by using scenarios that actually happened. And that data will come. If you look at regulation, where does it normally originate? It usually originates in Europe. If you look at the European agencies working on this regulation, you will absolutely have certification for AV software there, and once it's in Europe it will come here.


Alex Ren:

So, Tao, in your previous presentation, you mentioned that the remaining 10% of the work might take forever, right? So then, how do you actually solve the unlimited number of corner cases? What will be the solution?


Tao Wang:

I don't have a solution right now. It's really a multi-trillion-dollar question, and if you have the answer, come to me and we can work something out together. But at a high level, what I'm envisioning is that maybe there's a way to capture all those scenarios with one unified method, without having to see all the examples. One example would be: if you have someone standing right in front of you, your LiDAR should pick up that there's something out there. I don't know if it's a person or a tree or a garbage can, but I don't care; my vehicle doesn't want to hit that thing. That kind of guarantee, based on physics and mathematics, I think is going to be very important.
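As a toy illustration of the class-agnostic, geometry-based guarantee Tao describes, the sketch below flags the path as blocked whenever enough LiDAR points fall inside a simple corridor in front of the vehicle, regardless of what any classifier says they are. The corridor dimensions and point threshold are arbitrary assumptions.

```python
# Toy illustration: treat any sufficiently dense cluster of LiDAR returns in
# the ego vehicle's immediate path as an obstacle, independent of object class.

import numpy as np


def path_is_blocked(points_xyz: np.ndarray,
                    corridor_length_m: float = 15.0,
                    corridor_half_width_m: float = 1.2,
                    min_points: int = 5) -> bool:
    """points_xyz: (N, 3) array in the ego frame, x forward, y left, z up."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    in_corridor = (
        (x > 0) & (x < corridor_length_m) &
        (np.abs(y) < corridor_half_width_m) &
        (z > 0.2) & (z < 2.5)   # ignore ground returns and overhead structures
    )
    return int(in_corridor.sum()) >= min_points


if __name__ == "__main__":
    cloud = np.random.uniform(-30, 30, size=(10_000, 3))  # synthetic point cloud
    print("blocked:", path_is_blocked(cloud))
```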


Nalin Gupta:

Well, yeah, it's a very difficult problem to solve, and for me personally, that's the whole excitement of working in this field: it's still a very difficult problem, and every single day I get to work on these very challenging problems. Though, as Tao mentioned very briefly at the beginning, there might be very niche applications where the number of such corner cases is much smaller compared to full L4 or L5 autonomy. In those cases, I think it might be fairly low risk to deploy autonomous vehicles in the beginning. One example is, of course, the way I started my own company: deploying in controlled, campus-like environments, where the speed of the vehicle is very low, there is not much pedestrian traffic, and the speed of the other vehicles, or even the ego vehicle itself, is not more than 10 or 15 miles per hour. Another type of low-hanging-fruit application is empty-car logistics, where, let's say, a rental car is delivered to your doorstep autonomously, or a car is repositioned from a cold-demand area to a hot-demand area autonomously with no passengers inside, or delivery applications, for example Nuro-type robots, which are really small delivery robots. These types of applications are low-hanging fruit, where the number of corner cases and the risk might be lower.


Alex Ren:

Okay. So, the last question before I open the floor. Since L4 or L5 autonomous driving is so hard, people choose maybe more realistic ways to monetize and commercialize this technology, right? Such as restricted scenarios and environments, geo-fencing, or maybe autonomous trucks and logistics. What is your opinion on the possible paths to commercialization?


Ruslan Belkin:

So, maybe I'll start then. We actually saw this at the beginning of the company, and I think we made the right call when we decided, okay, we have to find a way to make money while this thing plays out. So that's one path, which is what we do. I think there are other paths. I'd say an even more successful path here has been the labeling companies. If you think about who made money in all this gold rush, it's been the tool makers, right? And labeling companies caught that wind, doing all the labeling for autonomous use cases, if you do it right. That's another path. Then there are certification and simulation: you see a number of simulation startups, though it hasn't quite worked through the full chain yet, but I think there's a good opportunity in those types of tools. That would be another path to commercialization. I don't know what you guys think.


Tao Wang:

Yeah, I totally agree that when there's a gold rush, you just sell the shovels and you can make a ton. But I also think this business model is only sustainable if there is actually gold there; after a while, if nobody finds gold, you can't sell shovels anymore. Also, in the entire value chain, it's important to position your company as the part that captures most of the value. If there's not much differentiation in the thing you're doing, you will be in a pretty bad spot, because there will be 20 other competitors doing the same thing, and at that point you're just competing on price, right? So, what's the core technology, or access to market, or access to customers, whatever it is, what's the core competitive advantage of your company? That's also something to be considered.


Louay Eldada:

I think commercialization of some capabilities is happening already, right? Autonomous valet parking, traffic jam assist, and autopilot on highways for long rides. Those also happen to be the situations where you get the most accidents. When you're in a traffic jam, that's when people are texting that they're running late or something. When you're on a long, boring road, that is when you might fall asleep, or actually be tempted to do some work, and so on. Many of the scenarios that cause a lot of fatalities and accidents are actually the easiest ones to address soon, and to address in a very responsible way. So, commercialization is already happening. You might call all of that partial level five. Full level five in any environment, including Mumbai, will happen beyond 2050, just to make sure you all realize that I'm not saying this is going to happen soon. Because the day people in Mumbai (sorry, I'm picking on Mumbai now) stay in their lane, don't run red lights, look right and left and wait, and pedestrians only cross when it says walk at pedestrian crossings, I don't have any visibility into that ever happening in my lifetime. So you have to think globally, because you're going to sell the same car globally. However, valet parking, traffic jam assist, and so on, given the environment and the limited capability needed, can be commercialized globally today.


Nalin Gupta:

I agree with him that applications like valet parking, traffic jam assist, and highway driving can be commercialized today, or in the near future, compared to L4 and L5. Also, as I was mentioning earlier, empty-car logistics and delivery applications are, again, applications that can be enabled today. As far as the question of making money is concerned, I think the autonomous vehicle industry is going to disrupt so many other industries that making money is going to be the least of anyone's concerns. We can always come up with advertising-based revenue models, content-based revenue models, or just turning the vehicle into a moving office. Making money out of this technology is not going to be an issue. I think the whole industry right now is trying to figure out how to make it safer than human driving.
