In this Leadership Insights video interview, Amanda Razani speaks with Dijam Panigrahi about how AR/VR and AI/ML technologies are working together to improve a multitude of industries.


Transcript Text

Amanda Razani: Hello, I’m Amanda Razani and I’m excited to be here today with Dijam Panigrahi. He is the co-founder and COO of GridRaster. How are you doing today?

Dijam Panigrahi: I’m doing great, Amanda. Thanks for having me here.

Amanda Razani: Happy to have you on our show. So can you share a little bit about GridRaster and the services that you provide?

Dijam Panigrahi: Absolutely. I’m Dijam, one of the co-founders of GridRaster. At GridRaster we provide the capabilities and the technical platform required to take augmented reality, virtual reality and AI to scale, primarily focusing on industrial enterprises and leveraging all the good things the cloud brings.

Amanda Razani: Wonderful. Well, these are all topics that our audience enjoys hearing about, and I know augmented reality and virtual reality are big tools in the enterprise. Can you share how they’re being utilized in manufacturing?

Dijam Panigrahi: Absolutely. We have been hearing about AR and VR for quite a while, and the question that persistently comes up is: are they actually being utilized? The answer is yes, in a big way. The application we see scaling most nicely is training; augmented reality and virtual reality are being used heavily for training purposes.
For maintenance operations, think of being able to repair an aircraft at a remote site with an expert sitting somewhere else, using this technology to resolve issues much faster. That is huge, and those deployments are scaling quickly. We are also beginning to see these use cases in the operational domain. For example, you want to inspect quality during a repair process or detect where the defects are, which humans usually have to do visually. With the technology convergence we are seeing across AR, VR and AI, you can now detect defects, map them, visualize them contextually on the head-mounted displays (HMDs), and dramatically speed up resolution. There is a huge improvement that we are seeing across the board.

Amanda Razani: Can you share some of those improvements or final outcomes? Where can the biggest differences be seen?

Dijam Panigrahi: Absolutely. Take training, for example, with one of the big defense primes we are working with. Basically, we are seeing a shortage of labor across the board in manufacturing, and one of the big challenges is: can you upskill somebody more quickly than we have in the past? That is one of the most effective ways to narrow the gap, right?
So what’s happening is that with a headset like a HoloLens, you can take somebody who is maybe one year into the job and have them outperform somebody who has been doing it for five years, because the HoloLens overlays the instructions for what needs to be done. You’re removing the guesswork from the entire operation and giving the user confidence that what they’re doing is correct and that they’re not missing any steps.
All of that is a big boost, and what we are able to see is that retention using this medium has been around 90%, compared to a normal video-based training environment. So there’s a huge improvement on the training side.
On the operational side, as I was saying, you’re comparing somebody who is one year into the job with somebody who is five years into the job, so you have a novice and an expert. In that defense prime’s environment, the novice was able to outperform the expert by utilizing that device. That’s the kind of impact you’re seeing with this medium.
Then on the maintenance side, by providing that contextual information while you’re maintaining an aircraft, you can bring down the time it takes to repair or maintain something by almost 50%. It varies, from 30% to even 80% improvements that we have seen, but on average, across the multiple engagements we are in, you’re getting somewhere around a 50% improvement in maintenance time.
And the beauty of this technology is that because the camera is constantly seeing what you see, with AI you can do QA while you are actually doing the job, ensuring that you eliminate human error, which was usually one of the biggest things you could not avoid. That has been fantastic. Overall, as a technology, not only are you able to improve the quality of the work, you are able to do it cheaper and do it faster. It rarely happens that you can address all three sides of that triangle at once.

Amanda Razani: So higher quality, cost savings and greater efficiency. And I would imagine this contributes to safety as well?

Dijam Panigrahi: In a big way. Just think of this: you are able to train people for any kind of situation. Take something like a reactive chemical leak; today you still have to train people in those environments. Now you can train them in an almost-real world without the risk of the real chemicals. So you’re not only training them in a much safer environment, you’re also giving them that visceral feeling, as if they’re actually doing the task, which leaves them much better prepared for such scenarios.

Amanda Razani: That’s wonderful. You mentioned AI a few times. How can companies incorporate AI and machine learning into this process?

Dijam Panigrahi: I know AI is everywhere right now, but the fact is that AI as a standalone isn’t all that meaningful. The whole idea is that AI is going to be everywhere: in each process, not just in the application but in the processes within that application itself. Let me talk about some of the use cases we are working on.
As I was saying, the first use case is the inspection work that happens during the manufacturing process. Suppose one of the largest aircraft manufacturers has the fuselage built by a third-party vendor. The fuselage comes to their facility, but they need to verify that everything is in place, the brackets, the fasteners and so on, exactly where they should be according to the design.
Today most of that inspection happens visually. You basically take the design, identify where the gaps are and put a marker there. Now, with tools like AR and VR, which give you a camera feed of what you’re seeing, you bring in AI at the backend to compare: this is my design in 3D, and this is what I actually got. Using AI, you can figure out the difference and whether there is any deviation from the ideal design. So that’s one use case: detecting anomalies and defects, moving from visual detection to automatic detection, which you can only do because you’re bringing AI into it.
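To make the design-versus-scan comparison described above concrete, here is a minimal Python sketch. It assumes the CAD design and the headset’s scan have already been reduced to 3D keypoints in a shared coordinate frame, keyed by part ID; the function name `find_defects`, the data layout and the 2 mm tolerance are illustrative assumptions, not GridRaster’s actual pipeline.

```python
import math

# Illustrative only: flag parts that are missing from the scan or displaced
# beyond a tolerance, given designed vs. detected 3D positions per part.
def find_defects(design_points, scanned_points, tolerance_mm=2.0):
    defects = []
    for part_id, designed in design_points.items():
        detected = scanned_points.get(part_id)
        if detected is None:
            defects.append((part_id, "missing", None))         # part not found in scan
            continue
        deviation = math.dist(designed, detected)               # Euclidean distance in mm
        if deviation > tolerance_mm:
            defects.append((part_id, "displaced", round(deviation, 2)))
    return defects

# Example: one bracket off by ~5 mm, one fastener missing entirely.
design = {"bracket_A": (100.0, 20.0, 5.0), "fastener_B": (150.0, 22.0, 5.0)}
scan = {"bracket_A": (104.8, 21.0, 5.0)}
print(find_defects(design, scan))
# [('bracket_A', 'displaced', 4.9), ('fastener_B', 'missing', None)]
```

In practice the hard work is upstream, in detecting and registering those keypoints from the camera feed; the comparison step itself can stay this simple.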
Then there is the training process. We have been working with the Department of Defense on many of these use cases. Say you are doing a mission rehearsal for a certain offensive or defensive mission. In a real mission environment it is never just you; you are working as part of a team, with a pilot, support crews and everybody else in the simulation. Beyond that, you have the opposing forces. You don’t really have those in the real world, but in the virtual world you can create those opposing forces with the AI agents we utilize to build those environments, train them, and adjust them based on changes in tactics or whatever you want to simulate. That’s another place where a lot of this AI is being utilized.
AI is also being used to do some of the predictive rendering. For example, what we do at GridRaster is stream the experiences onto the device agnostic of the device’s capacity. A lot of that is possible because, using AI, we can predict how the user is going to behave from the sensor data and do certain things ahead of time, which allows us to give a much better experience to the user. So as I was saying, AI is going to be everywhere across the board, right from capturing the data to processing the data to consuming it and delivering it to the user. That’s how AI is being utilized right now.
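The “predict ahead of time” idea Panigrahi mentions can be sketched very simply. The toy code below linearly extrapolates the user’s head yaw about 50 ms into the future so a remote renderer could have the frame ready before the device asks for it; the `PoseSample` structure, the single-axis model and the lookahead value are illustrative assumptions, not GridRaster’s prediction model.

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float      # timestamp in seconds
    yaw: float    # head yaw in degrees (position, pitch and roll omitted for brevity)

# Illustrative only: linearly extrapolate yaw `lookahead_s` seconds past the latest sample.
def predict_yaw(history, lookahead_s):
    prev, last = history[-2], history[-1]
    velocity = (last.yaw - prev.yaw) / (last.t - prev.t)   # degrees per second
    return last.yaw + velocity * lookahead_s

# Example: head turning at ~30 deg/s; predict ~50 ms ahead to hide round-trip latency.
samples = [PoseSample(0.00, 10.0), PoseSample(0.02, 10.6)]
print(round(predict_yaw(samples, lookahead_s=0.05), 2))    # 12.1
```

Real systems use richer motion models, but the principle is the same: render for where the user will be, not where they were.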

Amanda Razani: And it really does seem like a far-reaching tool that is being introduced into almost every industry. There seem to be so many use cases.

Dijam Panigrahi: Absolutely. It’s limitless. Wherever there is data, and we are getting more and more data, and there are insights available that you can take advantage of, AI is going to be fantastic.

Amanda Razani: So what advice do you have for companies that are just now trying to integrate any of these tools? Where is a good starting point for them?

Dijam Panigrahi: The biggest thing we always say to everybody is: start somewhere, and you’re going to see value. We have run into situations with an enthusiastic innovation person who says, “Okay, I want to do VR here, AR here, here, here. We want to bring it everywhere.” That usually becomes a trap, not because the technology can’t help, but because it isn’t the answer to every problem. The important thing is to identify one of your biggest problem areas, and within it select the specific component where, if you solve it, you get the maximum value, and double down on that. Prove the value on something low-hanging, where it’s easier to convince the larger group of people involved, including the leadership. Start there, start small, but be assured that there is value at the end of the day if you prove it. That’s the best starting point I can give anybody, and we repeat it across the board in most of our conversations.

Amanda Razani: From your experience, what are some of the frustrations or roadblocks during implementation, and what are your solutions?

Dijam Panigrahi: There are two aspects to it: one is technical, one is non-technical. The non-technical side is mostly about leadership support for the technology, which comes from understanding it, and we run into those challenges quite a bit. That’s where the first point applies: start where the value is greatest. You take a certain risk, but the returns are high enough that it’s worth the attempt. Once you get them on board with something of clear value, it becomes an easier conversation about something else that may have less value but is still valuable. So that’s one.
On the technical side, a lot of companies are struggling to scale, first in terms of the availability of content, and that’s why you see a big first push happening across the board around digital twins: recreating your physical environments in virtual space so that you can take advantage of the techniques and tools available to help with productivity and efficiency. So that’s one thing.
The other thing is the device side, which is not in our control. Ideally you want those devices to be easy to wear, something like a pair of glasses you put on, with the compute coming from elsewhere, so it just works. We have not yet reached the point in technology innovation where it works at that scale, but we are headed in that direction. As I was saying, with our capabilities we take away the dependence on the device because we do the heavy lifting in the cloud, which allows you to do very minimal work on the device. You don’t need much compute or battery on the device, so devices can become sleeker and sleeker, and that’s the direction all the development on the AR side is going.
Some of that we’re working on directionally. But having said that, even with the constraints that exist today, if you identify the right use cases there is huge value, and that’s what you need to navigate toward.
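As a purely illustrative sketch of the “heavy lifting in the cloud” split described above, the toy heuristic below decides per scene whether the headset renders locally or a remote server renders and streams frames. The thresholds, field names and classes (`DeviceProfile`, `SceneStats`) are invented for the example and are not based on GridRaster’s system.

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    gpu_tflops: float     # rough on-device GPU throughput
    battery_pct: float    # remaining battery, 0-100

@dataclass
class SceneStats:
    triangle_count: int   # complexity of the 3D content to render

# Illustrative only: pick where to render based on scene weight, device limits and network.
def choose_render_location(device, scene, network_rtt_ms):
    scene_is_heavy = scene.triangle_count > 5_000_000
    device_is_constrained = device.gpu_tflops < 2.0 or device.battery_pct < 20
    network_is_usable = network_rtt_ms < 40    # low enough to hide with prediction
    if (scene_is_heavy or device_is_constrained) and network_is_usable:
        return "remote"   # stream pre-rendered frames from the cloud
    return "local"        # render on the headset itself

# Example: a large digital twin on a lightweight headset over good Wi-Fi.
print(choose_render_location(DeviceProfile(1.2, 65), SceneStats(40_000_000), 18))  # remote
```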

Amanda Razani: With these technologies advancing so rapidly, especially AI, which is what we hear about a lot these days, where do you foresee the enterprise a year from now?

Dijam Panigrahi: I think the challenge within the enterprise is understanding how they need to use AI. The challenge is really around making those decisions, because we run into cases where they don’t want to use AI because of data privacy and the sensitivity of the data. There are a lot of bottlenecks that need leadership conviction, the will to accept that the benefit outweighs the possible risks, and obviously there is work happening on mitigating those risks. Having said that, you are looking at a time that is absolutely going to change how enterprises work as we go forward with robotics, AI, AR and VR. You’re going to see a beautiful convergence of all these technologies, and more and more of the repetitive tasks will get automated.
A lot of this contextual information has been almost overwhelming for users to piece together on their own. AI will be an amazing helper, like an assistant to you throughout the process: you focus on the things humans are good at and use AI as your assistant to help you accomplish them. We are heading in that direction, and you will see, maybe not right now, maybe not in one year, but perhaps five years down the line, that anything that happens in the real world will already have happened in the simulated world first. That’s how it is going to be.

Amanda Razani: Interesting. It’s very exciting to see how the future will unfold. If there is one key takeaway you want to leave our audience with today, what is it?

Dijam Panigrahi: The only thing I would say is: don’t resist the technology. Try it out before you make a judgment on anything. Start small, start assured, try it out, iterate and move forward. There is huge value, but the value may differ from one organization to another, and you can only know by trying it out in your own setup.

Amanda Razani: Thank you, Dijam, for coming on our show and sharing your insights with us.

Dijam Panigrahi: Thanks Amanda. The pleasure is all mine.