
NTLF Conversations: The Indian AI Startup Story

In this world of rapid technological evolution, sharing insights and collaborative learning are crucial to success. In this episode of NTLF Conversations, Debjani Ghosh (Ex-President of Nasscom), Abhinav Aggarwal (Co-founder of Fluid AI), and Kumar Rangarajan (Co-founder of Slang Labs) come together to discuss the nuances of building and scaling up an AI startup. Dive into the collaborative spirit of innovation and discover the shared practices that pave the way for groundbreaking achievements in AI.

Transcript Disclaimer: This transcript has been generated using automated tools and reviewed by a human. However, some errors may still be present. For complete accuracy, please refer to the original audio.


00:00:37 DEBJANI GHOSH: So today I am joined by two amazing founders, AI founders as they are called, to talk about their journey, some of the lessons they have learned along the way, and what founders who want to get into this space and build sustainable AI businesses can learn from them. So really looking forward to an excellent conversation. Thank you for joining me, Abhinav Aggarwal, Co-founder of Fluid AI, and Kumar Rangarajan, Co-founder of Slang Labs. Wonderful having you both. I will let you both introduce yourselves and talk a little bit about what you do. I'll start with you, Kumar: could you talk a little bit about Slang Labs and the big problem you are solving?

00:01:30 KUMAR RANGARAJAN: Thank you. Thank you again for inviting us for this talk. I'm Kumar, Co-founder at Slang Labs. What we are building is the world's first AI assistant as a service platform. We all saw the magical demos of AI assistants a few weeks ago, and how they could potentially change and transform the way things are done. But those are general-purpose AI assistants. As a business, you have your app: how do you bring that kind of experience into your app and make it available to your customers so that they can drive value out of it? So we are building that platform, an AI assistant as a service platform. We are working with a bunch of customers like Tata Neu Capital and Mama and Milo to bring in this experience. And now we are launching a platform called Magic Studio, which makes this AI assistant available to any application developer to add to their application.

00:02:21 DEBJANI GHOSH: Brilliant, brilliant.

00:02:22 KUMAR RANGARAJAN: You don't need to be a data scientist or an AI scientist. Whether you are an Android developer, iOS developer or web developer, you can now bring in this kind of magical experience. So that's what we're doing. It has to be very reliable; that's actually one of the biggest challenges. Everyone tries to experiment with AI, but nobody actually goes live because of reliability. How do you bring in reliability? That's the second part. The third part: typically when you think of AI in an application, you think of it as a chatbot, a customer care chatbot. But AI is not just customer care chat. The assistant is much more.

00:02:50 DEBJANI GHOSH: Say that a bit more loudly and clearly. I need people to understand that it has to.

00:02:55 KUMAR RANGARAJAN: Go beyond chatbots, right? How do you transform the experience, especially if you're focusing on consumer app experiences? How do you transform those things to be much more forgiving? That is what we are doing, and it's going to be launched soon on Magic Studio.

00:03:10 DEBJANI GHOSH: And what is CONVA AI?

00:03:12 KUMAR RANGARAJAN: So CONVA is the larger platform. CONVA AI is the platform that you build on and add into your application, and the product that we are building now to make CONVA AI available to any developer is this new thing called Magic Studio.

00:03:25 DEBJANI GHOSH: You know, I strongly believe that you cannot build these assistants and copilots, or whatever you're calling them, separately and then have the user figure out how to integrate them. I mean, that is a nightmare, right? These have to get built into the UI and the apps themselves. And that's a great learning, by the way. I will say that's learning number one for anyone who's thinking about starting off in this space. But that's brilliant. Abhinav, a little bit about Fluid AI and all the wonderful work you're doing.

00:04:03 ABHINAV AGGARWAL: The problem we are solving: essentially the biggest challenge most of these large organisations, and mid-size ones too, have is that they are very document intensive. There are large amounts of documentation and data lying everywhere, and just getting the right answers to their questions, and then automating a lot of these processes, is killing them. When they do ask a question and nobody knows the answer, asking 50 people, going all over the place, it's a disaster. And that's true for everyone, even for ourselves; the data, text data and other data, is crazy. So we built an enterprise GPT product at Fluid AI. You deploy it in an enterprise and you can use it for multiple use cases: it ingests all the documentation, connects up all the other systems, and then provides an AI assistant for your company. We have large manufacturing companies using it as an enterprise GPT for their plants and their manufacturing teams, one GPT to solve all my problems: how do I fix this machine, when did these machines go down last, all of that. Then you have banking, finance and insurance, where they're using it for shared services, operations improvement, automation of support, even for sales creation. And then you have pharma and life sciences, again with similar challenges that we're seeing across all of them. So how do you make the organisation more productive, more efficient, and just deliver more value from all the data that they have?
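
To make the "ingest all the documentation, then answer questions over it" pattern concrete, here is a minimal retrieval-augmented generation sketch added for illustration. It is not Fluid AI's product code; the model names are placeholders, and any OpenAI-compatible embedding and chat endpoint would work.

```python
# Minimal RAG sketch: embed document chunks, retrieve the closest ones,
# and answer the user's question grounded only in those chunks.
import numpy as np
from openai import OpenAI  # any OpenAI-compatible endpoint works

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def answer(question: str, chunks: list[str], top_k: int = 3) -> str:
    doc_vecs = embed(chunks)
    q_vec = embed([question])[0]
    # cosine similarity between the question and every chunk
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = "\n\n".join(chunks[i] for i in np.argsort(sims)[::-1][:top_k])
    chat = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; swap for whichever model you use
        messages=[
            {"role": "system",
             "content": "Answer only from the context below. If the answer is not there, say so.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

# Example: SOP snippets standing in for a plant's documentation
chunks = ["Machine M-12: reset by holding the amber button for 5 seconds.",
          "Downtime log: M-12 was last down on 14 March for belt replacement."]
print(answer("When did machine M-12 last go down?", chunks))
```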

00:05:30 DEBJANI GHOSH: Yeah, absolutely. And we'll talk a little bit more about some of the specific problems that you're solving. I think it's brilliant. You know, you both are way beyond the POC phase, right? You both have real customers and you are scaling to a pretty large extent with them. What are some of the challenges that you have faced, or are facing, as you move from POC, from prototype, to actual scale?

00:06:03 KUMAR RANGARAJAN: I think the next thing that happens is the challenge of moving from POC onwards, especially since a lot of customers now are interested in trying a POC.

00:06:11 DEBJANI GHOSH: Yeah, everybody wants a free POC, which has to stop. If you are from the industry, that has to stop.

00:06:21 KUMAR RANGARAJAN: Luckily, we do get customers willing to pay for it, so there is value. But even when they are willing to do the POCs, the thing that stops them is being able to go to the next phase. It's like, what does reliability look like? They're always worried, because when I'm doing it in my lab it's an ungrounded, unbounded sort of system. There is this fear of hallucination, fear of...

00:06:41 DEBJANI GHOSH: Reputational risk.

00:06:42 KUMAR RANGARAJAN: Like this. So all of that starts to come in. So that becomes a key factor for the things moving from POC to the next phase.

00:06:50 KUMAR RANGARAJAN: And so that's actually a factor right from the beginning: how do you make sure the system is reliable? How do you ensure there is a mechanism in place which is human augmented but not human dependent, and that it actually helps solve the problem? We have something called evals which helps with this, so those kinds of things become standard. The second thing is the cost. While the cost is dropping, it is still quite large. From a POC perspective it seems fine, but then they start thinking: I'm going to scale this up to millions of my customers; if everyone starts using it, what is the cost per usage of the system? That becomes a very prohibitive thought for people. So the way to tackle that is to look at it from an engineering perspective as well as from a business perspective, because of the ROI expected from it. I'll give you an example of one of the use cases we are working on. We have a horizontal platform, and our professional services team goes and solves verticalised problems on top of it. One of them is grocery e-commerce. If you look at a classical e-commerce application, whether it's an electronics e-commerce app or a grocery e-commerce app, the experience is exactly the same. But in a grocery app you are not buying one item; you're buying on average 17 to 20 items in your cart. With electronics, you buy one or two items, you spend a lot of time researching that one or two items, but you only add one or two. Yet the overall experience is exactly the same. So how do you bring this AI assistant experience on top of a grocery app? How do you solve it so customers are able to buy very quickly, add items and grow the cart size? Electronics is very different: there the customer actually wants a salesperson sort of experience in the app, because they don't know what they want; they're exploring, finding it out. So you bring that experience in. Getting businesses to realise the potential ROI becomes the second challenge, because there's a huge cost factor, and Indian businesses especially tend to be extremely cost conscious. It comes back to the same story: they want a guaranteed ROI proof up front, but for that they have to be willing to experiment. That experimentation mindset is a little slower, so things tend to take a longer time; they try it out to see the value, and that takes longer. So you need to be a little patient to handle that.
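
Kumar mentions evals as the mechanism that keeps the system reliable without a human in every loop. As a rough sketch added for illustration (not Slang Labs' actual tooling), an eval is just a fixed question set replayed against the assistant after every change, with a pass rate you can gate releases on:

```python
# Tiny eval harness: replay a fixed test set against the assistant and
# report a pass rate, so regressions show up before customers see them.
from typing import Callable

EVAL_SET = [
    # (question, a substring the answer must contain) -- illustrative cases
    ("What is the return window for groceries?", "24 hours"),
    ("Do you deliver on Sundays?", "yes"),
]

def run_evals(ask: Callable[[str], str]) -> float:
    passed = 0
    for question, must_contain in EVAL_SET:
        reply = ask(question)
        ok = must_contain.lower() in reply.lower()
        passed += ok
        print(f"{'PASS' if ok else 'FAIL'}: {question!r} -> {reply!r}")
    return passed / len(EVAL_SET)

# `ask` is whatever calls your deployed assistant; a stub is used here.
score = run_evals(lambda q: "Yes, we deliver on Sundays and accept returns within 24 hours.")
print(f"pass rate: {score:.0%}")  # gate a release on this number, e.g. >= 0.95
```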

00:09:26 DEBJANI GHOSH: Yeah. You work with a lot of global banks, right? So all the largest names you are working with, Is it any different?

00:09:35 ABHINAV AGGARWAL: No, it's exactly as Kumar says.

00:09:38 DEBJANI GHOSH: So it's not just Indian companies.

00:09:41 ABHINAV AGGARWAL: You know, from the largest automakers, in India and abroad, to pharma, to banking, I think there are two key fears. Everyone's afraid of failure and everyone's afraid of change, right? And larger organisations are the epitome of fear of failure and fear of change. And AI needs the opposite. It needs you to prototype rapidly, like he said. In fact, in the Stanford course, the biggest thing they are teaching right now is design thinking, which is to prototype rapidly. But can you prototype?

00:10:15 ABHINAV AGGARWAL: Orgs come in and say, OK, we'll prototype for like six months, and you're like, six months? A prototype should come together very fast. Some of them have their constraints, but if they can get over that fear of failure and just try things: try six things, three will work, a couple will be OK, another one will be really bad. I think that will really move things forward.

00:10:41 DEBJANI GHOSH: I think both of you are bringing up a very, very important point, which is, I don't know what to call it, but I guess it's sort of a change mindset, which really is needed when companies decide to start their transformation journey with AI or any other technology. What they have done in the past and how they have been successful in the past is not going to cut it. If anything, it's a guarantee they will fail if they follow those same processes where everything is tested, tried and 100% safe. That's a transformation that companies will have to go through, and startups are at the front end of it; you all are starting off with that mindset. And I think that is a big, big requirement. But let us talk about some of the other challenges. For a lot of the early stage founders I am speaking to, the biggest complaint is funds, right? Most of deep tech funding in India today is coming from government funding, these grants that the government is giving out. And while DST is giving out a few big grants, most of the other departments are giving out a lot of grants but in small amounts, 25 lakhs, 50 lakhs, etcetera. That is just not enough to build your AI solutions. So from your experience, is patient capital, or the lack of it, a real issue in India? And what is your advice to founders?

00:12:22 KUMAR RANGARAJAN: It is, unfortunately, because the underlying tech is fairly expensive. You know, when we started, we were always an AI company, but not today's GenAI; we were a pre-GenAI company. Old-school AI is what we started with. So we thought, OK, we will run our own models, we will figure out ways to optimise. But then you realise that is not the way, especially today with GenAI; that is not the way you're going to be able to scale. You have to do things differently.

00:12:48 KUMAR RANGARAJAN: So one of the hacks: of course you ought to look for funding, but you're always going to get into a cash crunch situation, so how do you raise funds? One thing I would recommend startups exploit is the credits from the cloud providers, and make yourself as cloud agnostic as possible. So you can use some Google credits, some Amazon credits, some from Microsoft, and keep switching between these three until you find your fit. Each of them gives you a fairly good amount. I think that will get you through that initial hump, because investors again expect you to have shown some proof of the pudding before they actually put in money, unless you have a reputation. So this is one thing that you could use to get there. And these cloud providers are looking for innovative use cases on their platforms, so I think they would also welcome it.

00:13:39 DEBJANI GHOSH: No, that's a great hack and we'll come back to it in a bit. Abhinav, your feedback on this, your inputs.

00:13:45 ABHINAV AGGARWAL: I think so. You know, we were a bit different; we were bootstrapped all the way over the last 12 years, so we never had capital to throw at the problem. But I think there are two reasons why you need capital in deep tech. One is the hardware, which is really expensive, and the other is the talent.

00:14:01 DEBJANI GHOSH: Right, both expensive.

00:14:02 ABHINAV AGGARWAL: Both are expensive so like you can play around with these constraints a bit. Like Kumar gave a great hack right to go to each of the three. Hope they don't find out.

00:14:13 DEBJANI GHOSH: I'm sure they know.

00:14:15 ABHINAV AGGARWAL: And they kind of take credit for it. I think that's where the government has a big role on the hardware side; companies, the big daddies, have that role too. If they can really enable that hardware and make it available at scale for startups, I think that could be very powerful. Talent can be solved innovatively; there is innovation in the way you train people, in doing smart hiring. But capital is more patient when it's strategic.

00:14:44 DEBJANI GHOSH: Amen. Yeah.

00:14:45 ABHINAV AGGARWAL: So I would say try to align more with the strategic folks, the ones looking at AI from a strategic perspective and a long-term vision, maybe bringing it into their own arms in some way. I mean, the others are also nice, right? They have to ramp up and go to market really fast, but they will definitely have their fund tenure; they need it to work. I think that's where we've seen that a strategic investor helps you be more patient, if your business is the kind which needs that patience.

00:15:13 DEBJANI GHOSH: And if you can figure out the problem that you're going to solve, and figure out how to communicate it effectively so that the other party understands, I completely agree with you. I think going deep into a vertical and really building for problems in that vertical is a better way to find the support that is needed than a generic solution which you usually take to VCs and investors and expect the money. But you know, on your point about the role of government: I am so glad that the government has now taken the call to come out with the IndiaAI Mission. It is a commitment that India has made, and one of the key things I hope, and am very confident, they will solve for is compute access and cost. There are different models they are exploring, but one of my key asks there is: while you are figuring out how to build democratized compute in the country, till then figure out a way to at least subsidize compute for the startups and academia that are actually doing the work of building a lot of the AI solutions. So I think we will see at least some support coming pretty soon. But I also think that in today's world of model perishability, you have to be model agnostic when you are building your solutions. So many founders that I speak to are completely lost; their life revolves around one model and they are building on that model, and what they are not thinking about is what happens next. Because for sure, whether it is six months or one year, the next model is going to come out, and it is not only going to be bigger and better, it is going to be cheaper, right? I am from the semiconductor industry, so I am very well aware of Moore's Law and how it plays out, and this is Moore's Law on steroids, what we are seeing here. So how do you build the ability, or a business, that is model agnostic and that, as Sam Altman said, actually bets on models getting better rather than not getting better? How are you doing this, Abhinav?

00:17:32 ABHINAV AGGARWAL: I think it goes back to the original thing, right? The minute you put the tech, the model and the solution, at the front end of your strategy, it kind of destroys your ability to think about the problem. You need to think: OK, I'm solving problem X, what is the best tech for it? And whether that's the best model, the best approach, multimodal or single model, all of that goes on the back burner. You almost say that when AI is nothing, that's when it's everything: when the AI goes to the back and it's there but invisible. I'm solving X, and whatever gets me to X faster is what I use. If it's GPT-4o, because my use case needed that voice and image ability, then use it. Or if it's Zephyr or some other solution, then use that. So that's the way we look at it: we solve for the problem and then we use whatever is best at the back, and that keeps us honest as well. It prevents the bias of getting married to a tech. If you start with a tech discussion, then even I would go, oh, this is much better than that, now let's bet on it.

00:18:47 DEBJANI GHOSH: You know, I spoke to you a few months back, and I think at that point you were building most of your solutions on Llama 2, if I'm not wrong. And then the moment Llama 3 came out, I think within a day you were the first one to call me to say, have you noticed the changes? That's how fast you had moved; you switched almost immediately. So Kumar, is that how you think about it too? Any hacks that you can leave founders with?

00:19:16 KUMAR RANGARAJAN: This just reminds me of a meme I saw: there's this bell curve of developers. The vast majority in the middle are optimizing for the model by writing different prompts, tweaking this and that to fix whatever problems. The two outlier ends are just waiting for a model upgrade: you don't need any more new prompts, the model upgrades and it gets solved. That's how things are operating now, so you have to be cognizant of that. There are challenges that come with it. For example, when we started, we were initially thinking we would base it on Llama 2, fine-tune on top of Llama 2 and then release. That's a bad idea; don't do it. Instead, pick the best, easiest one to move forward with. Pick OpenAI's GPT, build your solutions out of it, but stay absolutely agnostic to it; don't get married to it. Once you reach a certain scale, then you want to fine-tune your own model, because by then you have a lot of data. You can take whatever is the latest, best open-source model and fine-tune on top of that. And take one step back: in the earlier, pre-GenAI era, models were very application specific, very localized, so the ability to move from one model to another was very hard. But in the GenAI era, what has made it magical is that everyone is working on the exact same architecture. So your application has become, it's like the X86 chip now.

00:21:02 KUMAR RANGARAJAN: We used to have application-specific chips; now it's all become X86 chips. So now we can just wait for the 286, the 386, the Pentiums, the Xeons to keep coming, and your application just becomes faster and faster. If you start abstracting it like this, it becomes much better: move to AMD, move to Intel, it doesn't matter, as long as you are building on the X86 architecture. And then of course RISC comes in, and that changes the whole thing.
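
One concrete way to get the "X86-style" abstraction Kumar describes is to code against a single chat interface and treat the model and endpoint as configuration. The sketch below is an illustration with placeholder model names and URLs; it relies on the fact that most open-model servers (vLLM, Ollama and similar) expose an OpenAI-compatible API, so swapping a hosted GPT for a self-hosted Llama becomes a config change rather than a rewrite.

```python
# Provider-agnostic chat wrapper: the application talks to `ChatModel`,
# and the concrete model/endpoint is injected from configuration.
from dataclasses import dataclass
from openai import OpenAI

@dataclass
class ChatModel:
    model: str
    base_url: str | None = None        # None -> the default hosted API
    api_key: str = "not-needed-for-local"

    def complete(self, system: str, user: str) -> str:
        kwargs = {"base_url": self.base_url, "api_key": self.api_key} if self.base_url else {}
        client = OpenAI(**kwargs)
        resp = client.chat.completions.create(
            model=self.model,
            messages=[{"role": "system", "content": system},
                      {"role": "user", "content": user}],
        )
        return resp.choices[0].message.content

# Prototype on a hosted closed model...
hosted = ChatModel(model="gpt-4o-mini")
# ...then flip to a self-hosted open model behind an OpenAI-compatible server
# (for example vLLM or Ollama) by changing configuration only.
local = ChatModel(model="llama-3-8b-instruct", base_url="http://localhost:8000/v1")

for backend in (hosted,):  # add `local` here once that server is running
    print(backend.complete("You are a concise assistant.", "Say hello in one word."))
```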

00:21:24 DEBJANI GHOSH: Brilliant, brilliant. And I think your advice there is a very good one: prototype on closed source as much as you can, and then figure out how to scale using open source. Do you agree with that?

00:21:42 ABHINAV AGGARWAL: Well, I mean, it depends. If you are looking at testing something at very, very low volumes, then I think it's definitely the right playbook. But sometimes I feel the behaviour of these models is a bit different, so I'll be a little contrarian to that. Every team gets used to the open ones, right? There's a lot of science behind them too, when you fine-tune them and you host them and you quantize them.

00:22:11 KUMAR RANGARAJAN: You get the most performance out.

00:22:12 ABHINAV AGGARWAL: That is kind of abstracted away with OpenAI. So if, for the problem you're solving, the LLM is a component but not the most critical one, then I think that's the right approach. But sometimes the LLM is very, very important, so you might as well go that route from the start.

00:22:31 DEBJANI GHOSH: So the basic message is: be agnostic, right? Don't be wedded. Models are perishable, and I think it's very important that founders understand there's no debate about that. And how you do it, there are alternative ways, which is great; you have to figure out what's best for you based on the problem you're trying to solve. Yeah, please, go ahead.

00:22:58 KUMAR RANGARAJAN: Because when you start with an open-source model, and that was our engineering mindset too: OK, let's start with an open model, fine-tune our own solution and then start using it. But then you need to do the math, because now you need your own GPU, you need to host it, and the GPU cost per transaction only becomes comparable to a closed hosted model when you hit something like a million-plus queries. If you are confident of hitting that scale very, very early, then it makes sense, because you know the scale is coming; it's just a matter of a week or two or a month, so you had better be prepared. But if you don't know, if you are also experimenting on that part, then it's best to leverage the advantage that OpenAI is clearly subsidizing the cost. They are heavily subsidizing it, because their actual cost to execute a particular transaction is much more than what they charge.

00:24:00 DEBJANI GHOSH: And I thought the transactional cost was cheaper for open source. Am I wrong?

00:24:04 KUMAR RANGARAJAN: Look, the transactional cost is cheaper for open models only after a certain scale. For example, the starting point is your own GPU. If you go and run your own GPU, the cost of running it gets to a few thousand dollars per month, and you have to take that hit. It only pays off when you cross a certain number of transactions; the number of queries you make has to cross that minimum. So we did the initial math early on. From an engineering point of view it is very interesting, so we built it, and we said we'll have it ready, but we'll flip to it when the data starts to come in.
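
Kumar's "do the math" point can be made concrete with back-of-the-envelope arithmetic. The numbers below are illustrative assumptions, not quoted prices; plug in your own GPU rental and API rates.

```python
# Back-of-the-envelope breakeven: dedicated GPU hosting vs pay-per-token API.
gpu_cost_per_month = 2500.0        # USD, dedicated GPU server (illustrative)
tokens_per_query   = 1500          # prompt + completion, rough average
api_price_per_1k   = 0.0015        # USD per 1K tokens (illustrative)

api_cost_per_query = tokens_per_query / 1000 * api_price_per_1k
breakeven_queries  = gpu_cost_per_month / api_cost_per_query

print(f"API cost per query : ${api_cost_per_query:.5f}")
print(f"Breakeven          : ~{breakeven_queries:,.0f} queries/month")
# Below the breakeven volume the API is cheaper; well above it,
# self-hosting (plus the engineering effort) starts to pay off.
```

With these placeholder numbers the breakeven lands around a million queries a month, which is the order of magnitude Kumar mentions.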

00:24:39 ABHINAV AGGARWAL: There's an approach to that. Basically, if you run GPU containers on the cloud and you turn them on and off, it gets very cheap. Essentially what we found with a lot of our teams is that they were running these eval runs, right? Like you have a 1,000-question set and you want to test it. You run it 20 times a day, 100 times a day; a small change to the algorithm and then you run the whole eval. So you just spin up a GPU container, and even if it's batching a bit, it's OK. So the cost actually...

00:25:12 KUMAR RANGARAJAN: That works out completely from a training and eval perspective; you don't really need the GPUs sitting there. But even beyond evals, if you want inference, you're talking about real-time inference, query by query.

00:25:22 ABHINAV AGGARWAL: Nowadays they let you host the model but you pay per GPU-second. So you have essentially abstracted away the need to get a dedicated GPU, and because you are paying per GPU-second, the margin of OpenAI gets wiped out.
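
The same arithmetic shows why on-demand, per-second GPU billing suits bursty workloads like eval runs. Again, every figure here is an illustrative assumption:

```python
# Bursty eval run on an on-demand GPU billed per second vs the same run via an API.
questions          = 1000
seconds_per_answer = 0.5            # batched throughput on one GPU (illustrative)
gpu_price_per_hour = 2.0            # USD for an on-demand GPU (illustrative)

run_seconds  = questions * seconds_per_answer
gpu_run_cost = run_seconds / 3600 * gpu_price_per_hour

tokens_per_answer = 800
api_price_per_1k  = 0.0015          # USD per 1K tokens (illustrative)
api_run_cost      = questions * tokens_per_answer / 1000 * api_price_per_1k

print(f"On-demand GPU run : ${gpu_run_cost:.2f} for {run_seconds/60:.0f} min")
print(f"API run           : ${api_run_cost:.2f}")
# Because the GPU is released the moment the run ends, you pay for minutes
# of compute, not for an idle box all month.
```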

00:25:40 DEBJANI GHOSH: This is turning into a masterclass. This is fantastic.

00:25:47 KUMAR RANGARAJAN: It isn't costing much for us, but I'm curious, from your side, what are the tricks?

00:25:51 ABHINAV AGGARWAL: I think both work. For us, actually, the underlying model, tuning it or playing around with it for our use cases, matters a lot; it depends on what you are trying to solve. Your approach is correct too, for all that you get out of it.

00:26:02 DEBJANI GHOSH: Since we are talking about cost, let's also talk about where you make money. A lot of VCs today are saying models have absolutely no monetization value, perishability etcetera, and that the value lies in applications, not the infrastructure. What do you believe, and where do you see the value?

00:26:29 KUMAR RANGARAJAN: In my mind, it's a combination. When you look at infrastructure, there are multiple layers to it. The lowest is the model, and still, I think the number one AI money-making company today is at that lowest layer: OpenAI is the one that makes the most money. The other people making money are the consultants who go to companies and say, OK, here's how to use it. But all these SaaS companies in the middle are actually struggling and finding it hard, because you have to actually work with the customer, and the customer wants a free POC; how do you handle it when the underlying guys don't give it to you for free? So that is where value sits. Above the lowest-level model, the next level on the infrastructure side is the things customers want. The challenge today is: how do you make it faster? What are the things you can do to make it faster? Simple things like, for example, a cache. That's not something that's ever going to come from the model layer, and it's not the traditional Redis-style cache that you can use today; it's a semantic cache, so you are now building a semantic cache layer. The second thing is a reliability layer, because people want to make sure there's no hallucination. How do you ensure reliability? How do you ensure privacy? Those things again won't be handled by the model, because the model, whatever you say, is never going to solve that problem completely. So these infrastructure layers are still going to be very valuable, and they cut across use cases. The challenge is that a lot of people are going to be playing there.
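
A semantic cache differs from a traditional key-value (Redis-style) cache in that it matches on meaning rather than on exact strings: embed the incoming query, compare it with embeddings of previously answered queries, and reuse the stored answer when similarity crosses a threshold. Here is a minimal sketch added for illustration, with the embedding function injected so any provider can back it:

```python
# Minimal semantic cache: reuse an earlier answer when a new query is
# close enough in embedding space, otherwise call the model and store it.
import numpy as np
from typing import Callable

class SemanticCache:
    def __init__(self, embed: Callable[[str], np.ndarray], threshold: float = 0.92):
        self.embed, self.threshold = embed, threshold
        self.entries: list[tuple[np.ndarray, str]] = []   # (query vector, answer)

    def lookup(self, query: str) -> str | None:
        if not self.entries:
            return None
        q = self.embed(query)
        vecs = np.stack([v for v, _ in self.entries])
        sims = vecs @ q / (np.linalg.norm(vecs, axis=1) * np.linalg.norm(q))
        best = int(np.argmax(sims))
        return self.entries[best][1] if sims[best] >= self.threshold else None

    def store(self, query: str, answer: str) -> None:
        self.entries.append((self.embed(query), answer))

def answer_with_cache(query: str, cache: SemanticCache, call_model: Callable[[str], str]) -> str:
    cached = cache.lookup(query)
    if cached is not None:
        return cached                       # no model call, no token cost
    fresh = call_model(query)
    cache.store(query, fresh)
    return fresh
```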

00:27:54 DEBJANI GHOSH: But if you are a VC, where will you put your money today?

00:27:57 KUMAR RANGARAJAN: So I think I would put my money, for example, into someone solving a hard infrastructural problem in a way that is not trivial, that is...

00:28:10 DEBJANI GHOSH: Which is rare to find. Which is very rare to find. Yeah. So you are able to...

00:28:16 KUMAR RANGARAJAN: ...do something interesting in that space that is applicable to a lot of people and that is not going to get taken away quickly by the OpenAIs of the world and their models. That is a shot worth taking, and that's what makes it interesting. It's like chip design: the TSMCs of the world, the ASMLs of the world are going to be few and far between, but they have the highest...

00:28:35 DEBJANI GHOSH: Be able to, yeah, yeah, they control the value chain.

00:28:38 KUMAR RANGARAJAN: Value chain, yeah. The vast majority is going to play in the application space; there are thousands and millions of use cases that you can now play with. For example, within the e-commerce assistant space alone there are different verticals, and then there are different domains, like the banking sector, the financial sector. So there are lots of applications that people can go after and start to solve, and for sure that's where the bigger use cases are. But from an engineering point of view, I think that if you can solve creative, interesting stuff at the horizontal layer, that is where the kick and the bigger challenge is.

00:29:11 DEBJANI GHOSH: OK, he is speaking from the engineering worldview.

00:29:16 KUMAR RANGARAJAN: The biggest value creation. It is a hard value creation. 

00:29:24 DEBJANI GHOSH: No, I completely hear you and I agree with you. In fact, there is a very interesting data point I stumbled upon recently. The top three cloud providers collectively have a value of around 2.1 trillion or something, right? Collectively. And the top 100 companies built on the cloud, like the Netflixes of the world, collectively have the same value. But there it is controlled by three, and here it is spread across 100, right? But again, for startup founders: should they be thinking about building their own models today? Is that a good use of their investors' already very limited resources and their time, or should they be focusing on applications?

00:30:08 ABHINAV AGGARWAL: I think model perishability has been established. Falcon came out with big fanfare and lasted like 14 days, and then Databricks came out with their model, which lasted like four days because Llama 3 came out right after and was the best, and everyone moved on. The open-source world moves in a very cutthroat way, right? Everyone just migrates to the best, so anyone building a large language model has a tough time, because monetizing it is hard and the space is moving too fast, unless they really lead with that model.

00:30:42 DEBJANI GHOSH: Or they find a difference. Yeah, they find some major differentiation, so.

00:30:46 ABHINAV AGGARWAL: And they have to hope as well, right? If I'm building a large language model, let's say for Indian languages, which quite a few players are, that disruption is around the corner where some bigger player does the same thing. So that's why it's a little tricky: high risk, maybe worth pursuing, but it's like going to the casino, to the roulette table.

00:31:11 DEBJANI GHOSH: Yeah, but you know, I will offer a slightly different perspective there. I think India does need a language model, absolutely. Because just like cloud, where we didn't invest at the right time and today all of us sit and say, oh my God, India does not have a cloud player and all our data is going somewhere else, I think not ten years but five years down the line, if we don't do it now, we will look back and say, oh my God, we are playing at the bottom end of the value chain and we just don't have any control over it because we didn't build our own model. But I do believe that is something the government should get involved in, with academia, to build that model, and I am not so sure if companies and startups should do that; maybe participate. It could be a Team India kind of effort. I could be wrong; that is just my personal point of view. But I honestly think that for founders, the space that excites me the most is playing with a company's data and connecting it to the model to figure out how you solve a lot of their workflow automation challenges, and I want you to talk a little about your work there. I think that's where the immediate value is. We recently did a workshop for all the public sector banks, and a lot of companies and startups had been invited to showcase what they were creating. Abhinav did a demo which I think wowed everybody, all the CEOs and CTOs who attended. And it wasn't about jazzy stuff, it wasn't about videos; it was very simple workflow automation, finding the information they don't know where to go find. Talk a bit about that, the problem you're solving and what you're doing with the banks.

00:33:11 ABHINAV AGGARWAL: Just on the previous point about the model: my personal opinion is that it gets solved if India takes an open-source approach, like what Meta did successfully. We can build it and we have the talent. But one company funding it, running it, and being agile enough to build it, I feel that's challenging. But if India comes together as an open-source community and says we'll build it for India...

00:33:35 DEBJANI GHOSH: Then Yep. Yep, that open source community is key. Absolutely.

00:33:40 ABHINAV AGGARWAL: But you're right. That's a challenge even lower down the stack: when you go closer to the core tech, it gets disrupted very fast, and you need to be sure about what you're doing. I think the value capture is in delivering ROI. If you can save a bank $5,000,000 and say, OK, now you give me one million of that, that's very easy value capture.

00:34:01 DEBJANI GHOSH: Can you share what you had created there?

00:34:05 ABHINAV AGGARWAL: We've done a few of these. Take a banking automation use case, right? We built an internal GPT for the entire bank.

00:34:14 DEBJANI GHOSH: Using their data.

00:34:16 ABHINAV AGGARWAL: Right. And they didn't just take their text data. They had all the SOPs, processes, product information, etcetera, and they put all of that into an internal GPT. Then they connected all their systems as well: they had data lying in Snowflake, they had SQL databases, they had an ERP, and they connected the AI to that. And then suddenly someone could just go into the system and say, how do I do X, and it would give the answer. Then they moved on to: OK, when did the last three instances of X happen? What are our top three customers by balance? Etcetera. And suddenly the data is starting to flow. Then they went a level further and said, can we actually take action, say do this update in X? And it would have those checks and balances built in. I think that brought them efficiency: that team of, I think, about 500 people became 40% more efficient overnight. Then you multiply that over 365 days, the amount of saving they were able to bring in through the process automation. Their turnaround time on customer request queries went down by 40%, and their time to operationally complete, say, a KYC onboarding or a loan origination, went down by 35%. So with just a simple tweak in the system, the whole thing became far more efficient. I think the value on the application side and the use case side is just really big, and that's where a lot of startups should play.
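
The "take action, but with checks and balances" step is essentially tool calling with an approval gate on anything that writes. The sketch below illustrates the pattern only; it is not Fluid AI's system, and the tool names and approval rule are invented for the example.

```python
# Action layer sketch: read-only tools run freely, write tools need approval.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    run: Callable[..., str]
    writes: bool = False          # anything that changes state needs a sign-off

TOOLS = {
    "get_customer_balance": Tool("get_customer_balance",
                                 lambda cust_id: f"Balance for {cust_id}: 1,20,000 INR"),
    "update_kyc_status":    Tool("update_kyc_status",
                                 lambda cust_id, status: f"KYC for {cust_id} set to {status}",
                                 writes=True),
}

def human_approves(tool: Tool, kwargs: dict) -> bool:
    # In production this would be a maker-checker queue; here it is a prompt.
    return input(f"Approve {tool.name}({kwargs})? [y/N] ").strip().lower() == "y"

def execute(tool_name: str, **kwargs) -> str:
    tool = TOOLS[tool_name]
    if tool.writes and not human_approves(tool, kwargs):
        return f"{tool.name} blocked: approval denied"
    return tool.run(**kwargs)

print(execute("get_customer_balance", cust_id="C-1042"))                    # runs directly
print(execute("update_kyc_status", cust_id="C-1042", status="verified"))    # gated
```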

00:35:44 DEBJANI GHOSH: I honestly believe that the startups that are still early stage and figuring it out should focus on this space, but instead of being generic, pick verticals and really go deep. It's about how deep you can go and what kind of problem you can solve. That's where you build your so-called moat; that's where you build defensible solutions. But I am very worried about the ones that are focusing on these generic wrappers. I know wrapper is a bad word, but I do not know how else to describe it. I do not know how they are going to defend their solutions, because with what we are seeing OpenAI and others do, you are not going to have a need for a lot of these. I could go on talking about these topics with you for hours, but I do want to touch on a few other things very quickly. Especially in this fast-changing world of AI, keeping up with compliance and regulation becomes extremely important. You talked about how industry has to build the change mindset, but even startups have to build that culture of change, the culture of fail fast. Our education system doesn't prepare us to fail at all, forget about fast or slow. So how much time as founders or Co-founders are you putting on these priorities: building the organization, building the culture, compliance, governance, which has honestly become a bad word associated with Indian startups, sadly? What priority do you give to these things?

00:37:30 KUMAR RANGARAJAN: Whether you're in a startup or not in a startup, I think this.

00:37:32 DEBJANI GHOSH: Is. Yeah, yeah, it's universal. Yeah, like.

00:37:34 KUMAR RANGARAJAN: As a founder, you end up not just focusing on your product; there are so many other things you have to split your time across. That part is always there. When it comes to AI specifically, additional things start to come in. One of them is this constant flux happening around the ecosystem, which creates a lot of FOMO: OK, you have been doing something on top of this particular model, now something else has come out, should we move, and how fast? Improvements keep happening, somebody else tries to build something similar, OpenAI builds a new feature, and there is this fear that it is going to completely disrupt everybody. A lot of that starts to come in, and it starts to hurt morale in the company. For example, when GPTs were launched last year, there was this talk that it was going to kill all the chatbot companies, but it did nothing of the sort. So you have to be able to take that in and understand what separates the hype from the reality. That is the additional layer that comes in these days. But in general, on compliance and team morale and team matters: as a first-time founder, the bulk of my time with a small team was all in tech and building and customers. Now, as a second-time founder, I realised my time is more about people. They know what to do; just let them do their job. My job is to take all the fluff work away from them, tell them what the right thing is, and then they just go execute, and that keeps the product in focus. I don't know if I'm making it clear, but essentially I now don't get too involved in the day-to-day stuff, because there are a lot of other things.

00:39:19 DEBJANI GHOSH: Have different roles. Have clear roles for.

00:39:23 KUMAR RANGARAJAN: Clear roles, yeah, and then let people execute on them. And one thing that we did, which I think founders should typically do: for example, a lot of them don't get good advisors with them from the start. You really should get somebody you can trust. A lot of times, same as with lawyers, for example, you need someone you can listen to and trust, not someone who tells you what you want to hear just because you're paying him.

00:39:57 DEBJANI GHOSH: He should not be saying things just to please you.

00:40:01 KUMAR RANGARAJAN: From the lawyer perspective: at my previous company, which got acquired by Facebook, we actually did not go with a lawyer. We went with a consultant, someone who was a legal consultant, not a lawyer by profession, but had worked on tons of these deals, because we knew that person was not simply going to play to my side. He was going to be very objective, very fair, and tell me what the right thing to do was. So you have to find a person you can trust. The lawyer is still always on record for the formal things, but...

00:40:28 DEBJANI GHOSH: The role of that kind of advisor is so important, and an independent advisor who has a voice has to be able to hold their ground. Abhinav, you and Raghav are very different individuals, right? He is very, very different and you are very different. First of all, how important is it to have a Co-founder today, and how important is it to have a Co-founder with different skills and a different attitude or way of thinking than you?

00:40:58 ABHINAV AGGARWAL: It's super important. I think it helps. It gives you that different perspective.

00:41:03 DEBJANI GHOSH: You have three other Co-founders, right?

00:41:06 ABHINAV AGGARWAL: I think, for example, in our initial days, and this is a long time back, six or seven years ago, since you brought up regulation and compliance, I would always see it as the thing that slows

you down: oh God, we're building amazing tech, and now what about regulation and compliance, right? But it's a very interesting thing. If you look at the brake in a car, it's actually not the component that slows you down; it's the component that allows you to drive really fast. You can only go at 200 kilometres per hour because you have a very reliable brake. So I think having a Co-founder who gives you that perspective sometimes helps. And for us, we now see regulation and compliance as that brake. It allows us to go really fast with enterprises.

00:41:52 DEBJANI GHOSH: Provided you comply.

00:41:54 ABHINAV AGGARWAL: Because once you have your ducks in a row, they appreciate it, and they need the same thing. You bring those two together when you're doing an enterprise solution and there's a great value differentiator there. It just allows you to deliver value at scale without getting worried about all those nuts and bolts.

00:42:10 DEBJANI GHOSH: Yep, Yeah, I love it. I think so. He comes up with good ones.

00:42:20 KUMAR RANGARAJAN: Compliance requirements are a speed breaker, but you being compliant is like...

00:42:25 DEBJANI GHOSH: Yeah, that mindset, that thinking, is so important for founders. I think that is brilliant. And from what I know of his Co-founder: if there's a really tall mountain, Abhinav will be the one saying, let's jump, this looks really exciting, and Raghav will be the one to say, let us think about it twice. And I think that is so important, to have somebody like that. You should definitely be able to work together, but you should not be agreeing on everything. The next two questions are rapid fire, so quick answers, because we are totally out of time, but this has been a fantastic conversation. We talk about Viksit Bharat, where India by 2047 becomes a mature economy, and fingers crossed, I really believe it is something we have to do. The role of innovation, of deep tech innovation, is going to be critical; I believe it is the make or break. It has to happen, and we still have a very long way to go when it comes to true innovation, true R&D and true deep tech. If you could ask for one or two things to be changed that would enable and accelerate that growth, what would they be?

00:43:44 ABHINAV AGGARWAL: I think a fail-fast culture in India; we lack that, both in large and small organizations. And access to great hardware.

00:43:59 DEBJANI GHOSH: Hardware infrastructure is so important. Agree, agree. Kumar?

00:44:03 KUMAR RANGARAJAN: I think I can take an example from your chip industry. TSMC, to my mind, is a good example of how Taiwan, a small country, became the singular entry point into all of...

00:44:15 DEBJANI GHOSH: Brilliant one. Yeah, yeah.

00:44:17 KUMAR RANGARAJAN: Essentially, it's a belief in the system: I want to build these amazingly world-class systems, and we need to invest in that. That again comes back to what we were saying: the infrastructure that everyone uses is what makes you truly critical, the key chip in the whole flow. But it needs the belief that you can actually be the leader, that you can actually win that battle, having the mindset to say we can take on hard, complex problems and actually do them better than anybody else in the world. I think this is something that we as a country typically don't lean into; we are like, OK...

00:44:54 DEBJANI GHOSH: And we have to commit to it.

00:44:56 KUMAR RANGARAJAN: What are the easy things we can do, the things we can do quickly to make money? That's great, that's important, it's not that it isn't. But if everyone is focusing only on that, then we will never become that leader.

00:45:07 DEBJANI GHOSH: It is not going to get us to the dream. Yeah.

00:45:10 KUMAR RANGARAJAN: We will be sustainable and everything, but not get to that next thing. For that, I think one part is a mindset which has to come from us, and also from a governmental perspective, encouraging that risk and the investor community to do this. Another thing is to encourage talent. Talent is going to be super critical to make this happen.

00:45:30 DEBJANI GHOSH: Absolutely.

00:45:31 KUMAR RANGARAJAN: A lot of the things being built out there are actually built by people like us. Now, how do you encourage them to come and do it here? Which is exactly what TSMC did: they brought the people from Texas Instruments and Intel to come and work in Taiwan. How do you get that to happen? How do you build that ecosystem and...

00:45:48 DEBJANI GHOSH: Become a magnet for the best talent in the world.

00:45:51 KUMAR RANGARAJAN: From everywhere.

00:45:53 DEBJANI GHOSH: No, I agree. Agree, but the best talent in the world is mostly Indian, which is a benefit.

00:45:59 KUMAR RANGARAJAN: Yeah, but let's not limit it to that.

00:46:02 DEBJANI GHOSH: Yeah, exactly. No, I love that. I think both of you have shared excellent asks. Last question: what is the next big thing on the horizon that excites you?

00:46:14 ABHINAV AGGARWAL: I think the potential of hitting AGI, artificial general intelligence, where these models are just like thinking machines and you can't tell whether it is a human. The day that happens, I think the world is going to change overnight again. It's already...

00:46:35 DEBJANI GHOSH: Yeah, you know, the concept of AGI and what it can do and the problems it can solve tremendously excites me. What scares me is AGI being controlled by one or two companies, and unless we figure out a way to build equity and really democratize access to AGI, it scares the life out of me. What about you? What excites you the most?

00:47:08 KUMAR RANGARAJAN: For me, I think it's: how do you get my parents to be excited about AI, which is already happening, right? A lot of times it's the enterprise which starts to adopt first, and in my mind that's one class: it's great productivity, it's a good money-making set of use cases. But how do you democratize it? How do you reach the common man? The Internet was great, but as long as it remained with corporates it was just useful; once it went out to the common man, it made a whole lot of difference. Mobile phones, same thing: it was not just communication for a few, like the sat phones of earlier, but something the person on the road with the cheapest phone can now use. I think the same thing will start to happen here, more and more consumerization of this tech, and the tech is now getting ready. Hopefully companies like us will make it more accessible for more and more people, to bring AI into your day-to-day stuff and make your day-to-day life so much easier. Practically speaking, I was recently on a trip to Germany, and we were having discussions, and think of what we used to do earlier: Google changed a lot of the way conversations go; whenever there's a discussion, you go to Google, research, and come back. But now ChatGPT has become my go-to for answers to complex things. There was some signboard we saw with a crossed-out sign, and we were wondering what that crossed-out thing was. We just asked, and it gave a proper answer brilliantly. So that consumerization, day-to-day use cases, is now becoming possible with AI. That is what excites me, and the tech is really, really catching up with it. Voice becomes a critical form in which you ought to be able to interact, because...

Multilingual voice, and the most interesting thing that this demo showed was interruptible voice. We were always a voice assistant company, since I come from that background. What used to be boring about voice assistants before was that the moment the assistant started speaking, you could not make it stop; you had to wait, or you had to say stop and do all those things. But now, like when you are having a conversation, you can just cut in. That kind of natural interface.

00:49:22 DEBJANI GHOSH: Yeah. That was the most exciting thing about the GPT-4o demos, right? Being able to control that voice and tell it to speak louder or speak emotionally while it is speaking. That to me was the big aha. But this has been absolutely fabulous. Thank you both, brilliant conversation as always. I hope you all enjoyed it and that there were some good learnings for you, and we will continue to bring you some of the movers and shakers of technology in India and continue to have these conversations. Thank you.

00:49:56 KUMAR RANGARAJAN: Hey, thank you.

00:49:57 ABHINAV AGGARWAL: Thank you. Bye.

00:49:59 KUMAR RANGARAJAN: For more content on tech and leadership, subscribe to the Nasscom YouTube channel and press the bell icon to never miss an update.
