In the dynamic world of mobile applications, harnessing the power of data science is crucial for optimizing testing processes and delivering exceptional user experiences. This on-demand webinar, presented by HeadSpin, will delve into the transformative impact of data science on mobile testing, showcasing how enterprises can leverage advanced methodologies and tools to enhance efficiency and application performance.

Join us as we learn about HeadSpin's innovative approach to testing, which goes beyond traditional pass/fail metrics to provide multi-dimensional insights through its integrated AI algorithms and ML models. Discover the importance of recording the right key performance indicators (KPIs) and how they can significantly enhance testing efficiency and effectiveness.

To learn more about HeadSpin, visit: https://www.headspin.io/
Transcript
00:00Good morning, good evening to you.
00:02We are very excited, Vishal and I, to talk about a couple of things
00:07that we have been doing with a lot of customers here.
00:11We have been seeing some trends happening.
00:12We have been seeing change in the requirements from a QA perspective
00:16and how the shape of the industry has been transforming,
00:21how the industry has been demanding the QA to be more data-oriented,
00:27how the things have evolved from the dot-com side to having more mobile apps
00:32and then LTE and 5G coming in.
00:34And then now we see the whole boom of mobile technologies.
00:38We have been seeing great, great trends and usage of mobile.
00:41And with mobile, even if you're a retail company or if you're a medical company,
00:48if you're a company which is dealing in manufacturing
00:50and you have to have an application,
00:53the benchmark of an application performance is compared directly with the digital natives.
00:59So you're not competing against your competitors in the segment,
01:02you're competing with the Metas and the Googles of the world.
01:06And hence, it was very important to move away from the subjective way of evaluating experience.
01:17We wanted to go more objective, setting up the right KPIs.
01:20We wanted to talk about how you can really measure experience
01:24and then evolve that over the period of time.
01:28So without wasting a lot of time, I would just want to introduce the panelists.
01:34Hi, my name is Kishan.
01:35I'm a global partner lead at HeadSpin.
01:37I've been doing this job for about two and a half years.
01:39And we have been having a very, very amazing partnership,
01:45a phenomenal partnership with all the technology companies,
01:50with a lot of service integrator companies.
01:52And one of the strategic partnerships that I have been running
01:56for about two and a half, three years now is Capgemini.
01:59It happens to be very special because I have spent a lot of time in Capgemini myself.
02:04I have been part of the Capgemini family.
02:06And today I'm speaking with Vishal Arora,
02:09who is a senior director who is based in Los Angeles, California.
02:15He brings more than 25 years of experience,
02:21working with the biggest logos, with the biggest companies,
02:25with the likes of Accenture, with the likes of Microsoft in different roles.
02:30He has been a large contributor in digital transformation and quality assurance.
02:38He has been one of the leaders to drive innovation
02:41for a lot of utility customers as well.
02:43And Vishal and I have happened to work on a couple of customers as well,
02:48where we have helped or used HeadSpin as a technology
02:52with Capgemini's expertise to solve a couple of challenges for them.
02:57And we'll be keen on talking about a couple of these pointers along with all today.
03:03And we'll be talking about some of the global norms.
03:06We'll be talking about a couple of industry trends.
03:10How is it shaping up?
03:11And then if you have any questions, we'll be happy to answer.
03:14But before we jump on to the topics of the day,
03:16Vishal, welcome to the webinar.
03:19What do you want to...
03:20I know I've been introducing you like this to everyone,
03:24but would you like to introduce yourself again?
03:27Thank you, Kishan.
03:28Thank you for inviting me and hosting this webinar.
03:32It's a subject very close to my heart as well.
03:36And good morning, good afternoon
03:38and good evening, everyone on this webinar.
03:42Happy to be here and happy to discuss this subject as we progress through.
03:48All right.
03:49So, Vishal, just a disclaimer to our audience.
03:52We were going to have a very, very deep conversation
03:56about HeadSpin and technology and all of that.
03:59But then we chose to make it more of a fireside chat
04:01and share some of our experiences with our customers that we have had
04:05and how we dealt with it, how we helped them out
04:09and how success came in.
04:11So, to everyone who has joined in,
04:13this is going to be more of a fireside chat
04:15between HeadSpin and Capgemini.
04:18Vishal and I are going to talk about it
04:20and just try to set the expectations right.
04:23We'll have our talking session for the first 45 minutes,
04:26but we'll encourage a lot of comments.
04:28So, if you have any questions,
04:29you can post those comments in between as well.
04:32Some of them we'll try to answer during the call.
04:35Some of them we'll try to answer post our discussion.
04:42All right.
04:44So, Vishal, one of the key points that we have been seeing
04:51from the 2000s, when dot-com usage was booming,
04:54and then we had 2010 when the mobile came in
04:56and the applications were built, and Android shaped up.
05:00And then we saw something very different in 2016 onwards
05:03that the whole consumption of media,
05:07whole consumption of retail applications,
05:10any interaction digitally was moving towards mobile.
05:13And we have been seeing that trend growing
05:17and how it has been impacted from a business perspective,
05:21how it has impacted different companies
05:22from different segments.
05:24How have you seen the change happening
05:26in your over 25 years of experience?
05:31How have you seen that change evolve?
05:33Is it bigger than dot-com?
05:35Is it bigger than that?
05:36Or is it smaller than that?
05:38Or is it coming to the end?
05:41Or it's going to grow even further?
05:44One of the studies which I found recently
05:46was how mobile consumption has been evolving.
05:49And one of the reports says that this year
05:52we are estimating 4.8 billion mobile users
05:55and moving forward into 2025,
05:56it's estimated to be 5.28 billion.
05:59So what are your thoughts about it?
06:01Yeah, it's a tremendous evolution journey.
06:05I still remember, 25 years ago,
06:08when I first came here,
06:09I didn't even have a mobile device.
06:11Like I'm talking about 1995.
06:14Then I got my first mobile, which was an Ericsson phone,
06:17which was like a brick in your hand
06:19with very limited, just to call capabilities.
06:22It didn't even support a text at that time.
06:25And then over the period of time,
06:27now you look at it,
06:28once all the mobile devices,
06:30it's an integral part of basically our life.
06:33We start our day and end our day with that, right?
06:36It has changed.
06:38It has had a significant impact on human lives.
06:42And it kind of has changed, right?
06:45It has changed the ecosystem, how we communicate,
06:48how we shop, how we work, right?
06:50It has touched every single part of our life that way.
06:53So that evolution journey has been very significant.
06:57And as you see the last few years,
07:01the connected devices and wearable devices,
07:05tablets and phones,
07:07and so many different varieties of these
07:09have kept increasing
07:12and keep basically making a significant difference
07:15in how we live, how we communicate,
07:17how we do each and every aspect of our life, right?
07:20So it is very important and very close to me
07:23because this is going to continue to grow, right?
07:26The numbers that I think you were sharing earlier
07:29mainly focus on the mobile devices and tablets, right?
07:32But when you incorporate all the connected devices
07:36across the globe and connected device
07:38could be a healthcare devices, wearable devices.
07:42Just to give you a number,
07:43like in 2009, there were around 2.9 billion devices;
07:47that grew by 2024 to probably around 10 billion.
07:52And now basically by 2050,
07:55the projection is that it will be greater
07:57than a hundred billion devices.
07:58So it's not gonna go away.
08:02It's gonna continue to grow.
08:03It's gonna continue to basically,
08:08I'm actually excited to see what new technology
08:11and what new basically capability
08:16comes to the market in the next few years.
08:18But yeah, it has significantly impacted
08:23customer experience or user experience
08:27or how people do things in their life.
08:29And for the business landscape, how they do the business,
08:33how basically they bring the people together,
08:36how some of the businesses specifically run
08:40on the mobile devices nowadays.
08:42So it has a significant impact.
08:44And I see that it will continue to grow
08:48and move forward in the right direction.
08:51Absolutely, bang on.
08:53This reminds me of another point
08:55that was a great marketing campaign back in 2010
08:59when Nokia came in with a tagline called Connecting People.
09:03And that's what mobile has been doing.
09:05So jumping onto more of a contextual setting,
09:09how exactly AI and ML,
09:11which is the talk of the hour today,
09:13everybody talks AI and ML.
09:15And we have been also leveraging AI and ML
09:17in some capacity, data science in some capacities,
09:20machine learning models that HeadSpin has also built up.
09:23How do you see evolution of that transforming
09:27how the consumption experience
09:29of mobile technologies being utilized?
09:32How do you see strategies of leveraging
09:35more data science-oriented or data-oriented schemes
09:41in your testing methodologies
09:42or overall in mobile space?
09:45How do you see that trend moving?
09:47And what about the business side?
09:49A couple of business challenges from a customer side,
09:51if you could just highlight,
09:52that would be more for our audience to hear too.
09:59It again goes back to what we just discussed
10:01about the evolution of the mobile technology
10:04and connected devices, right?
10:06Similar way, like if you look at early years
10:09of this mobile,
10:11when we talk about testing, we test everything, right?
10:14We generally test everything, every aspect of it,
10:17but we are blindly doing the testing.
10:21Eventually some of the more efficient way
10:24or more optimized way was needed.
10:27So people started talking about,
10:28hey, rather than if I have a 10,000 test cases,
10:32rather than every single time
10:34me running a 10,000 test cases,
10:36let me run a small subset
10:38or let me run what is a relevant subset
10:40or let me run the subset which is impacted
10:44by the changes in what I have done, right?
10:47And if you look at it,
10:48all this decision making comes from the data, right?
10:53So people started a data-driven decision making
10:56and start looking at,
10:58hey, these are the areas
10:59where I have most number of defects.
11:01Let me run more tests in this area, right?
11:03Or let me improve my test cases.
11:05So they started making a data-driven decision.
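The test-selection idea described here, running the subset covering the most defect-prone areas rather than every test, can be sketched in a few lines. All names and data below are illustrative, not a HeadSpin API:

```python
# Hypothetical sketch of data-driven test selection: rank tests by the
# historical defect count of the area they cover, then keep a fixed budget.
def prioritize_tests(test_cases, defect_counts, budget):
    """Return the `budget` tests whose areas have the most historical defects."""
    ranked = sorted(
        test_cases,
        key=lambda tc: defect_counts.get(tc["area"], 0),
        reverse=True,
    )
    return ranked[:budget]

# Invented data: each test case tags the application area it covers.
tests = [
    {"id": "t1", "area": "checkout"},
    {"id": "t2", "area": "login"},
    {"id": "t3", "area": "search"},
    {"id": "t4", "area": "checkout"},
]
defects = {"checkout": 12, "login": 3, "search": 1}

subset = prioritize_tests(tests, defects, budget=2)
print([t["id"] for t in subset])  # the two checkout tests rank first
```

Real systems would also weight change impact and defect recency, but the ranking principle is the same.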
11:08And with the evolution of,
11:09it has progressed through more and more evolution
11:12where we started doing some analytics
11:15and things like that.
11:16And with the boom in AI and machine learning
11:20and everything, what it allows you to do
11:23is not just allows you to basically extract the insights
11:27that you are looking for
11:29if you have a good quality data set,
11:31but it also basically allows you to do a level of,
11:36it gives you an insight,
11:37which actually you might not be even looking for, right?
11:40When you go through this unsupervised learning
11:43insights from this models,
11:45you can find some of the patterns that you think,
11:47hey, I should have also done this or that, right?
11:50So it's a very important part of it.
11:52So I do see that having that data is very important
and AI and ML have improved our ability
12:01to basically understand or extract the insights
12:05from this data sets and allow us
12:08to do a better decision-making, right?
12:11And better decision-making could be based
12:13on the data available,
12:15but also better decision in terms
12:17of predictive analytics, right?
12:19And there are tools and technologies in the market.
12:22Like, for example,
12:24if I have done a pretty good job
12:25in terms of maintaining my data around my releases
12:30and also having basically full traceability
12:34between those data sets,
12:36depending on what changes I have made,
12:38I could literally predict that,
12:40that where I will find the defects,
12:43approximately how many defects I will find, right?
12:47And it allows me to basically also do the fine tuning
12:50on what quality I want to do
12:52and for to achieve that quality level,
12:54what are the things I need to test, right?
12:56So that predictive analytics
12:58and everything is very, very powerful.
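As a toy illustration of the predictive analytics described above, a release history linking change size to defects found can be fit with a simple least-squares line. The numbers and names are invented for illustration, not from any real release data:

```python
# Minimal sketch: fit a straight line to past releases (lines changed vs.
# defects found), then estimate defects for an upcoming change.
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (
        sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        / sum((x - mx) ** 2 for x in xs)
    )
    return slope, my - slope * mx

lines_changed = [200, 500, 800, 1200, 1500]   # per past release (invented)
defects_found = [4, 9, 15, 22, 28]            # per past release (invented)

slope, intercept = fit_line(lines_changed, defects_found)

def predict_defects(changed):
    return round(slope * changed + intercept)

print(predict_defects(1000))  # rough defect estimate for a 1000-line change
```

A production model would use many more features (areas touched, author history, traceability links), but this shows the shape of the idea.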
13:01Also with the help of AI and machine learning,
13:05there is lots of improvement in test automation,
13:08like intelligent test case generation,
13:11intelligent test automation generation,
13:13some of the optimization capabilities,
13:17some of the self-healing capabilities, right?
13:20Also you can do a faster and smarter testing
13:23with the prioritization and optimization
13:25across the board.
13:27Then you can kind of also look into the areas
13:30like performance, security, user experience
13:34and many more areas where this insight
13:37has allowed you to do a better, faster
13:42and more efficient way of testing,
13:44which saves significant time for an organization,
13:48which allows them to use that cost saving
13:50or that optimized time to focus
13:52on other high priority actions or items.
13:55I hope I answered your question.
13:57No, no, absolutely.
13:58This was absolutely very contextual
14:01and context setting because the next slide
14:03that I'm going to talk about is how HeadSpin brings
14:07the differentiator in the market,
14:09how we are handling it.
14:11So HeadSpin, you have been using it,
14:14so this is for the audience:
14:16HeadSpin is a data analytics, data science platform,
14:19which lets you run your user journeys
14:22before even the release on the right environment,
14:26which is the environment as close as a real user.
14:29So think about your test environment
14:31as close as your real user environment
14:33and then you would run your user journeys pre-release.
14:35And then we have contextualized
14:38or we have put in the right APIs
14:41to make your experience more objective
14:44than being subjective.
14:46So what we do, Vishal and all the others,
14:49what we do is we let you run your pre-release
14:52and post-release applications on HeadSpin
14:54infrastructure.
14:55And when you do that,
14:57we analyze each and every section of your performance
15:02on different parameters.
15:04You know, these could be,
15:06these could be, you know, operating system,
15:09or it could be packet level detail
15:11or it could be, you know, UI level
15:15or it could be, you know, anything
15:16which is making your consumer
15:21not having a great experience.
15:22Just imagine if you're using one application
15:25which loads up a little late.
15:28Then your test case is passed,
15:29people go to your test case,
15:30Hedgspin analyzes every test case of yours,
15:34every user journeys of yours
15:35and tells you how much you want to go to the market
15:40and use, you know, the 560 models
15:44that were built before.
15:50Now these ML models are, you know,
15:52built to find out issues, and this brings me
15:55to one of the slides that is here.
15:59I would want you to look at the centerpiece
16:01of the slide, where every test journey
16:04that you run on HeadSpin is actually analyzed.
16:09Are you able to hear me now?
16:11Is it breaking for a bit?
16:17Hello.
16:22Did we have a little bit of a break?
16:28I mean, at least I saw
16:30that the voice was breaking a little bit.
16:33I'm not sure if the other...
16:34Oh, wow.
16:34Is it still breaking for us?
16:38This is better now.
16:42So this is exactly the kind of issue
16:45I'm talking about, right?
16:47The software is right.
16:49Everything is right.
16:50But because of my network conditions,
16:52you guys cannot hear me so well.
16:56This is a real example
16:57of having, you know, a poor experience, right?
17:02So Hedgspin identifies
17:03such experience pre-production.
17:06So every time, you know
17:07whether your build is ready to go into market or not,
17:11you always know
17:12what the engineering needs are.
17:16Even if it passes,
17:18you have the recording of the screen
17:21you know, along with the data KPIs,
17:23which are 130 data KPIs we capture.
17:26So per sub-second basis,
17:27you can imagine if I collect 130 data KPIs,
17:30for a second, we do, you know, 13,000.
17:33And if your user journey goes
17:35for five minutes or 10 minutes,
17:36which is very usual today,
17:37then it's a plethora of data
17:39that we are creating out of it.
17:40This data is then, you know,
17:44it goes to the engine that we have.
17:47And these ML models are basically
17:51designed to find such glitches.
17:53And it not only finds out those glitches,
17:55but also tells you how to rectify them
17:57where the problem lies.
17:58So your RCA, your root cause analysis, is basically not days, but seconds.
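A minimal, hedged sketch of the underlying idea of flagging glitches in a KPI stream: mark samples that deviate sharply from the baseline. This is a generic z-score check with invented numbers, not HeadSpin's actual ML models:

```python
# Flag KPI samples that sit far outside the distribution of the run.
from statistics import mean, stdev

def find_glitches(samples, threshold=3.0):
    """Return indices of samples more than `threshold` stddevs from the mean."""
    if len(samples) < 2:
        return []
    mu, sigma = mean(samples), stdev(samples)
    return [i for i, s in enumerate(samples) if abs(s - mu) > threshold * sigma]

# Invented example: app launch times in ms, with one obvious spike.
launch_ms = [310, 305, 298, 312, 307, 1450, 301, 309]
print(find_glitches(launch_ms, threshold=2.0))  # flags the 1450 ms spike
```

With per-sub-second KPI capture, a check like this runs per KPI per journey; the real pipeline layers trained models on top of such statistics.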
18:03This is what we do.
18:04We also enable you to benchmark your experience
18:07over different dashboards.
18:09So your business, your QA, your program manager,
18:12your product manager, as well as your CIOs and CTOs
18:15can use the same data
18:18and see a different picture for themselves.
18:20So we give you alerts and watchers
18:22every time it fails for your user.
18:24So very proactively,
18:26you will know exactly where it's from
18:28in production as well.
18:29And all of this can be done manually
18:31or through automation on a variety of endpoints.
18:34These endpoints could be, you know, mobile,
18:36tablet, and all the connected devices
18:38that Vishal, you just talked about.
18:39We do all of it remotely.
18:41So you're sitting in Los Angeles today,
18:43can operate a device in Bangalore,
18:46and I sitting in Bangalore can actually operate a device
18:48in California right at the moment with zero glitch.
18:52This is what we do.
18:59So Vishal, you've been working with a variety of customers
19:04and with your experience of 25 years so far,
19:09what do you think are the new challenges
19:12that organizations are facing in terms of testing
19:15while mobile is the new one coming up
19:17and the consumption is so high?
19:18What do you see as challenges
19:20and what are the role of AIML models
19:22in addressing these challenges?
19:24It is more contextual now
19:25for the audience to, you know, know now.
19:30So there are challenges;
19:34mobile device testing
19:36is one part of, basically, the focus area, right?
19:41But when you talk about a larger ecosystem,
19:44there are separate set of additional challenges
19:47also when we talk about the testing, right?
19:50Ideally, means from the business standpoint,
19:53if you look at it,
19:56most of the time,
19:57when you talk about the business goals,
19:59it always translates into cost, quality and time.
20:03And nowadays, probably we could also add
20:06experience into it as well, right?
20:09So if we are talking about achieving the business goal,
20:13that's how I can improve the quality,
20:16how I can ship or release the qualitative product,
20:19which is very important,
20:21how I can do it faster
20:22and how I can do it in a cost efficient manner, right?
20:26And yes, mobile testing and everything
20:29is always gonna have a cost associated with that.
20:33But if you are doing it at that scale,
20:35then the benefit that you get are exponential, right?
20:39When you talk about the challenges,
20:41if you look at the ecosystem, right?
20:43For specifically mobile devices,
20:46I'm not talking in general with testing right now,
20:48but like for example, device fragmentation, right?
20:52Variety of devices across the board
20:54when you talk about,
20:55let's just take an example of only mobile devices
20:58and tablets,
20:59not going to the IoT connected devices
21:02or any other devices.
21:04Also, each and every device
21:06has a different operating system
21:08that they might be using,
21:09whether it's an Android,
21:10whether it's an iOS,
21:12whether it's something else, right?
21:14And also the different platforms within that, right?
21:17Then you have a network variety.
21:20Some devices work on a Wi-Fi,
21:22some are basically 2G, 3G, 4G, 5G compatible, right?
21:27So you need to basically think about that as well,
21:30because when you are testing the devices,
21:32performance testing is also one part of it
21:38because some of this network variety
21:42impacts how devices will perform, right?
21:45So you need to focus on that.
21:48These devices are integrated
21:49with multiple applications,
21:51different backend systems.
21:53So integration testing also introduces a set of challenges.
21:57Then when we talk about performance,
21:59it's again simulating basically those network conditions
22:03and things like that.
22:04Performance, when we talk about it,
22:06also introduces the notion of battery,
22:09memory, and things like that.
22:10So some of the hardware configuration is very important.
22:14Then the security aspect is another one on the devices as well,
22:18because if you are using these devices
22:20in a corporate setting,
22:21corporations are more strictly enforcing
22:25different security policies onto them and things like that.
22:29Then also look at the ecosystem, right?
22:31Nowadays, I was just re-watching something yesterday
22:36and there was a Tesla Model 3 example
22:39somebody gave me where it has some
22:41brake-related performance problem,
22:43and it wasn't braking in the required timeframe,
22:47and all Tesla needed to do was the software update.
22:51It's a kind of a connected device story.
22:53But if you look at it,
22:54the reason why I brought that example
22:56because in mobile devices,
22:57how many updates and patches we get, right?
23:00Most of the time it comes.
23:02So now testing with those updates and patches
23:05also introduces more challenges
23:06because they come more often, right?
23:10When I used to work for Microsoft,
23:11we used to release the product every 18 months or 20 months.
23:15Nowadays, people do multiple releases
23:18in a day, right?
23:19So ecosystem has changed, right?
23:21Then user experiences, real devices,
23:24all introduce the challenges.
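The fragmentation challenges listed above (devices, OS versions, network varieties) are commonly tackled by generating a test matrix of valid combinations. A minimal sketch with invented entries, not any vendor's device list:

```python
# Build a device/OS/network test matrix, filtering out impossible pairs.
from itertools import product

devices = ["Pixel 8", "iPhone 15", "Galaxy Tab S9"]
os_versions = ["Android 14", "iOS 17"]
networks = ["Wi-Fi", "4G", "5G"]

def valid(device, os_name):
    """Only iPhones run iOS; everything else here runs Android."""
    return ("iPhone" in device) == os_name.startswith("iOS")

matrix = [
    (d, o, n)
    for d, o, n in product(devices, os_versions, networks)
    if valid(d, o)
]
print(len(matrix))  # 9 valid device/OS/network combinations
```

Each tuple then becomes one parallel automated run, which is exactly where automation and parallel execution, discussed next, come in.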
23:29Absolutely.
23:29And in fact, contributing to that point,
23:31every single release that you put out,
23:33even if it's a smaller release or a large release
23:35or a big release, right?
23:36You have to do a lot of regression,
23:38making sure that your old functionalities
23:40are not going bad, right?
23:41And that's why we see a lot of AI and ML coming in,
23:44automation coming in,
23:45making sure that you identify and make sure
23:47that your application health is secure.
23:50So if you look at this, right?
23:51Means the reason why I listed all these challenges
23:54because it's not just the one thing
23:56that you need to test.
23:59When you talk about a mobile testing,
24:01it's not just putting the mobile in your hand
24:03and start testing.
24:04You are testing so many different variety of things, right?
24:08That's where the automation comes into the picture.
24:10That's where the parallel execution
24:12comes into the picture.
24:14Some of these capabilities,
24:15which basically Headspin provides as well.
24:17That is why, for one of the clients,
24:20when you and I started working,
24:21we chose that, because it enables us
24:25to do all this type of testing
24:27in an automated manner, right?
24:29And AI plays a very critical role
24:31because think about doing manual testing by hand,
24:34but now I need to run 10 different types of tests, right?
24:39Yes, I might have an automation and things like that,
24:41but it will take a significant amount of time.
24:45So I want to do it in a more intelligent way,
24:47more smarter way, in a more optimized way.
24:50And that's where AI and machine learning model helps me
24:54because what it does is that it allows me
24:57to analyze the data, data from past releases,
25:00data from the changes I'm making,
25:02data from all the previous executions
25:05and things like that, right?
25:07Based on that data, it gives me more insights
25:10into what I should be testing,
25:12how much testing I need to do, right?
25:15It just depends on what model you have developed,
25:18but what I need to do,
25:20what basically the level of compatibility testing
25:24I need to do across the OS, different platforms,
25:27different browsers, different type of devices, right?
25:30So AI and machine learning is very,
25:34very important in many ways.
25:36One, it allows me to basically do
25:39the insight-driven testing.
25:41It allows me to improve my quality
25:45by letting me know that which areas I need to focus on.
25:48It allows me to basically optimize my test
25:51to basically tell me that how quickly
25:53and smartly I can get that done, right?
25:56So, and also it allows me to basically identify
25:59the areas where I should be focusing on more, right?
26:03Which is very important part of it.
26:05Like, for example, we have production logs,
26:08for which we design and develop a model which analyzes them,
26:11and you will see that where exactly your errors are
26:15or where exactly the challenges are,
26:17which are logged, which your customer is facing,
26:19so you can improve your test case areas
26:22and things like that, right?
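A toy version of the production-log analysis just described: count errors per application area to see where customers are hitting problems, and use that to guide where test cases are added. The log format and area names are invented for illustration:

```python
# Rank application areas by how many production errors they log.
from collections import Counter

logs = [
    "ERROR checkout payment timeout",
    "INFO login ok",
    "ERROR checkout card declined",
    "ERROR search index stale",
    "INFO search ok",
]

# Assumes the invented format "<LEVEL> <area> <message...>".
error_areas = Counter(
    line.split()[1] for line in logs if line.startswith("ERROR")
)
print(error_areas.most_common(1))  # [('checkout', 2)] -> focus tests here
```

A real model would parse structured logs and cluster error messages, but the feedback loop from production errors back into test coverage is the same.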
26:23So I do see that the AI and machine learning based,
26:28basically, data insights that you capture
26:33will allow you to do a better job
26:36in terms of your mobile testing
26:38or any type of testing,
26:40but especially when it comes to mobile,
26:42it's significantly helpful.
26:46Absolutely, Vishal.
26:47And this is exactly the point you just touched upon.
26:49It's not just the testers anymore,
26:51it's the product engineers,
26:53the monitoring comes in,
26:54the logs capturing comes in,
26:56your own production mapping comes in.
26:58And this is exactly how we help organizations today, Vishal.
27:03So readiness to go out in the market is pre-production
27:06and utilization comes from all the four,
27:08five different pillars of your complete agile cycle,
27:13your DevOps cycle, right?
27:15We help them do all the functionality tests,
27:18but when you do the functionality test,
27:20we run advanced data analytics on the platform
27:22and tell you identified areas of improvement
27:25and tell you how good or bad your build is to go out.
27:28But when it goes out,
27:29we do a lot of proactive monitoring
27:31on your production environment as well.
27:33So just like your 100,000 or multiple 100,000 users,
27:38we also have some of the simulated users out there
27:42who continuously use it on a regression basis.
27:48So there are continuous number of scripts
27:50which are running on your application
27:53and it's continuously running.
27:54So if something fails for your actual customer,
27:57it fails for Headspin also.
27:58And Headspin will tell you very proactively
28:00the areas where the performance has degraded,
28:05the areas where it may fall short
28:06or is falling short, and how you can optimize that.
28:09So it's very proactive measure that we are doing.
28:12In my politics program for a retail customer, right?
28:16This is the kind of value we bring in.
28:19And with the, I mean,
28:22we have covered a lot of these pointers already in place.
28:24So the future of mobile application testing
28:27is what we just talked about, Vishal,
28:29in the last point as well.
28:30So I just move on so that people can ask questions.
28:34You know, how can we have...
28:38I would like to add one more.
28:39Go ahead, Vishal.
28:40...to that, right?
28:41Means we hear lots of terms, right?
28:43You know, the risk-based testing
28:45or data-driven testing
28:46or self-healing part of it, right?
28:50But think about this, right?
28:52When we talk about AI, machine learning models
28:55and now the capability of generative AI, right?
28:59Yes, it's not gonna be that mature initially.
29:02It takes time.
29:03It takes basically a good amount of time,
29:07good data set and good basically enhancements
29:11that you need to make in your model
29:13to generate a more efficient model,
29:15which gives you the efficacy is very high, right?
29:19So an accuracy is very high.
29:22But think about this,
29:23that the future where you have an engine
29:28where you just provide the requirements
29:31and from that requirement,
29:33leveraging all the past data and everything
29:36and using a generative AI,
29:38it can create a test cases.
29:40It can create a test data.
29:42It can create a test automation
29:44and you have encapsulated everything
29:46in an automated engine.
29:49Everything can be done on its own
29:51and introducing self-healing,
29:54and this is not the reality right now.
29:56So I will be very clear on that.
29:59But some component exists today, right?
30:02At any, and I do see that
30:04that as we move into a future,
30:07more and more automation,
30:09more and more optimization,
30:11more and more insight-driven testing will happen
30:15where most of them will be automated.
30:18Still human intervention is required
30:20when it comes to providing supervision
30:23and kind of also supporting the learning aspect
30:29of those models and things like that.
30:31But I think that it's an exciting new world
30:35and it's gonna keep improving
30:36and keep getting better.
30:38And that's where some of these technologies,
30:40like for example, what you have
30:42in terms of providing the capability
30:45to do the testing on a real-time devices,
30:48and your basically engine,
30:51analytics engine in the background,
30:53which basically captures the data
30:54and provides those insights
30:56to do the better testing.
30:57I think I'm pretty sure
30:58that you guys might have a roadmap
31:00to keep adding more and more features
31:02as we move forward,
31:03which will help organizations.
31:05But it's an exciting new world
31:07and it's gonna keep improving
31:09and keep getting better.
31:13Absolutely, Vishal.
31:14Vishal, we are actually committed
31:17to making our platform much more stable.
31:20That's priority one.
31:22We have already built up a platform
31:24which is the best in the market today.
31:26We have made the most secured platform
31:28and hence we were the choice
31:30where we came in picture
31:31and we talked about it.
31:32So that's one.
31:34We are working on a couple of prototypes as well.
31:36And the coming months,
31:38when I'm talking,
31:39I'm not talking very far off.
31:40I'm talking Q1 25.
31:42We're having some major releases coming out.
31:46We are having some exciting news coming out.
31:48So I'll keep you posted on that, right?
31:51So I'll make sure that
31:52you are the first one to be enabled on it.
31:54And we use all of these features
31:56for your customers as well.
32:00Vishal, I wanted to touch upon a couple of things. While you talked about real device inventory, HeadSpin actually has only bare-metal devices. We do not have any emulators or simulators. So we make sure that all of these devices are as good as a real user environment.
32:23We have these physical phones out there
32:25which are robust,
32:26the kind of device coverage is immense.
32:28And then we do a lot of routine health checks on these devices, making sure that uptime is high and that backup and infrastructure are in place on the data-center side. Downtime is negligible, and the devices perform at their peak.
32:50One of the things, implementing the tech and choosing the right technology framework, is what you've already answered.
32:58So I'll just jump on to the security aspects,
33:00Vishal, because this is very important
33:02for all the utility customers,
33:03for all the enterprise customers,
33:05for all the banks out there, right?
33:07They do not want to let any of their applications go out into the market prematurely, because these are very secure applications dealing with personal data, transactions, payments and whatnot.
33:21And we have been working
33:23with a couple of banks in India,
33:24we have been working with the largest banks
33:28out there in the world as well.
33:30They have been leveraging HeadSpin as a platform for testing: they do a lot of pass/fail runs on different device farms, but when it comes to the maturity of their application, they come to HeadSpin.
33:40So I want to hear from you
33:41because you're a first user on it, right?
33:44So what are the things
33:45that you felt were more important
33:47and why Headspin stood out
33:49out of, you know,
33:49all the solutions out there for you?
33:52If you could just elaborate: what is the need for security in today's world from a testing perspective, and how do you see HeadSpin filling that gap?
34:02No, thank you for that question.
34:03That is a very important area
34:06when it comes to the corporate customers, right?
34:08And also the sensitivity of that
34:11basically depends on the industry
34:13that you are in, right?
34:15Security is the utmost priority
34:17for any organization out there,
34:20but then there are certain domains where it becomes an even higher priority. Security is always a high priority, but in some of those domains it is not just one of the requirements or one of the priorities; it is simply the way of doing business, right?
34:41Like, for example, utility customers, one of my customers where you and I got engaged: they are governed by certain regulations and need to be in regulatory compliance and things like that.
34:55So there are lots of different things, right?
34:57Especially when you talk about utilities, they are basically a target for any type of terrorism or cybersecurity attack and things like that, because they support day-to-day life; they support the infrastructure we leverage every day, whether it's electricity or gas and things like that, right?
35:18So they protect their network very, very well.
35:22The cybersecurity aspect, and any other security aspect, is very important for them.
35:30Like, for example, for this customer, we need to make sure that anytime we do something or bring in new technologies, we do multiple internal reviews with the enterprise architecture teams and multiple reviews with the cybersecurity team.
35:45And you've been engaging some of those, right?
35:48In fact, you have the solution both on-premise and in the cloud, and we chose to go with on-premise because we cannot put data in the cloud, right?
35:58So some of those challenges are there as well.
36:02And after reviewing all of that, HeadSpin was basically the right choice for us, because it allows us to fulfill all those security requirements that we needed to fulfill, right?
36:16One of our prime requirements was to make sure that security compliance is there, and we have an on-premise solution. The solution allows us to do concurrent testing, and it allows us to do the testing on real devices, right?
36:36So when we looked at all these parameters, it was a no-brainer to leverage the technology.
36:44And that's where we started the journey.
36:46And now, as you see, we have already implemented it and the team is using it. We started at a smaller scale, and I do see that as we move forward and the requirements progress, the scale is going to be much higher, because our client uses mobile technologies, especially in the field, and in-house with their employees as well.
37:14So I see no reason why it can't expand.
37:18But answering your question, those were the reasons why we chose HeadSpin, and it's working out well for us.
37:28Hey, thanks.
37:29And a follow-up question, right?
37:32So how difficult was it
37:35once we started up?
37:37How easy or difficult was it for all of us
37:40to get the implementation done,
37:41to get it up and running?
37:46Yeah, sorry.
37:47I'm so sorry.
37:49Yeah, so how was your experience enabling and getting onboarded on HeadSpin and getting this inside the ecosystem for customers? And eventually, how did it come together? Your overall experience, from a HeadSpin perspective and a support perspective: we would just want to hear the feedback in public for the first time.
38:08Yeah.
38:09Well, you don't want me to do that.
38:10I'm just kidding.
38:11No, it has been a pleasant experience.
38:15Yes, there were some delays and things like that, but I don't think that was related to HeadSpin. Even though we had gone through the architecture review and internal security reviews, one issue was that for the physical devices there were further requirements, such as doing penetration testing and things like that. So we needed to navigate through the client ecosystem, which took us a little bit of time in setting it up.
38:42But from the HeadSpin standpoint, we had everything that we needed in terms of devices, hardware, support, and all the other requirements.
39:00I think one area, and you guys may already have this, is that when we jump-started it, we needed to move at a much faster pace, right? That was a requirement we did not set expectations for in the beginning. And people's calendars always get full, right? So I do understand that. It took us like two or three weeks to get at least a few things settled, but that was expected in many ways.
39:34But overall, I think
39:35it's pretty straightforward.
39:39I think one area I would suggest, or maybe it's just feedback, is that when the client, especially the customer that I have, needs to align with their internal processes, some level of self-serve guidance would be more helpful to them, because they want to experiment with things.
40:02But overall, I think it has been
40:04a great experience,
40:05great partnership across the board.
40:07And the customer is pleased.
40:10We are actually making
40:11a good progress with that.
40:13And things are going
40:14in the right direction for now.
40:17Hey, thank you.
40:18Thank you so much for the comments, Vishal.
40:20In fact, a couple of learnings that we have had as well have been incorporated, and we have made tremendous changes out of those learnings.
40:29Jumping onto one of the questions
40:30that Ashish has asked us,
40:32it's a very interesting question
40:33that I would like to go first
40:35and then would seek your comments as well.
40:38And then we'll open up
40:39for the audience to ask questions.
40:42The question is: can you please share some insights on how AI tools are not infallible? AI can also make mistakes and produce false-positive and false-negative scenarios, which may end up requiring manual validation in mobile testing as well.
41:03So, you know, I'll answer this
41:05from a coverage perspective, Ashish.
41:07One, these AI/ML models have been built up over many, many years. On HeadSpin, we have billions of executions happening every month, right? And how AI really evolves is through learning: when you learn more and execute more, your hypothesis improves. You create a hypothesis, train it against a lot of run data, then you have validation data and test data, and the graph changes, the hypothesis changes. That's the AI/ML model we are talking about.
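[Editor's note] The hypothesis-train-validate-test loop described here can be sketched in a few lines. This is a hypothetical toy example (the latency data, the threshold "hypothesis", and the split ratios are all invented for illustration), not HeadSpin's actual pipeline:

```python
# Toy sketch of a hypothesis/train/validation/test cycle -- illustrative only.
import random

random.seed(0)

# Fake labelled runs: (measured_latency_ms, is_bad_experience 0/1).
# "Bad" runs are drawn around 200 ms, "good" runs around 80 ms.
runs = [(random.gauss(200 if bad else 80, 20), bad) for bad in [0, 1] * 500]
random.shuffle(runs)

# Split the data: 60% train, 20% validation, 20% held-out test.
n = len(runs)
train = runs[: int(0.6 * n)]
val = runs[int(0.6 * n) : int(0.8 * n)]
test = runs[int(0.8 * n) :]

def accuracy(threshold, data):
    """Fraction of runs the latency threshold classifies correctly."""
    return sum((x > threshold) == bad for x, bad in data) / len(data)

# The "hypothesis" is a latency cutoff. Fit it on train data, confirm the
# choice on validation data, and report final quality on the test data.
candidates = range(50, 250, 10)
best = max(candidates, key=lambda t: accuracy(t, train))
print("chosen threshold (ms):", best)
print("validation accuracy:", round(accuracy(best, val), 2))
print("test accuracy:", round(accuracy(best, test), 2))
```

As more run data accumulates, the fitted hypothesis shifts, which is the "graph changes, hypothesis changes" cycle the speaker describes.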
41:45So HeadSpin has been helping, from day one, the biggest trendsetters of experience. I can just mention the likes of the Metas and the Googles, the companies who have been changing and reshaping the world; we have been helping them from the beginning. So we really have the experience of making our AI models very, very strong so they don't do that.
42:10But yes, accepted: there could be chances that, out of 100,000 test cases, five fail and then require manual intervention as well.
42:21But can you imagine running 100,000 test cases manually versus doing five manually? That's what we're talking about, right?
42:30It saves you a lot of money.
42:32It saves you a lot of effort.
42:33It saves you time, because if you cannot get a release out tomorrow, you might lose the business, right?
42:41So that's what we are
42:42talking about, Ashish.
42:44And yes, there are scenarios where you may run one of your test journeys. What you're seeing on the screen is actually a normal user journey: you open up the app, select some items for the cart, and then you check out, right? The AI engine is running all the analytics and telling you what's good or what's bad.
43:01But your test flow, which is "I'll click on one, click on two, click on three, then type in four, and then do the transaction," remains the same. So while your pass and fail remain the same, if it fails, we tell you why it failed, and we'll also tell you how good or bad it was.
43:18So for one functional test case that you run, we run about 60-65 test cases on your behalf, and then tell you how good or bad it is, right?
43:28So Ashish, I hope this answers your question.
43:31And, you know, anybody,
43:33now the room is open
43:34for the next 15 minutes.
43:35Anybody who has any questions,
43:37feel free to ask.
43:39And we'll be happy to answer them. And if somebody is posting on LinkedIn, I will request the coordinator to pose those questions here as well.
44:02So, yeah, I mean,
44:03for the next 15 minutes,
44:04please feel free to type in your questions.
44:08All right.
44:09Vishal, by the time people ask this question,
44:15do you have some questions for us?
44:17Yeah, no.
44:18On what you just mentioned, and what Ashish was asking earlier: there are two key things to understand here when we talk about AI and machine learning. The first is that the data part is very important. It's garbage in, garbage out: if you have garbage data, your AI models are not going to give you better results. So having a good data set, and, as we progress into the future, how qualitative and how sanitized that data is, is very important.
44:54The second part is the accuracy and efficacy of the model, and how those models are trained on that qualitative data. You are not necessarily going to have the best results to start with; maybe the accuracy of your model is 60% or 70%, right?
45:14The more you use it, and the more you go through training cycles for the model and things like that, the better and better results you will get, right? Because it goes through a self-learning mechanism as well.
45:31So AI alone is not going to solve it. Having an AI or machine learning model doesn't mean it does everything you want right from the start.
45:39Currently we are in a phase, and this is just my personal opinion, so I will not put it forward as the way of life. But I believe that when we talk about AI and machine learning today, we are at a stage in that AI journey, short of what you called the Nirvana state: we are in the prioritization and optimization phase, right?
46:05For example, let's say you have a new project and you need to generate test cases. You use generative AI to generate the test cases based on your requirements.
46:17Now, traditionally, if you need to write 5,000 test cases, it would have taken you maybe a couple of weeks, and I'm just making that up, right? Depending on what you need to do, it could even take months.
46:30But if I have a better model and better-written use cases, my AI model can write those test cases for me within minutes. So I save a significant amount of time.
46:45Let's set a benchmark for that: say it took me a month to write 5,000 test cases; it might take more, let's say two months, right? Here, theoretically, I'm generating those test cases in a few minutes, let's say half an hour. I got all the test cases I needed, right? You have already saved two months of time, because things are done in half a day.
47:10And, I'm just making this number up as well: let's say the test cases you generated are only 50% valid, and the other 50% are not. Now you spend more time reviewing them, bridging the gap, and making course corrections, which will take, let's say, another 29 and a half days, right? And I'm using this number for a reason. So: 29 and a half days to review and correct those test cases, half a day to generate them, still a month.
47:48You have already saved a month of time, right? So yes, right now you might not have the highest level of accuracy, where you can say it's one click and done. But you will get a better-optimized result, and an AI model will sometimes give you a result you had not even thought about, because it sees those patterns. So it's always helpful.
48:12And as we keep using it and the model gets more mature, the accuracy will continue to increase. Where you were spending 29 and a half days, maybe next time you will spend 25, then 20, then 10, and it just keeps improving.
48:32I hope that answers your question.
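[Editor's note] The back-of-the-envelope numbers from this exchange can be laid out explicitly. The figures below are the illustrative ones used in the discussion, not benchmarks:

```python
# Illustrative arithmetic from the discussion above -- not a benchmark.
manual_days = 60        # roughly two months to hand-write 5,000 test cases
generate_days = 0.5     # generative AI drafts them in about half a day
review_days = 29.5      # reviewing/correcting the ~50% that need fixes

ai_assisted_days = generate_days + review_days   # 30.0 days, about a month
saved_days = manual_days - ai_assisted_days      # 30.0 days saved

print(f"AI-assisted total: {ai_assisted_days} days")
print(f"Time saved:        {saved_days} days")
```

As the speaker notes, if the review time shrinks from 29.5 days toward 20 or 10 as the model matures, the same arithmetic pushes the savings toward the full two months.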
48:42Yeah, absolutely.
48:45That was on point.
48:47We have a couple of more questions.
48:50One is: how does the HeadSpin platform differentiate itself from other mobile testing solutions already in the market?
48:58That's a question to me,
48:59rather than to you,
49:01but you being a customer,
49:03I'll be happy to hear
49:05your opinion out as well.
49:07So, from a positioning perspective, a feature comparison, a deployment comparison: we do on-premise and we do cloud. We have more data centers than anybody else today; we operate about 65 data centers around the globe. We have a much more stable platform in terms of device availability, making sure that your runs are smoother.
49:31And then we have the smart analytics that we have just been talking about, right? So the differentiator is the AI/ML models: AI detecting issues on your behalf while your tester is only doing the functional testing. That's where we differentiate.
49:49If you want to sum up how we differentiate: we are basically device farms, testing, and AI/ML on steroids, all together. We don't do just one thing; we do multiple things. That's how we do it.
50:05Another interesting question
50:07that has come is,
50:08how do you see the future
50:09of mobile testing evolving
50:10with advancements in AI
50:12and machine learning?
50:13Yes, just the way Vishal talked about it, right? We saw how AI has been used to identify issues. There have been self-healing scripts. There has been a lot of change happening in the mobile space.
50:30Earlier, the websites and the dot-coms were the priority for the market, along with the thick applications and the packaged applications, the SAPs and the Salesforces of the world. But all of this is now coming to mobile.
50:42Now, with all the consumption moving to mobile, there will be a lot of focus from different players out there on shaping how mobile apps could be lighter, load up faster, and make consumption much, much better.
50:59And then, with the new technologies of image generation coming into the picture and how that has been shaping up, we will have to do a lot of testing, and we have to be more responsible with AI and ML, right? Because there are a lot of deepfakes coming in, we have to be more cognizant of all these things.
51:18So yes, nobody can predict the future from today, this is what I would say, but it's going to be beautiful.
51:25That's what we believe in.
51:27I think Rajeev has asked a question: how do we typically justify investment in an intelligent, AI-backed solution to a client who is... yeah, I have to scroll a little bit... who is at their initial maturity stage? That's a great point to bring up, Rajeev, but I'll just try to cover it here.
51:56To answer that question: every company will be at a different point in the maturity model. Some would be just building up the application. Some could be just building up the automation regression suite. Some would be at a different stage, making experience a priority.
52:14So there are some intelligent companies which do all of this hand in hand: they make sure their priority is experience while they do other things as well. This is where we come into the picture. We fit at every stage of the maturity model; we are the platform for all of it. Even when you are just developing the scripts, we are the right platform to let you run the automation while also being very cognizant of the experience side of the equation.
52:43From an investment perspective as well, just think about it: any decent-size enterprise retail company today would be spending double-digit millions every year, because that's the least they have to spend. What we do is move their CapEx to OpEx, right? A lot of bug detection happens very quickly, and they do not have to invest in a lot of inventory, people, regions, or creating different labs.
53:14One of the examples I talk about: there's a retail customer in the US, a common customer of Vishal's team and HeadSpin. They had a program where they wanted to launch 600 retail stores, and all 600 retail stores would be using mobile POS solutions, right?
53:33So that's a huge program. And HeadSpin came in as one of the priority solutions to do it faster and cheaper, and to make sure that, from a services component, it saved them a lot of money. So we were not just a device farm to them. We were the chosen solution because they could not test in 600 stores; they would have to create a simulated lab with the right environment. We created a mini device farm within their own premises, and all the testing was done remotely from India. The whole mobile program ran for two years, and they were able to launch all 600 stores with the mobile application out there with HeadSpin.
54:16This is the power of HeadSpin we are talking about. We're not talking about only Android or only iOS. We're talking about POS devices, a space where you get very few developers and the adaptability is very different. No player in the market, in fact, does Zebra device testing as well as we do. So this is how we differentiate ourselves.
54:43One other question, Vishal: how do we support UI/UX testing? Again, a question for me, but would you want to chime in and take this one? You have seen so many demos from me.
54:55Yeah, I mean, it just depends on the technology that you are using, right? When you talk about UI and UX testing, HeadSpin provides the infrastructure and environment where you can do the testing, but which other technology you use in collaboration with HeadSpin to support that is critical. In our case, we use Tricentis Tosca to do that, and the combination has worked very well for us.
55:29So you need to figure out what your customer's needs are, what their priorities are, and what tools they have in their ecosystem. If they don't already have the tools, then what is their appetite for investment, and for open source versus commercial products and things like that? There are lots of considerations you need to weigh.
55:53And this consideration aligns with what Rajeev was asking earlier about investment: it's a similar thought process, where you need to explore what tangible and intangible benefits you are looking for, or what tangible and intangible value you want to capture out of this, right?
56:20For example, tangible means: what is my IRR or ROI? What ROI am I going to get out of making this investment in this specific project? Intangible will be how it allows me to support my business goals, and how it allows me to achieve better results for the annual business goals I have across the board, and things like that.
56:44So you need to create the right business case, with the right justification, where you can showcase all these benefits: the return on investment plus the additional benefits. And once you have the right story, it becomes much easier to get the approval, because if you have the right level of ROI, I don't think leadership will have any reason not to approve it.
57:12Absolutely.
57:14So Vishal, this brings us to the last minute of the session. I did not realize that we had already hit an hour; such an interesting conversation.
57:24And we have had experience of working together.
57:27And it is always amazing to speak to you, Vishal,
57:33listening to your insights
57:34and listening to your user stories,
57:36your customer stories.
57:38And it's like talking to an institution. Moreover, you're like a mentor to me, and for HeadSpin you're one of the guiding stars in the partnership.
57:49Thank you so much for all the support
57:52and trust in the technology that we have built up.
57:55And we'll be keen on listening to your feedback more and more often, Vishal.
58:01So over to you for a closing statement, Vishal. We want to conclude so that we can let people go and close the session, but we would want to close with your closing remarks rather than mine.
58:15No, so first of all, thank you for inviting me
58:18and thank you everyone who has joined the call today.
58:22It's a very interesting area
58:25and I think that it's not going anywhere.
58:28It's going to keep growing.
58:29I think that if mobile devices are part of their business, everybody needs to seriously start putting together a strategy and plan for that.
58:40I know we had an hour.
58:42If anybody has any questions, I'm happy to answer offline or connect.
58:46Please do reach out, happy to do so.
58:49But again, thank you, Kishan, for inviting me
58:52and for this wonderful session.
58:57You know, I'm so humbled right now. It was a pleasure for us to have you here, Vishal. Thank you so much, Vishal.
59:06Thank you so much, audience.
59:07Thank you so much, everybody,
59:09spending an hour with us,
59:11having all these conversations.
59:13And we would love to come back with a couple of quick, good stories, with a lot of names which we could not mention today. We'll bring a lot more stories to you moving forward.
59:26All right.
59:27Thank you so much, everyone, for joining in.
59:29This is Kishan Tiwari
59:30and you can reach out to me and Vishal anytime on LinkedIn, and we'll be happy to guide you and help you out.
