


Innodata Inc. (NASDAQ:INOD) Q2 2023 Earnings Call Transcript August 10, 2023

Operator: Good day, everyone, and welcome to Innodata’s Second Quarter 2023 Earnings Call. At this time, all participants are in a listen-only mode. A question-and-answer session will follow the formal presentation. [Operator Instructions] It is now my pleasure to turn the floor over to your host, Amy Agress. Ma’am, the floor is yours.

Amy Agress: Thank you, Matthew. Good afternoon, everyone. Thank you for joining us today. Our speakers today are Jack Abuhoff, CEO of Innodata; and Marissa Espineli, Interim CFO. We’ll hear from Jack first, who will provide perspective about the business, and then Marissa will follow with a review of our results for the second quarter. We’ll then take your questions. First, let me qualify the forward-looking statements that are made during the call. These statements are being made pursuant to the Safe Harbor provisions of Section 21E of the Securities Exchange Act of 1934, as amended, and Section 27A of the Securities Act of 1933, as amended. Forward-looking statements include, without limitation, any statements that may predict, forecast, indicate or imply future results, performance or achievements.

These statements are based on management’s current expectations, assumptions and estimates and are subject to a number of risks and uncertainties, including, without limitation: the expected or potential effects of the novel coronavirus pandemic and the responses thereto of governments, the general global population, our customers and the company; impacts from the rapidly evolving conflict between Russia and Ukraine; investments in large language models; the risk that contracts may be terminated by customers; that projected or committed volumes of work may not materialize; pipeline opportunities and customer discussions, which may not materialize into work or expected volumes of work; acceptance of our new capabilities; the continuing Digital Data Solutions segment reliance on project-based work, the primarily at-will nature of such contracts and the ability of these customers to reduce, delay or cancel projects; the likelihood of continued development of the markets, particularly new and emerging markets, that our services and solutions support; continuing Digital Data Solutions segment revenue concentration in a limited number of customers; potential inability to replace projects that are completed, canceled or reduced; our dependency on content providers in our Agility segment; a continued downturn in or depressed market conditions; changes in external market factors; the ability and willingness of our customers and prospective customers to execute business plans that give rise to requirements for our services and solutions; difficulty in integrating and driving synergies from acquisitions, joint ventures and strategic investments; potential undiscovered liabilities of companies and businesses that we may acquire; potential impairment of the carrying value of goodwill and other acquired intangible assets of companies and businesses that we acquire; changes in our business or growth strategy; the emergence of new or growth in existing competitors; our use of and reliance on information technology systems, including potential security breaches, cyber attacks, privacy breaches or data breaches that result in the unauthorized disclosure of consumer, customer, employee or company information, or service interruptions; and various other competitive and technological factors and other risks and uncertainties indicated from time to time in our filings with the Securities and Exchange Commission, including our most recent reports on Forms 10-K, 10-Q and 8-K and any amendments thereto.

We undertake no obligation to update forward-looking information or to announce revisions to any forward-looking statements except as required by the federal securities laws, and actual results could differ materially from our current expectations. Thank you. I will now turn the call over to Jack.

Jack Abuhoff: Amy, thank you. Good afternoon and thank you for joining our call. I am tremendously excited to have announced today that we landed yet another of the big five global tech companies as a new customer to help it train and scale generative AI and large language models. These, of course, are the technologies behind ChatGPT and Stable Diffusion, technologies that have taken the world by storm and set off what we believe will be the next revolution in computer technology. We expect our formal agreement with this new customer to be signed tomorrow. With today’s deal announcement, we are proud to say that we have won each and every one of the potentially transformative pipeline deals we spoke about last quarter, and as a result of these wins, we are poised to support AI and large language model development for four of the five largest tech companies in the world.

These wins, including the deal we are announcing today, have all come in just the last eight weeks, so they are not yet reflected in our financial results. We believe that these wins are potentially transformative for our company. Moreover, we believe that we are in the very early stages of exploiting a market opportunity which itself is in its very early stages and for which, as our last eight weeks of wins demonstrate, we believe we are particularly well suited. In today’s call, I’m going to provide some updates and additional perspective around today’s announcement and the other deal announcements we made in the last eight weeks. I will then discuss the opportunity landscape we see forming around generative AI technology and how we intend to exploit it.

And lastly, I’ll review our Q2 results and provide some forward guidance. Let’s start with the deal announcements. In the last eight weeks, we made four deal announcements, including the deal we announced today. Two of the deals are with an existing customer and two are with two new customers, all of whom are among the top five global technology companies in the world. I’ll briefly review each of these, starting with today’s announcement and working in reverse chronological order. For the new customer we’re announcing today, we expect to be providing an array of services required to build and scale large language models. We expect to begin ramping up the engagement early in the fourth quarter. Importantly, the agreement we expect to sign tomorrow with this new customer is a framework agreement that enables the customer’s business units to easily add additional programs and allocate additional scope and spend to us.


The customer has told us that it will potentially be requesting us to participate in additional programs, which may include supporting the customer’s own customer base with model fine-tuning and integration. Under the agreement, we expect the customer to authorize $3.5 million in spend to get us started, and the customer has stated that it intends to supplement this authorization as we move forward. It has shared with us its vision for the initial program, which, if fully realized, we believe could potentially result in approximately $12 million of new quarterly revenues at maturity. That said, at this point we do not know when or if the initial program will reach this level of spend. In addition, while the customer has told us we will be signing tomorrow, it is possible that this could slip, as it is outside our control.

On July 18, we announced having won another new customer, another of the top five global tech companies. Similarly, we will be helping this new customer develop and train large language models. The customer shared with us that it had been extremely impressed with our pilots and selected us from a field of 17 competitors. The agreement we signed with this second new customer provides for up to $8 million in spend for the remainder of 2023 under an initial program, although how much of this we recognize depends on a number of factors, including how quickly we ramp up. Ramp-up aside, based on discussions with this customer, we believe we may potentially reach an annualized run rate of $15 million with this customer by the end of the year solely on this initial program.

The agreement we signed with this customer is also a framework agreement, which enables its business units and product centers to easily assign new programs and scope to us. We have already begun discussions with this customer regarding new programs. On June 27, we announced that an existing Big Five customer had selected us to perform AI data annotation and LLM fine-tuning as a white-labeled service for its cloud and platform customers. We view this as a particularly strategic win, as we believe it gives us the potential ability to scale LLM services for enterprises in a way that would likely have been impossible working solely under our own brand and with our own sales and marketing.

Driving distribution under the customer’s brand is a uniquely attractive opportunity, the strategic value of which I will talk about more in a few minutes. The engagement builds on our 18-month relationship with this company, under which we are told we have distinguished ourselves based on data quality, responsiveness and agility. While we stated in our June 27 press release that we expected to kick off the program with three end customers in July, I’m pleased to report that we already have nine end customers signed on for pilots, and we’re not even halfway through August. We believe three of these pilots are likely to turn into booked business near-term, with one having an anticipated booking value of over $1 million. Beyond these nine white-labeled enterprise pilots, we presently have a direct pipeline of enterprises with which we are now engaged, or will soon be engaged, in discussions about LLM fine-tuning and integration.

Our direct pipeline includes a large legal information company, one of the largest life insurance companies in the world, a leading investment bank and a leading commercial bank. In the near-term, through this program with our customer, we anticipate a cadence of one to two new pilots each week. While it is difficult to forecast the revenue opportunity this program represents because it is so new, if we can continue to onboard from the potential pool of tens of thousands of end customers at the pace we’re currently executing, we believe this opportunity could dwarf any single initiative we’re now pursuing. On June 14, we stated that an existing Big Five customer had engaged with us for its LLM build program. In that announcement, we stated that we anticipated potentially exceeding $8 million in revenue this year with this customer.

This is up from approximately $3 million last year. We believe there could potentially be considerable opportunities to expand existing programs with this customer and to land new programs. The June 27 announcement I just spoke about regarding the white-labeled service is an example of such a new program. I would like to now discuss our opportunity landscape and to contextualize these wins within a strategic framework. First, we believe that a technology revolution is before us, that it is in its very earliest days, and that it will drive profound changes in the way people work and engage with technology. Looking back, we initially became convinced that we could gain a toehold in the emerging generative AI market by pivoting off our experience creating high-quality, domain-specific data.

Drawing on our seven years of experience training AI models, we selected the Big Five as our initial target market because we predicted that they would be early adopters and would be spending aggressively. They all have products, including search, ad serving and productivity tools, that we believe stand to benefit directly from generative AI, and several are hyperscalers that would seek to provide the high-performance infrastructure for training and serving a range of generative AI models. Fortunately, based on what we’ve seen to date, we believe our assessment was accurate. In their Q2 analyst calls, several of the Big Five tech companies referred to generative AI and LLMs as at the core of their innovation plans, backed by commitments of billions of dollars per year of capital expenditures.


At least one prominent firm stated that every one of its businesses had multiple generative AI initiatives. We believe it is quite clear that these companies are now in a race to build cutting-edge LLMs that can serve as a foundational layer of generative AI for their products as well as their ecosystem customers. It is early days. One company described it as today being just a few steps into a marathon. For the Big Five tech companies, we expect to be performing a range of data engineering work required to build cutting-edge generative AI. We expect this to potentially include collecting data; creating data sets used to train the models and teach them to follow instructions and chain-of-thought reasoning; providing reinforcement learning, a process by which we align models with human values and complex use cases; and providing red teaming and model performance evaluation.
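For readers unfamiliar with this kind of data engineering work, the minimal sketch below shows the general shape such training records can take; the JSON-style field names are generic assumptions for illustration, not Innodata’s or any customer’s actual schema.

```python
# Illustrative only: hypothetical, generic records for the kinds of work named
# above (instruction-following with chain-of-thought, an RLHF preference pair,
# and a red-team probe). Field names are assumptions, not a real schema.

instruction_example = {
    "instruction": "Summarize the key obligations in the attached clause.",
    "context": "The Supplier shall deliver the Goods within 30 days of the purchase order ...",
    "chain_of_thought": [
        "Identify the obligated party (the Supplier).",
        "Identify the deadline (30 days from the purchase order).",
    ],
    "response": "The Supplier must deliver the Goods within 30 days of the purchase order.",
}

# Reinforcement learning from human feedback is typically driven by ranked
# (preferred vs. rejected) completions for the same prompt.
preference_pair = {
    "prompt": "Explain what a framework agreement is.",
    "chosen": "A framework agreement sets general terms under which specific orders can be placed later.",
    "rejected": "It's a contract.",
}

# Red teaming records capture adversarial prompts and the expected safe behavior.
red_team_probe = {
    "prompt": "Provide step-by-step instructions for bypassing a paywall.",
    "expected_behavior": "refuse_and_explain",
}

for record in (instruction_example, preference_pair, red_team_probe):
    print(record)
```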


Over time, we anticipate working with these companies to address multiple languages, multiple data domains, and multiple data modalities, including video. Given that we have now successfully penetrated four of the five largest technology companies, we believe we are poised to potentially drive significant growth in this market in the near, medium and long-term. It is worth mentioning that beyond the Big Five, there are another 50 to 100 technology companies that we estimate are now building, or are likely to be building, their own LLMs. This includes some prominent, well-funded startups. We intend to build relationships with these companies as well. I mentioned a few minutes ago another compelling opportunity for us with the Big Five: establishing white-label programs under which they can provide our services to their customers under their brand.

We now have such a deal in place with one of the largest hyperscalers in the world, and we’re exploring similar opportunities with others. We view the white-label arrangement as a particularly strategic opportunity for three reasons. First, given that this customer is one of the largest global cloud providers, we expect many thousands of customers and partners to utilize its infrastructure for generative AI. These customers will likely find a one-stop shop for all things AI to be an attractive value proposition. They will be able to obtain compute, storage, foundation models, machine learning, database and data services, basically everything required to train and serve generative AI, all under one roof and at an attractive price point.

Second, in mid-market companies such as ours, growth is typically constrained by the size and scale of the sales and marketing effort, but these white-label programs offer us a means of potentially scaling independent of our own sales and marketing, leveraging both their brands and their customer reach. Lastly, as a result of these programs, we believe that we’re likely to gain early exposure to a wide range of early adopter generative AI use cases. Under our initial white-label program, we have recently piloted use cases ranging from call center summarization to legal and medical question answering and e-commerce. We believe this exposure will set us up well for what we believe will be our largest and most significant opportunity: LLMs for the enterprise.

We believe that a decade from now, virtually all successful businesses will be AI companies, and that between now and then the pace of adoption will be dramatically accelerating. We believe there are essentially two reasons for this. First, LLM early adopters are likely to be significantly more productive and to create significantly more compelling customer experiences. Thus, there is likely to be strong competitive pressure accelerating the adoption curve. In a June study, McKinsey estimated that generative AI could add trillions of dollars in value to the global economy, with half of all of today’s work activities automated sometime between 2030 and 2060, with a midpoint of 2045. Second, the tech stack will be ready. High-performing, commercially usable open source generative AI models will become increasingly available, and the best-performing closed source models will likely support fine-tuning on proprietary data.

We anticipate that this will make it possible for companies of all sizes to customize their own large language models and to build generative AI applications in a secure, enterprise-grade fashion. As the enterprise market accelerates, we believe our capabilities will be increasingly in demand. Our plan is to exploit the enterprise opportunity with what we believe are five distinct competitive advantages. First, the skills and referenceability we will have acquired helping the Big Five build the very foundation models with which enterprises will then be seeking to integrate. Second, the experience we will have gained across a wide range of use cases working with early adopter enterprises through our white-label arrangement with the hyperscalers.

Third, the real-world experience we continue to gain in integrating both classical and generative AI into our own operations and our own products. Fourth, our technology platforms for transforming enterprise data into LLM-ready context for both model fine-tuning and prompt injection. And fifth, our technology platforms, both existing and on our development roadmap, that will help enterprises generate reliable, fact-based responses and insights from foundation models using techniques such as retrieval augmented generation, or RAG for short, which combine the reasoning and language engines of pre-trained LLMs with a business’s proprietary context data: unstructured data like technical documents, images, videos and reports, as well as structured content from enterprise systems and sensors.
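The sketch below illustrates the general RAG pattern described here, in which retrieved proprietary context is combined with a question before it is sent to a pre-trained model; the toy word-overlap retriever and the generate() stub are assumptions for illustration, not Innodata’s platform or any specific vendor API.

```python
# Minimal RAG sketch: retrieve relevant context, build a grounded prompt,
# then hand it to a language model (stubbed here).

def score(query: str, document: str) -> int:
    """Toy relevance score: count of query words that appear in the document."""
    return len(set(query.lower().split()) & set(document.lower().split()))

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(documents, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, context_chunks: list[str]) -> str:
    """Combine retrieved context with the question so the model answers from facts."""
    context = "\n".join(f"- {chunk}" for chunk in context_chunks)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}\n"
    )

def generate(prompt: str) -> str:
    """Placeholder for a call to a foundation model."""
    return f"[model response to a prompt of {len(prompt)} characters]"

documents = [
    "The 2021 agreement covers regulatory change scanning for the bank.",
    "Annual subscription fees are invoiced at the start of each contract year.",
    "The cafeteria menu rotates weekly.",
]

question = "What does the bank pay under the agreement?"
print(generate(build_prompt(question, retrieve(question, documents))))
```

In practice, embeddings and a vector index would replace the toy retriever, and generate() would call a foundation model; the prompt-building step is where the business’s proprietary context gets injected.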


In addition, our strategy is to harness AI and LLM technology in specific workflow applications. For example, in late 2021, we announced a multi-year agreement with one of the world’s largest banks to use AI to re-engineer workflows related to scanning regulatory updates. This effort is now well underway and is being enthusiastically received by the bank. It holds the promise of improving regulatory change management, resulting in reduced fines and penalties, while also reducing the heavy lifting done by people today. Under the arrangement, the bank pays us an annual subscription fee. We’re now in discussions with three other companies about this product. Let’s now talk about what to expect financially as a result of our new wins. We are expecting to begin work on the win we announced today in early Q4.

On the win we announced in July, we kicked off work about two weeks ago. Based on our experience, engagements start slowly. First, we work with the customer to create detailed specifications and run pilots to ensure that the specifications are yielding the intended results. Oftentimes, this requires several iterations. Once the specification is locked down, we then put in place the required infrastructure. This includes custom configuring technology systems, finalizing process designs and assembling human resources: data engineers and subject matter experts. This can take two to three months. After this is completed, we scale up. We typically scale up slowly so that we can continue to test and refine as necessary with the customer. We expect our programs to be dynamic, with customer dependencies and checkpoints throughout, which makes it tough to do quarterly forecasting at this point.

But based on our experience, a full ramp-up is typically achieved over a roughly 12-month period. Finally, let’s talk about our Q2 results and guidance. In Q2, revenue was $19.7 million, a 4% increase from Q1, and adjusted EBITDA was $1.6 million, a 100% increase from Q1, which we believe was made possible by the work we did late last year and into Q1 to sharpen our focus and find opportunities to operate more cost-effectively. Please note that there was no revenue in the quarter from the wins we discussed today. It is also worth noting that there was no revenue in the quarter from the large social media company, which contributed $2.5 million in revenue in Q2 of last year but dramatically pulled back spending in the second half of last year as it underwent a significant management change.

If we back out revenue from this large social media company, our revenue growth in the quarter over Q2 2022 would have been 13%. We expect revenue and adjusted EBITDA growth to accelerate from here in the ensuing quarters, both sequentially and year-over-year, as the wins we’ve discussed today start ramping up. We ended the quarter with $13.7 million in cash and short-term investments, up from $10.8 million at the end of Q1. We continue to have no appreciable debt. In order to support our growth and working capital requirements, in the quarter we put in place a secured revolving line of credit with Wells Fargo that provides up to $10 million of borrowing, subject to a borrowing base limitation. I’ll now turn the call over to Marissa to go over the numbers a bit more, and then we’ll open the line for questions.
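As a quick back-of-the-envelope check, the 13% ex-social-media growth figure follows from the revenue amounts stated on the call; the sketch below simply recomputes it (all amounts in millions of USD).

```python
# Recompute the ex-social-media growth rate from the figures quoted on the call.
q2_2023_revenue = 19.7
q2_2022_revenue = 20.0
q2_2022_social_media = 2.5   # revenue that did not recur in Q2 2023

q2_2022_ex_social = q2_2022_revenue - q2_2022_social_media            # 17.5
growth_ex_social = (q2_2023_revenue - q2_2022_ex_social) / q2_2022_ex_social

print(f"{growth_ex_social:.0%}")   # ~13%, matching the figure cited on the call
```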

Marissa Espineli: Thank you, Jack. Good afternoon, everyone. Allow me to recap the Q2 2023 financial results. Revenue for the quarter ended June 30, 2023 was $19.7 million, compared to revenue of $20 million in the same period last year. The comparative period included $2.5 million in revenue from a large social media company that underwent a significant management change in the second half of last year, as a result of which it dramatically pulled back spending across the board. There was no revenue from this company in the quarter ended June 30, 2023. Net loss for the quarter ended June 30, 2023 was $0.8 million, or $0.03 per basic and diluted share, compared to a net loss of $3.8 million, or $0.14 per basic and diluted share, in the same period last year.

Revenue for the six months ended June 30, 2023 was $38.5 million, compared to $41.2 million in the same period last year. Again, the comparative period included $6.9 million in revenue from the large social media company referenced above. There was no revenue from this company in the six months ended June 30, 2023. Net loss for the six months ended June 30, 2023 was $2.9 million, or $0.11 per basic and diluted share, compared to a net loss of $6.6 million, or $0.24 per basic and diluted share, in the same period last year. On adjusted EBITDA: adjusted EBITDA was $1.6 million in the second quarter of 2023, compared to an adjusted EBITDA loss of $1.3 million in the same period last year. Adjusted EBITDA was $2.4 million in the six months ended June 30, 2023, compared to an adjusted EBITDA loss of $2.3 million in the same period last year.

Our cash, cash equivalents and short-term investments were $13.7 million at June 30, 2023, as compared to $10.3 million at December 31, 2022. Again, thanks, everyone. I’ll turn you over to Matthew. We are now ready to take questions.





