During the past few weeks, tech companies have dominated the news, from a massive slide in share prices to Elon Musk’s ‘will he, won’t he’ Twitter dance. Setting aside all the drama, I want to shine a spotlight on one piece of news that really caught my eye. We have all heard of Unity Software, aka the developer of Unity – the real-time development platform used in many industries, and probably the best-known and fastest-growing game development platform of the last few years.
Recently, Unity Software saw its stock plummet nearly 40% following its Q1 earnings announcement, resulting in a $5 billion market cap free fall, while also reporting losses of nearly $110 million for 2022. This could easily have been overlooked and attributed to the overall market’s downward trend, but digging a little deeper into the earnings call reveals two primary reasons behind the sudden drop in share price – and they are quite interesting.
Two AI Faults That Cost Unity $5 Billion
First, there was a major fault in their Audience Pinpointer tool. This tool generates a real-time valuation of each user during a specific ad request, enabling game developers to monetize ads by targeting the right audiences for their games. According to the earnings transcript, the fault reduced the accuracy of the ML model’s predictions, which in turn drove down usage of the tool.
This is what John Riccitiello, Unity’s President, Chief Executive Officer, and Executive Chairman, called a “self-inflicted wound” in an interview on “Mad Money”. The second issue that plagued Unity Software was the loss of value of a portion of their training data, due mostly to ingesting bad data from a large customer. These two faults were the key drivers behind Unity Software’s massive $5 billion market cap loss. This unfortunate event further highlights the impact that ML models have on society and, most notably, on businesses across industries.
The Reality of Working with ML
Despite the obvious challenges, we are living in exciting times, as data scientists and ML engineers innovate and alter the way we live and work. These game-changing models are pushing humanity into a more advanced, fast-paced world, where model behaviour quickly impacts an organization’s business goals for better and for worse.
Even though Unity Software’s ML failure may not have been a cataclysmic event, it still caused a major setback for the company, as illustrated by the financial loss and the resources now needed to fix the issue. According to Morgan Stanley analyst Matthew Cost: “While the core issues are now resolved, it will take time to retrain the machine learning algorithms and win back ad spend that migrated away early this year.”
The Need for ML Monitoring
This case adds to a growing list of AI failures, another example being last year’s Zillow debacle. In that scenario, the company applied ML models to evaluate houses, and then flip them, until COVID-19 changed the real-estate landscape. Cases like Unity and Zillow raise, again and again, the need for continuous monitoring of the data fed into ML models and the predictions those models generate.
Prior to the widespread adoption of ML models as part of the core business, the analytics and observability teams were able to alert on any changes in product usage that may impact business KPIs. The insertion of ML models adds a new and more complex layer, elevating the need for a continuous model monitoring mechanism to achieve the same level of observability as before.
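To make that monitoring mechanism concrete, here is a minimal sketch of one widely used drift metric, the Population Stability Index (PSI), applied to a model’s prediction scores. Everything below (the function, thresholds, and synthetic data) is illustrative only – it is not Unity’s or any vendor’s actual implementation.

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline ('expected') and a
    live ('actual') score distribution. Common rule of thumb:
    < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    # Quantile-based interior cut points taken from the baseline,
    # so each bin holds roughly equal baseline mass.
    edges = np.quantile(expected, np.linspace(0, 1, bins + 1))[1:-1]
    e_pct = np.bincount(np.digitize(expected, edges), minlength=bins) / len(expected)
    a_pct = np.bincount(np.digitize(actual, edges), minlength=bins) / len(actual)
    # Guard against log(0) / division by zero for empty bins.
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.5, 0.1, 5000)   # scores at validation time
stable = rng.normal(0.5, 0.1, 5000)     # healthy live traffic
shifted = rng.normal(0.3, 0.1, 5000)    # live traffic after a fault
```

Run periodically against live traffic, a check like `psi(baseline, live) > 0.25` gives the observability team an alert signal analogous to the product-usage alerts they had before ML entered the stack. Quantile-based bins keep the index from being dominated by sparse tails of the baseline distribution.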
Beyond the obvious need to monitor for both data and prediction drift, another lesson we can learn from this event is the need to segment your data as part of monitoring its behaviour. If one segment of your data drifts while the rest of the dataset remains stable, this may indicate a flaw in your data pipeline rather than a genuine shift in the real world.
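As a hedged illustration of that segmentation idea (the segment labels, threshold, and data here are invented, not taken from the Unity incident), one simple approach is to run a two-sample Kolmogorov–Smirnov test per segment against a reference distribution, so that a single customer’s corrupted feed stands out even when the aggregate distribution still looks plausible:

```python
import numpy as np
from scipy.stats import ks_2samp

def segment_drift_report(reference, live_values, live_segments, alpha=0.001):
    """KS-test each segment of the live data against the reference
    distribution; return {segment: p_value} for segments that drifted."""
    report = {}
    for seg in np.unique(live_segments):
        _, p = ks_2samp(reference, live_values[live_segments == seg])
        if p < alpha:
            report[seg] = p
    return report

rng = np.random.default_rng(42)
reference = rng.normal(0.0, 1.0, 10000)   # feature at training time
live = np.concatenate([
    rng.normal(0.0, 1.0, 3000),           # customer A: healthy
    rng.normal(0.0, 1.0, 3000),           # customer B: healthy
    rng.normal(5.0, 1.0, 3000),           # customer C: bad ingest
])
segments = np.array(["A"] * 3000 + ["B"] * 3000 + ["C"] * 3000)
report = segment_drift_report(reference, live, segments)
```

Checking per segment is what catches the Unity-style failure mode: one large customer’s bad data can be diluted below detection thresholds when you only test the dataset as a whole.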
Data scientists and ML engineers know their craft from top to bottom, but in addition to building and training ML models, the ever-changing nature of reality requires them to constantly detect issues, alert stakeholders, and explain changes in their models’ behaviour. This is why they need comprehensive ML observability tools that enable them to monitor and explain changes in performance as quickly as possible, before those changes impact the business and its customers.
Written by Nimrod Carmel, Senior Software Engineer, Aporia
Feature image credit: Matan Kidrony