
Written by Bhumil Soni

Helly: “Hey, machine learning is changing everything!”

Reeaen: “Really? So my coffee machine will finally understand ‘extra foam’?”

Helly: “Uh, not exactly, but your phone might!”

For the longest time, I was Reeaen, puzzled by all the machine-learning (ML) buzz. Recently, I’ve managed to slip into Helly’s shoes, excited about the real possibilities that machine learning offers, especially in the realm of mobile apps.

So, who is this for? If you’re a developer, a business stakeholder, or simply someone intrigued by the possibilities of ML in mobile apps, then welcome! I aim to provide practical insights without diving too deep into jargon or technicalities.

When it comes to bringing ML into your app, you generally have two routes:

On Server

Rely on a server, which means your users will have a loading indicator as their coffee companion while they wait for a response to, say, an image-recognition or chatbot query.

On Device

Make your app self-reliant and smart enough to handle tasks on the device. This could translate to quicker face recognition or language translation right on the user’s device. (They might still be sipping their coffee, but at least they would get their results before they take the next sip.)

We’ll be focusing on the second route, discussing why it exists and how to go about it. Think of it as a user-friendly guide, less ‘rocket science’ and more ‘intelligent toaster manual.’

Before we dive in, I have added a quick navigation below. If there’s a specific section you’re interested in, feel free to jump right to it.

  • The What: A simple introduction to machine learning.
  • The Why: Reasons why on-device machine learning is beneficial.
  • The How: An overview of how to leverage machine learning in an Android app, explaining ML Kit and TensorFlow Lite.
  • The Choice: A comparison between ML Kit and TensorFlow Lite to help you make an informed decision.
  • Where To Go From Here?: Next steps to take in your Machine Learning journey.

The What

Ok, all is good, but what on earth is machine learning? In layman’s terms, it is about teaching computers to make decisions based on data rather than explicit instructions. Unlike traditional programming, where we write specific rules for the computer to follow, ML allows the computer to discern patterns, learn from them, and then make decisions. And there are various ways this can be done. While we won’t get into its nuts and bolts here, if you’re curious to learn more, I highly recommend this blog post which covers the topic nicely. But for now, let’s continue with how you can turn your mobile app into a genius!

In the context of mobile applications, leveraging ML means enhanced user experiences, predictive functionalities, and more efficient processes, effectively making smarter apps. So, in essence, we’re talking about turning your mobile app from a static tool into a dynamic assistant. Isn’t that something? So, let’s dive into why and how to turn your Android app into an ‘intelligent toaster’. But hold on, before that, a quick note on the brain of ML: the “model”.

What is a Model?

Think of a model like a recipe in a cookbook. The recipe takes in ingredients (your data) and tells you how to combine them to get a final dish (a decision). Before a recipe is shared, it’s tweaked and tested; models go through a similar “training” and “testing” phase. Just like you can have different recipes for different dishes, you can have different models for different kinds of decisions.
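
To make the recipe analogy concrete, here is a deliberately tiny “model” in Kotlin. It learns a single number (a threshold) from labelled examples (training), then uses it to classify new values (inference). This is a toy illustration of the train-and-predict idea, not a real ML algorithm.

```kotlin
// A toy "model": it learns one parameter (a threshold) from example data,
// then uses it to make decisions on values it has never seen.
class ThresholdModel {
    var threshold = 0.0
        private set

    // "Training": place the threshold midway between the two classes' averages.
    fun train(classA: List<Double>, classB: List<Double>) {
        threshold = (classA.average() + classB.average()) / 2
    }

    // "Inference": decide which class a new value belongs to.
    fun predict(value: Double): String =
        if (value < threshold) "A" else "B"
}

fun main() {
    val model = ThresholdModel()
    // Training data: class A clusters around 2, class B around 10.
    model.train(classA = listOf(1.0, 2.0, 3.0), classB = listOf(9.0, 10.0, 11.0))

    // "Testing" on unseen values.
    println(model.predict(2.5))  // prints "A"
    println(model.predict(8.0))  // prints "B"
}
```

Real models learn far more than one number, but the workflow is the same: fit the model on known examples, then evaluate it on data it has never seen.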

The Why

On-device machine learning gives your app the capability to process images, sound, and text, right on the user’s device. So, why would you want to do that instead of letting a server handle it? Here are some compelling reasons:

  • Real-time processing: With ML models residing on the device, data can be processed in real time without the need for internet connectivity. This ensures that your app’s intelligent features work smoothly even in areas with low or no connectivity!
  • Reduced Latency: Server-based tasks usually involve some delay. On-device processing can drastically cut down these wait times, giving users an almost instantaneous response. As the saying goes, “Time is money”, or more in this case, user experience.
  • Data Privacy: Processing data on the device eliminates the need to send user data back and forth, offering an added layer of privacy and security.
  • Cost-effective: Fewer server interactions mean lower server costs, which is especially beneficial when your app requires heavy data processing.

If these are enough reasons for you to consider the on-device route for your awesome Android app, stay tuned as we’ll explore the different ways ML can be integrated into your app in the next section.

The How

There are two primary ways ML can be integrated into your Android app, each with its own set of strengths, tailored for specific needs:

  1. The ready-to-use ML Kit
  2. Do-it-yourself (kind of) with TensorFlow Lite.

ML Kit: Ready-to-use ML

ML Kit is a mobile SDK provided by Google, backed by their own machine learning models. It’s a free-to-use, plug-and-play solution that’s designed to make the integration of machine learning into your app as straightforward as possible. It comes pre-packaged and is easy to integrate, making it an excellent choice for quickly adding intelligent features to your app without getting tangled in the complexities of ML. Yes, that’s right, you don’t need to know the first thing about ML to get started!

What does ML Kit offer?

ML Kit offers a wide range of capabilities, categorised into two main groups: Vision and Natural Language APIs. We will briefly go through each API available at the time of writing to understand its use cases.

  • Vision APIs: These APIs provide a broad spectrum of functionalities related to image and video recognition.
    • Face Detection API: Detects faces in images or videos and identifies facial features like eyes, ears, and mouth. Ideal use cases: enhancing user experience in apps requiring face-based interactions, such as generating an avatar from a photo.
    • Face Mesh Detection API (Beta): Generates a 3D mesh to map faces in selfies or video calls, optimised for close-up shots within 2 meters. Ideal use cases: detailed AR effects and real-time facial geometry apps.
    • Text Recognition v2 API: Recognises text in various scripts. Ideal use cases: automating tasks like scanning business cards or documents.
    • Image Labeling API: Detects and labels over 400 categories in images. Ideal use cases: photo management apps.
    • Object Detection and Tracking API: Identifies and tracks objects in images or live streams. Ideal use cases: real-time tracking applications like automated surveillance systems for security.
    • Digital Ink Recognition API: Converts handwritten text and sketches to digital format, with support for over 300 languages. Ideal use cases: apps requiring unique character input or creative drawing features.
    • Pose Detection API (Beta): Detects human body posture in real time. Ideal use cases: fitness apps, motion analysis, and augmented reality experiences.
    • Selfie Segmentation API (Beta): Separates subjects from the background in selfies. Ideal use cases: adding effects or unique backgrounds.
  • Natural Language APIs: These handle a variety of text-based functionalities.
    • Language Identification API: Identifies the language of a given text string from over a hundred languages. Ideal use cases: platforms hosting user-generated content.
    • Translation API: Provides on-device text translation among over 50 languages, powered by the same models used by the Google Translate app’s offline mode. Ideal use cases: chat applications for quick, casual translations; enhancing accessibility in customer service settings for non-English speakers.
    • Smart Reply API: Suggests contextually relevant responses to messages in English. Ideal use cases: quick communication on platforms with limited input capabilities.
    • Entity Extraction API (Beta): Identifies specific types of data in text. Ideal use cases: apps requiring contextual actions based on content.
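
To give a feel for how little code a Vision API needs, here is a minimal sketch of the Image Labeling API in Kotlin. It assumes you have added the `com.google.mlkit:image-labeling` Gradle dependency and already have a `Bitmap` from the camera or gallery; error handling is reduced to a stub.

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Sketch: label the contents of a Bitmap with ML Kit's on-device Image Labeling API.
fun labelImage(bitmap: Bitmap) {
    // Wrap the bitmap; rotationDegrees is 0 for an upright image.
    val image = InputImage.fromBitmap(bitmap, 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)

    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Each label carries a human-readable text and a confidence score.
            for (label in labels) {
                println("${label.text}: ${label.confidence}")
            }
        }
        .addOnFailureListener { e ->
            // Handle the error, e.g. log it or show a message to the user.
        }
}
```

Note the asynchronous, listener-based style: ML Kit returns a Task, so your UI thread is never blocked while the model runs.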

For those thinking about leveraging these APIs on iOS, you’re in luck: all of them except the Face Mesh Detection API are available for iOS as well. Another great thing ML Kit offers is dynamic model downloading via Google Play services to optimise app installation size for selected APIs. For those interested in a deeper dive into all of these APIs, you can find more detailed information in the Google ML Kit Developer’s Guide. Google has also provided a collection of samples demonstrating the use of each ML Kit API, for both Android and iOS.
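
The Natural Language APIs are just as compact. Below is a sketch of the Language Identification API, assuming the `com.google.mlkit:language-id` Gradle dependency; ML Kit returns the code “und” when the language cannot be determined.

```kotlin
import com.google.mlkit.nl.languageid.LanguageIdentification

// Sketch: identify the language of a user-supplied string with ML Kit.
fun identifyLanguage(text: String) {
    val languageIdentifier = LanguageIdentification.getClient()
    languageIdentifier.identifyLanguage(text)
        .addOnSuccessListener { languageCode ->
            if (languageCode == "und") {
                println("Language could not be identified")
            } else {
                println("Language: $languageCode")  // BCP-47 code, e.g. "en" or "hi"
            }
        }
        .addOnFailureListener { e ->
            // The model couldn't be loaded or another error occurred.
        }
}
```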

TensorFlow Lite: Custom ML

Let’s start with the big picture: TensorFlow is an open-source toolkit developed by Google that offers everything you need for a wide range of machine learning applications—from building and training to deploying complex models. But what if you need a more streamlined, portable version? That’s where TensorFlow Lite comes in.

TensorFlow Lite is designed to bring machine learning to mobile devices efficiently. It has the capability to do everything that the ML Kit APIs can do—and things they can’t! For instance, you could use it to identify specific dog breeds in images, a custom use case that could be useful in a pet adoption app. TensorFlow Lite even supports on-device training of models, a pretty neat feature for more advanced uses. Think of it as a handy, portable toolkit that is optimised for speed and low resource usage, with support for hardware acceleration for even faster performance.

Using TensorFlow Lite typically involves two steps:

  1. Generate a TensorFlow Lite Model: You can either use an existing TensorFlow Lite model, modify one to suit your needs, or build a TensorFlow model from scratch and convert it to TensorFlow Lite format. This ties back to our ‘recipe’ analogy—you can either use a standard recipe, modify it, or create your own.
  2. Run Inference: This is where your model performs the tasks it was trained to do, such as classifying images or recognising text.

Alright, let’s get a bit more technical—but not too much, promise! TensorFlow Lite offers two types of APIs:

  • The TensorFlow Lite Task API is beginner-friendly and designed for simple tasks such as object identification in images. Similar to the ML Kit APIs, it offers Vision, Natural Language, and Audio APIs, as well as support for building custom APIs. You can check them out here.
  • For those wanting more control, the TensorFlow Lite Interpreter API offers greater flexibility, allowing you to fine-tune how your models operate.
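
For a taste of the Interpreter API, here is a hedged Kotlin sketch. The input shape ([1, 224, 224, 3]) and output shape ([1, 1000]) are assumptions for a hypothetical image classifier; your own model’s tensors will differ, and in a real app you would reuse one Interpreter rather than create one per call.

```kotlin
import org.tensorflow.lite.Interpreter
import java.nio.MappedByteBuffer

// Sketch: run inference with the TensorFlow Lite Interpreter API.
// Assumes the org.tensorflow:tensorflow-lite dependency and a model file
// already memory-mapped from assets into `modelBuffer`.
fun classify(
    modelBuffer: MappedByteBuffer,
    input: Array<Array<Array<FloatArray>>>  // hypothetical [1, 224, 224, 3] input
): FloatArray {
    val interpreter = Interpreter(modelBuffer)
    val output = Array(1) { FloatArray(1000) }  // hypothetical [1, 1000] output
    interpreter.run(input, output)  // one input tensor in, one output tensor out
    interpreter.close()
    return output[0]  // e.g. per-class scores
}
```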

And here comes the cherry on top! TensorFlow Lite doesn’t just stop at providing pre-built models; it also offers seamless integration with both Google Play services and Firebase for added efficiency and flexibility. TensorFlow Lite in Google Play services is available through both the Task and Interpreter APIs, which helps keep your app light and fast. Coupling it with Firebase allows you to update your models in real time and even test different versions. It’s an ideal setup for any mobile platform, whether Android or iOS, making it particularly useful for cross-platform development. Check out TensorFlow Lite’s Guide to learn more. For sample apps and pre-trained models, check out the TensorFlow Lite Example Apps.

Alright, now that you’ve got the lowdown on ML Kit and TensorFlow Lite, let’s tackle the Grand Final question—ML Kit or TensorFlow Lite? Which one’s going to be the star player in your app’s lineup?

The Choice

So, you’re down to making a choice between ML Kit and TensorFlow Lite. Let’s break down the pros and cons so you can make an informed decision.

Simplicity vs. Flexibility

  • ML Kit: Think of this like a meal kit delivery service. Everything is ready to go, just some quick assembly required. If you want to add machine learning to your app without a deep dive into the subject, or with no prior knowledge of ML at all, ML Kit is your go-to.
  • TensorFlow Lite: This is your kitchen, fully stocked, but you have to cook. You can start with basic recipes (pre-built models), adapt existing ones, or even create your own from scratch. It’s for those who want more control over their “menu”.

Time and Resources

  • ML Kit: Since most of the work is already done for you, your time-to-market could be very fast. Plus, you don’t need a team of ML experts.
  • TensorFlow Lite: It’s more time-intensive. You’ll likely need a more specialised team to design, train, and implement your models. Once the model is prepared, though, integrating it into the app is relatively easy, as there is comprehensive documentation and a set of examples available for reference.

App Size and Performance

  • ML Kit: Dynamic model downloading via Google Play services helps reduce the initial app size.
  • TensorFlow Lite: Supports dynamic API downloading through Google Play services for a smaller app size. It also benefits from hardware acceleration for increased speed, and models can be hosted on Firebase for real-time updates without affecting the initial app size.

Scalability

  • ML Kit: A bit like sticking to the restaurant’s menu. If they add a new dish, it’s easy to try it. But you’re limited to their options and tastes.
  • TensorFlow Lite: More like having your own garden: you grow what you eat, giving you full control to adapt as your business evolves.

Data Privacy

Both approaches process data on-device, which is excellent for user privacy.

Cross-Platform Support

Both ML Kit and TensorFlow Lite offer iOS support, making either choice suitable if you’re looking at a cross-platform application. TensorFlow Lite goes the extra mile by allowing your models to be used on the web too.

Costs

  • ML Kit: Totally free as of writing this article!
  • TensorFlow Lite: The primary costs involve human resources: the time needed for model training and the computational power you’ll need for that.

The Final Word

For Business Stakeholders: If rapid development and ease of implementation are your primary concerns, ML Kit is your go-to. But if you’re planning a long-term, highly specialised product, investing in TensorFlow Lite could give you a competitive edge.

For Developers: If you’re just getting your feet wet in the world of ML, ML Kit is a great start. Once you’re comfortable and find limitations in the existing models, you can take the plunge into TensorFlow Lite.

In essence, your choice ultimately boils down to what you need, how much you’re willing to invest, and how adventurous you’re feeling. Bon appétit!

Where To Go From Here?

So there you have it—a snapshot of where we currently stand with on-device machine learning, not just on Android but also somewhat on iOS, featuring both TensorFlow Lite and ML Kit as our stars.

Now, if you’re pondering the same questions that I had when I first dived into this arena, here are some steps you can follow:

  1. What Can Your App Do Better?: This is the most crucial question, and I can’t emphasise it enough. While it’s tempting to add cutting-edge features to your app, first consider whether they’re genuinely needed and whether ML is the right tool for the job. In other words, are you thinking of using a hammer for a job that a stick can do? So, before diving into frameworks or code, take a moment to think about how ML can add value to your app or project.
  2. Explore ML Kit: Now that you have some concrete ideas, look through the ML Kit APIs, check out sample apps, and perhaps create a small practice project to test the waters. If ML Kit offers the features you need, your path is clear: proceed with implementation.
  3. Understand the Basics: If you’re comfortable with ML Kit but find it lacking for your specific needs, it’s time to delve into the basics of machine learning. Understand what a model is, and get a grasp on training and testing. For those who feel the need for professional guidance in this complex field, speak to us at Mantel Group for valuable insights.
  4. Deep Dive into TensorFlow Lite: If you’re contemplating more customised solutions, explore the TensorFlow Lite documentation, sift through some example apps, and once you’re comfortable, plan your next steps accordingly.
  5. Go Hybrid: If you need a little bit of this and a little bit of that, don’t let decision fatigue stop you! Go hybrid. It’s your menu, you select the ingredients!

And that’s a wrap! Here’s to your next machine learning adventure—stay curious and keep exploring!