Apple and IBM may seem like an odd couple, but the two companies have been working closely together for several years now. That has involved IBM sharing its enterprise expertise with Apple and Apple sharing its design sense with IBM. The companies have built hundreds of enterprise apps running on iOS devices. Today, they took that friendship a step further, announcing a way to combine IBM Watson machine learning with Apple Core ML to make the business apps running on Apple devices all the more intelligent.

The way it works is that a customer builds a machine learning model using Watson, taking advantage of data in an enterprise repository to train the model. For instance, a company may want to help field service techs point their iPhone camera at a machine and identify the make and model so they can order the correct parts. The company could train a model to recognize all the different machines using Watson’s image recognition capability.
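As a rough illustration (not IBM’s actual tooling), training such a classifier against the Watson Visual Recognition v3 REST API of the time might look like the Swift sketch below. The API key, the class names (“pump_a100”, “pump_b200”) and the zip archives of labeled photos are all placeholders.

```swift
import Foundation

// A minimal sketch of the training step, assuming the Visual Recognition v3
// REST API as documented at the time (pre-IAM "api_key" auth). The key, the
// class names and the zip archives of labeled photos are all placeholders.
let apiKey = "YOUR_WATSON_API_KEY"
let trainURL = URL(string: "https://gateway.watsonplatform.net/visual-recognition/api" +
                   "/v3/classifiers?version=2018-03-19&api_key=\(apiKey)")!
var request = URLRequest(url: trainURL)
request.httpMethod = "POST"

let boundary = "Boundary-\(UUID().uuidString)"
request.setValue("multipart/form-data; boundary=\(boundary)", forHTTPHeaderField: "Content-Type")

// Each "<class>_positive_examples" part is a zip of sample photos of one machine model.
func formPart(name: String, filename: String?, data: Data) -> Data {
    var disposition = "Content-Disposition: form-data; name=\"\(name)\""
    if let filename = filename { disposition += "; filename=\"\(filename)\"" }
    var part = "--\(boundary)\r\n\(disposition)\r\n\r\n".data(using: .utf8)!
    part.append(data)
    part.append("\r\n".data(using: .utf8)!)
    return part
}

var body = Data()
body.append(formPart(name: "name", filename: nil, data: "machine-models".data(using: .utf8)!))
body.append(formPart(name: "pump_a100_positive_examples", filename: "pump_a100.zip",
                     data: try! Data(contentsOf: URL(fileURLWithPath: "pump_a100.zip"))))
body.append(formPart(name: "pump_b200_positive_examples", filename: "pump_b200.zip",
                     data: try! Data(contentsOf: URL(fileURLWithPath: "pump_b200.zip"))))
body.append("--\(boundary)--\r\n".data(using: .utf8)!)
request.httpBody = body

// Watson replies with a classifier_id; training then runs asynchronously in the cloud.
URLSession.shared.dataTask(with: request) { data, _, _ in
    if let data = data { print(String(data: data, encoding: .utf8) ?? "") }
}.resume()
```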

The next step is to convert that model into Core ML and include it in your custom app. Apple introduced Core ML at the Worldwide Developers Conference last June as a way to make it easy for developers to move machine learning models from popular model building tools like TensorFlow, Caffe or IBM Watson to apps running on iOS devices.
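The converters themselves run off-device (Apple publishes them as the coremltools Python package) and produce a .mlmodel file. As a minimal sketch of the app side, Core ML can also compile and load such a file at runtime; the file name here is hypothetical.

```swift
import CoreML

// A converted model ships as a .mlmodel file. Xcode normally compiles it into
// the app bundle, but Core ML can also compile one obtained at runtime.
// "MachineClassifier.mlmodel" is a hypothetical file name.
let rawModelURL = URL(fileURLWithPath: "MachineClassifier.mlmodel")
let compiledURL = try MLModel.compileModel(at: rawModelURL)  // yields a compiled .mlmodelc
let model = try MLModel(contentsOf: compiledURL)
print("Loaded model: \(model.modelDescription)")
```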

After creating the model, you run it through the Core ML converter tools and insert it in your app. The new agreement makes that step easier when IBM Watson handles the model building part of the equation, allowing the two partners to make the apps created under the partnership even smarter with machine learning.
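To make the field service example concrete, here is a minimal sketch of that on-device step using Apple’s Vision framework, which wraps a Core ML model for image classification. MachineClassifier stands in for whatever class Xcode generates from the converted model.

```swift
import Vision
import CoreGraphics

// A minimal sketch of on-device classification with Vision + Core ML.
// "MachineClassifier" is the hypothetical class Xcode generates from the
// converted .mlmodel added to the app target.
func identifyMachine(in photo: CGImage) throws {
    let visionModel = try VNCoreMLModel(for: MachineClassifier().model)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation],
              let best = observations.first else { return }
        // e.g. "pump_a100 (0.97)" -- enough to look up the right parts list
        print("\(best.identifier) (\(best.confidence))")
    }
    try VNImageRequestHandler(cgImage: photo, options: [:]).perform([request])
}
```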

“Apple developers need a way to quickly and easily build these apps and leverage the cloud where it’s delivered. [The partnership] lets developers take advantage of the Core ML integration,” Mahmoud Naghshineh, general manager for IBM Partnerships and Alliances, explained.

To make it even easier, IBM also announced a cloud console to simplify the connection between building the model in Watson and inserting it in the application running on the Apple device.

Over time, the app can share data back with Watson and improve the machine learning algorithm running on the edge device in a classic device-cloud partnership. “That’s the beauty of this combination. As you run the application, it’s real time and you don’t need to be connected to Watson, but as you classify different parts [on the device], that data gets collected and when you’re connected to Watson on a lower [bandwidth] interaction basis, you can feed it back to train your machine learning model and make it even better,” Naghshineh said.
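The write-back half of that loop might look something like the following sketch, which posts newly confirmed photos to the Visual Recognition v3 retraining endpoint (POST /v3/classifiers/{classifier_id}); the classifier ID, class name and zip data are placeholders.

```swift
import Foundation

// A hedged sketch of the device-to-cloud feedback step: the app classifies
// offline, queues photos the tech confirms, and posts them back to Watson to
// retrain the classifier once a connection is available. Endpoint and field
// names follow the Visual Recognition v3 docs; everything else is a placeholder.
func sendFeedback(classifierID: String, className: String, examplesZip: Data) {
    let apiKey = "YOUR_WATSON_API_KEY"
    let url = URL(string: "https://gateway.watsonplatform.net/visual-recognition/api" +
                  "/v3/classifiers/\(classifierID)?version=2018-03-19&api_key=\(apiKey)")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"

    let boundary = "Boundary-\(UUID().uuidString)"
    request.setValue("multipart/form-data; boundary=\(boundary)", forHTTPHeaderField: "Content-Type")

    // One "<class>_positive_examples" part holding a zip of the confirmed photos.
    var body = Data()
    body.append(("--\(boundary)\r\n" +
                 "Content-Disposition: form-data; " +
                 "name=\"\(className)_positive_examples\"; " +
                 "filename=\"\(className).zip\"\r\n\r\n").data(using: .utf8)!)
    body.append(examplesZip)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body

    URLSession.shared.dataTask(with: request) { _, response, _ in
        // A 200 response means Watson accepted the examples and is retraining.
        print("Retraining response: \(String(describing: response))")
    }.resume()
}
```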

The point of the partnership has always been to use data and analytics to build new business processes by taking existing approaches and reengineering them for a touch screen.

This adds a layer of machine learning to that original goal, moving it forward to take advantage of the latest technology. “We are taking this to the next level through machine learning. We are very much on that path and bringing improved accelerated capabilities and providing better insight to [give users] a much greater experience,” Naghshineh said.
