Apple's Latest Machine Learning Journal Article: Bridging the Domain Gap for Neural Models

(Cover image: Apple Machine Learning Journal article)

 

Earlier this morning Patently Apple posted a report titled "The MLPerf Consortium, with Members like ARM & Google, has introduced the Tech Industry's First Standard ML Benchmark Suite." Our report noted that while Apple isn't currently a member of this Machine Learning consortium, more and more companies continue to join it. It wouldn't be surprising to see Apple join the group that includes Silicon Valley elites such as Google, Intel, Cray, NVIDIA, ARM, AMD, Microsoft, Facebook and many others.

 

Our report also listed the ongoing work Apple is doing in the area of Machine Learning, including hardware such as the A12 Bionic chip with Apple's Neural Engine. We also noted that Apple contributes to the community of Machine Learning developers with its Machine Learning Journal, which added a new article in June titled "Bridging the Domain Gap for Neural Models."

 

Apple's latest entry begins by stating that "Deep neural networks are a milestone technique in the advancement of modern machine perception systems. However, in spite of the exceptional learning capacity and improved generalizability, these neural models still suffer from poor transferability. This is the challenge of domain shift—a shift in the relationship between data collected across different domains (e.g., computer generated vs. captured by real cameras). Models trained on data collected in one domain generally have poor accuracy on other domains.

 

"In this article, we discuss a new domain adaptation process that takes advantage of task-specific decision boundaries and the Wasserstein metric to bridge the domain gap, allowing the effective transfer of knowledge from one domain to another. As an additional advantage, this process is completely unsupervised, i.e., there is no need for new domain data to have labels or annotations."
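To make the idea above more concrete, here is a minimal, hypothetical PyTorch sketch of one way such an unsupervised adaptation loop could be organized: a shared feature generator G feeds two task-specific classifiers C1 and C2, the classifiers are pushed to disagree on unlabeled target data, and the generator is then updated to reduce that disagreement. The layer sizes, optimizers and the simple placeholder discrepancy term are illustrative assumptions, not Apple's implementation; the article's actual method measures the disagreement with a sliced Wasserstein discrepancy (sketched later in this post).

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical networks and sizes, for illustration only.
G = nn.Sequential(nn.Linear(64, 32), nn.ReLU())   # feature generator
C1 = nn.Linear(32, 10)                            # task-specific classifier 1
C2 = nn.Linear(32, 10)                            # task-specific classifier 2

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_c = torch.optim.Adam(list(C1.parameters()) + list(C2.parameters()), lr=1e-3)

def discrepancy(a, b):
    # Placeholder disagreement measure between the two classifiers' outputs;
    # the article's method would use a sliced Wasserstein discrepancy here.
    return (F.softmax(a, dim=1) - F.softmax(b, dim=1)).abs().mean()

def train_step(xs, ys, xt):
    # xs, ys: labeled source batch; xt: unlabeled target batch.

    # Step A: train G, C1 and C2 with the supervised loss on source data.
    opt_g.zero_grad(); opt_c.zero_grad()
    f = G(xs)
    loss_a = F.cross_entropy(C1(f), ys) + F.cross_entropy(C2(f), ys)
    loss_a.backward(); opt_g.step(); opt_c.step()

    # Step B: update only C1 and C2 so they stay accurate on source data
    # while maximizing their disagreement on the unlabeled target data.
    opt_c.zero_grad()
    fs, ft = G(xs).detach(), G(xt).detach()
    loss_b = (F.cross_entropy(C1(fs), ys) + F.cross_entropy(C2(fs), ys)
              - discrepancy(C1(ft), C2(ft)))
    loss_b.backward(); opt_c.step()

    # Step C: update only G to minimize the classifier disagreement on target data.
    opt_g.zero_grad()
    ft = G(xt)
    loss_c = discrepancy(C1(ft), C2(ft))
    loss_c.backward(); opt_g.step()

# Example call with random stand-in data.
train_step(torch.randn(8, 64), torch.randint(0, 10, (8,)), torch.randn(8, 64))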

 

Overall, the article covers an overview, the method, learning with the Sliced Wasserstein Discrepancy, and experiments (manifold visualization and semantic segmentation).

 

In its conclusion, Apple notes: "This method of unsupervised domain adaptation helps improve the performance of machine learning models in the presence of a domain shift. It enables training of models that are performant in diverse scenarios, by lowering the cost of data capture and annotation required to excel in areas where ground truth data is scarce or hard to collect. The technique can enable personalized machine learning by on-device adaptation of models for enhanced user experiences."

 

Two of the article's slides are presented below.

 

Apple's Figure 3 below illustrates a family of strategies that tackle the domain adaptation problem by using an adversarial loss at different levels.

 

(Apple slide: related work)

 

Apple's Figure 5 below illustrates the sliced Wasserstein discrepancy (SWD) computation. The SWD is designed to capture the dissimilarity of probability measures p1 and p2 in R^d between the task-specific classifiers C1 and C2, which take input from feature generator G. The SWD enables end-to-end training directly through a variational formulation of the Wasserstein metric, using radial projections on the uniform measures on the unit sphere S^(d-1) and providing geometrically meaningful guidance to detect target samples that are far from the support of the source.

 

(Apple slide: Figure 5, the sliced Wasserstein discrepancy computation)
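For illustration only, the following short PyTorch function sketches how a sliced Wasserstein discrepancy between the outputs of the two classifiers might be computed: random directions are drawn on the unit sphere, both sets of outputs are projected onto them, and the sorted one-dimensional projections are compared. The number of projections and the use of squared differences are assumptions made for this sketch, not details taken from Apple's article.

import torch

def sliced_wasserstein_discrepancy(p1, p2, num_projections=128):
    # p1, p2: output batches of shape [batch, d] from the two task-specific
    # classifiers C1 and C2. Hypothetical helper, not Apple's code.
    d = p1.shape[1]

    # Random radial directions on the unit sphere S^(d-1).
    theta = torch.randn(d, num_projections)
    theta = theta / theta.norm(dim=0, keepdim=True)

    # One-dimensional projections of both sets of outputs.
    proj1, _ = torch.sort(p1 @ theta, dim=0)
    proj2, _ = torch.sort(p2 @ theta, dim=0)

    # Sorting solves 1-D optimal transport in closed form; averaging the
    # squared differences over the projections approximates the sliced
    # Wasserstein discrepancy between the two sets of outputs.
    return ((proj1 - proj2) ** 2).mean()

# Example with random stand-in classifier outputs.
print(sliced_wasserstein_discrepancy(torch.rand(32, 10), torch.rand(32, 10)))

In the adaptation loop sketched earlier in this post, a function like this could stand in for the placeholder discrepancy term.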

 

Read Apple's full ML article here.

 
