Google AI Improves The Performance Of Smart Text Selection Models By Using Federated Learning

November 30, 2021
in Artificial Intelligence

Smart Text Selection is one of Android’s most popular features. It helps users select, copy, and act on text by anticipating the word or combination of words around a user’s tap and expanding the selection accordingly. The feature extends selections automatically and, for selections that fall into recognized classification categories such as addresses and phone numbers, offers an app that can open them, saving users even more time.
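
As a rough illustration of the expansion step, the sketch below grows a selection outward from the tapped character to the nearest whitespace boundaries. This is only a naive, hypothetical stand-in: the real Smart Text Selection model predicts multi-word spans and entity types with a neural network rather than a hand-written rule.

```python
# Naive, hypothetical illustration of tap-to-selection expansion: grow the
# selection from the tapped character out to the nearest word boundaries.
# The production model predicts multi-word spans and entity types instead.
def expand_to_word(text, tap_index):
    start = tap_index
    while start > 0 and not text[start - 1].isspace():
        start -= 1
    end = tap_index
    while end < len(text) and not text[end].isspace():
        end += 1
    return text[start:end]

print(expand_to_word("Call me at 555-0199 tonight", 13))  # -> "555-0199"
```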

The Google team worked to improve the performance of Smart Text Selection by using federated learning to train the neural network model on user interactions while preserving personal privacy. Thanks to this effort, which is part of Android’s new Private Compute Core secure environment, the research team was able to improve the model’s selection accuracy by up to 20% on some types of entities.

The model is trained to select only a single word in order to reduce the incidence of erroneous multi-word selections. The Smart Text Selection feature was originally trained on proxy data derived from web pages that had schema.org annotations attached to them. While training on schema.org annotations was effective, it had a number of drawbacks: the data was quite unlike the text users actually see on their devices.

With this new release, the model no longer uses proxy data for span prediction; instead, it uses federated learning to train on-device on real interactions. Federated learning is a model training method in which a central server coordinates training across many devices while the raw data never leaves each device.

The following is how a typical federated learning training process works: 

1. The server initializes the model.

2. Then, in an iterative process:

   • devices are sampled,
   • the selected devices improve the model using their local data, and
   • only the improved model, not the data used for training, is sent back to the server.

3. The server averages the received modifications and creates the model that is sent out in the following iteration.
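
To make the loop concrete, here is a minimal federated-averaging sketch in Python/NumPy. It is a toy illustration under stated assumptions, not Google’s actual training stack: the linear model, the functions local_update and federated_round, and all hyperparameters are hypothetical.

```python
# Minimal federated-averaging sketch (hypothetical). Each "device" holds its
# own data; only weight updates leave the device, never the raw examples.
import numpy as np

def local_update(weights, local_x, local_y, lr=0.1):
    """One round of on-device training: a single gradient step for a
    toy linear model with squared error."""
    preds = local_x @ weights
    grad = local_x.T @ (preds - local_y) / len(local_y)
    return weights - lr * grad

def federated_round(server_weights, devices, sample_size=2):
    """Sample devices, collect their improved weights, and average them."""
    rng = np.random.default_rng()
    sampled = rng.choice(len(devices), size=sample_size, replace=False)
    updates = [local_update(server_weights.copy(), *devices[i]) for i in sampled]
    return np.mean(updates, axis=0)  # only weights are averaged, never data

# Toy population of devices, each holding private (x, y) data.
devices = [(np.random.randn(20, 3), np.random.randn(20)) for _ in range(5)]
weights = np.zeros(3)               # step 1: the server initializes the model
for _ in range(10):                 # steps 2-3: iterate sampling and averaging
    weights = federated_round(weights, devices)
```

The key property is visible in federated_round: the server only ever sees and averages locally improved weights, while each device’s data stays on that device.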

For Smart Text Selection, each time a user taps to select text and corrects the model’s suggestion, Android receives precise feedback about the selection span the model should have predicted. To protect user privacy, these selections are held on the device only briefly, are never visible to the server, and are then used to improve the model with federated learning techniques. This strategy has the advantage of training the model on the same kind of data it encounters during inference.
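
As a hypothetical illustration of that feedback signal, a device might record a user correction as a small, short-lived example like the one below. The field names, retention window, and data layout are assumptions made for the sketch, not details from the source.

```python
# Hypothetical on-device record of a corrected selection; fields and the
# retention policy are illustrative assumptions only.
from dataclasses import dataclass
import time

@dataclass
class SelectionExample:
    text: str              # text surrounding the tap (stays on the device)
    tap_index: int         # character offset the user tapped
    predicted_span: tuple  # (start, end) the model proposed
    corrected_span: tuple  # (start, end) the user actually selected
    created_at: float

def is_expired(example, max_age_seconds=3 * 24 * 3600):
    """Examples are kept only briefly on the device before being discarded."""
    return time.time() - example.created_at > max_age_seconds

# When the user adjusts the proposed selection, record the correction locally.
example = SelectionExample(
    text="Meet me at 1600 Amphitheatre Pkwy tomorrow",
    tap_index=14,
    predicted_span=(11, 15),   # model selected only "1600"
    corrected_span=(11, 33),   # user expanded to the full address
    created_at=time.time(),
)
```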

One of the advantages of the federated learning strategy is user privacy: raw data is never available to the server, and only updated model weights are sent. To empirically validate that the model was not memorizing sensitive information, the team used methods from Secret Sharer, an analysis approach that assesses the degree to which a model unintentionally memorizes its training data. Furthermore, data masking techniques were used to prevent the model from ever seeing certain types of sensitive data.
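
The core idea behind a Secret Sharer-style check can be sketched as follows: insert rare “canary” strings and measure their exposure, i.e. how highly the model ranks them against random look-alike candidates. The scoring function and the toy stand-in model below are assumptions for illustration; the real analysis runs against the trained text-selection model.

```python
# Sketch of the Secret Sharer exposure metric: a memorized canary ranks far
# above random candidates; a non-memorized one ranks about average.
import math
import random

def exposure(canary, candidates, log_likelihood):
    """Exposure = log2(#candidates) - log2(rank of the canary's score)."""
    scores = sorted((log_likelihood(c) for c in candidates), reverse=True)
    canary_score = log_likelihood(canary)
    rank = 1 + sum(1 for s in scores if s > canary_score)
    return math.log2(len(candidates)) - math.log2(rank)

# Toy stand-in "model": fixed random scores, i.e. no memorization at all.
rng = random.Random(0)
_cache = {}
def toy_log_likelihood(s):
    return _cache.setdefault(s, rng.random())

canary = "my secret pin is 314159"
candidates = [f"my secret pin is {rng.randrange(10**6):06d}" for _ in range(1024)]
print(exposure(canary, candidates, toy_log_likelihood))  # low: no memorization
```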

Initial attempts to use federated learning to train the model were unsuccessful: the loss did not converge and the predictions were erratic. Because the training data was collected on-device rather than centrally, it could not be examined or verified, which made debugging difficult. To get around this problem, the research team built a set of high-level metrics to track how the model fared throughout training, including the number of training examples, selection accuracy, and recall and precision measures for each entity type.
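
The per-entity-type accuracy signals could look something like the sketch below, which computes exact-span precision and recall per entity type. The record format and the exact-match criterion are assumptions for illustration; the article does not specify how the metrics are implemented.

```python
# Hypothetical per-entity-type metrics of the kind that can be monitored
# when the training data itself cannot be inspected.
from collections import defaultdict

def per_entity_metrics(records):
    """records: iterable of (entity_type, predicted_span or None, true_span)."""
    counts = defaultdict(lambda: {"tp": 0, "fp": 0, "fn": 0})
    for entity_type, predicted, true in records:
        c = counts[entity_type]
        if predicted is None:
            c["fn"] += 1          # model made no selection for a real entity
        elif predicted == true:
            c["tp"] += 1          # exact span match
        else:
            c["fp"] += 1          # wrong span counts against precision...
            c["fn"] += 1          # ...and against recall
    metrics = {}
    for entity_type, c in counts.items():
        precision = c["tp"] / (c["tp"] + c["fp"]) if c["tp"] + c["fp"] else 0.0
        recall = c["tp"] / (c["tp"] + c["fn"]) if c["tp"] + c["fn"] else 0.0
        metrics[entity_type] = {"precision": precision, "recall": recall}
    return metrics

print(per_entity_metrics([
    ("address", (11, 33), (11, 33)),   # correct selection
    ("phone",   (0, 10),  (0, 12)),    # wrong span
    ("phone",   None,     (5, 17)),    # missed selection
]))
```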

Thanks to this new federated technique, Smart Text Selection can now be scaled to many more languages. Ideally this will work without manual tuning of the system, allowing even low-resource languages to be supported and making life easier for billions of users around the world.

Reference: https://ai.googleblog.com/2021/11/predicting-text-selections-with.html

The post Google AI Improves The Performance Of Smart Text Selection Models By Using Federated Learning appeared first on MarkTechPost.
