This change, which actually began rolling out last week, has raised alarm among privacy experts, who consider it a violation of the principle of informed consent.
If you don't take action, Google Gemini may start interacting with your Messages, WhatsApp, Utilities, or Phone apps on Android starting July 7, 2025. https://t.co/f9uuurbhgko
– TechRadar (@techradar) July 8, 2025
According to the email Google sent to Android users, Gemini can now access third-party applications, including WhatsApp, allowing users to issue voice commands such as "send a message to (a person's name)" that the AI carries out automatically. Notably, this feature works even when Gemini activity tracking is turned off, unlike previous versions.
Google confirms that Gemini does not store conversations locally on the device; the data is sent to its servers for processing. Although the company says the AI does not read the content of messages or view sent images, the official support page notes that user conversations may be reviewed by Google and its partners to improve the quality of the service, which is why the company warns users not to share any confidential information through the platform.
The new update has drawn harsh criticism from technology experts, who pointed out how difficult the feature is to disable. Daniel Freich of Businessmagnet called it "an inversion of the principle of informed consent," since access is now granted by default unless the user opts out manually. Meanwhile, Andy Kallen, a platform founder, urged users to carefully review their privacy settings and revoke any unwanted permissions.
For its part, Google has defended the update as an improvement to the user experience. As reported by the tech site Ars Technica, the new feature enables everyday tasks, such as sending messages and making calls, even when activity tracking is turned off, and the company stresses that conversations are not used to train its AI models in this case. Nevertheless, the ambiguity around how the system works and what personal data it actually accesses remains a source of concern for users, especially as reliance on AI to handle sensitive information grows.
Source: Metro