Last Updated: September 01, 2023, 13:52 IST
Google is searching for ways to tackle the challenges posed by artificial intelligence (AI). AI-generated images have become common across the internet, not just on social media, and media companies have started regulating the use of such content to avoid possible copyright infringement. But some AI tools make it genuinely hard to tell whether a photo was generated by AI or created from scratch by a human.
Google wants to make that process easier, and its AI division DeepMind is building a tool that can detect AI-generated images and add a watermark to them, helping people immediately recognise a photo's origin.
DeepMind has released a beta version of the software, called SynthID, which embeds a watermark directly into the pixels of an image. The mark is invisible to the human eye, but the technology can detect it and identify the image as AI-generated.
Keeping the software in beta allows DeepMind to continue testing unreleased features with a limited group of users and companies. Google is using its deep learning models to build the watermarking tool, and the company claims that adding the mark has no impact on image quality, colour levels or other aspects.
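DeepMind has not published how SynthID actually embeds its watermark; its real method relies on deep learning models rather than any simple rule. Still, the general idea of hiding a machine-readable mark in pixel values without visibly changing the image can be sketched with a classic toy technique: least-significant-bit (LSB) embedding. Everything below, including the `WATERMARK` pattern and function names, is a hypothetical illustration of the concept, not Google's implementation.

```python
# Toy LSB watermark: flipping the least significant bit changes a pixel
# channel value by at most 1, which is imperceptible to the human eye.
# This is NOT SynthID's method, just an illustration of invisible marks.

WATERMARK = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical 8-bit signature

def embed_watermark(pixels, mark=WATERMARK):
    """Overwrite the LSB of the first len(mark) pixel values with the mark."""
    out = list(pixels)
    for i, bit in enumerate(mark):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the bit
    return out

def detect_watermark(pixels, mark=WATERMARK):
    """Return True if the leading pixels carry the expected LSB pattern."""
    return [p & 1 for p in pixels[:len(mark)]] == mark

image = [200, 13, 77, 154, 90, 41, 250, 33]  # toy grayscale pixel values
marked = embed_watermark(image)

# No pixel value moves by more than 1, yet the mark is machine-detectable:
assert all(abs(a - b) <= 1 for a, b in zip(image, marked))
assert detect_watermark(marked)
```

A real system must survive cropping, compression and re-encoding, which is precisely why DeepMind uses learned models instead of a fragile per-pixel rule like this one; naive LSB marks are destroyed by even mild JPEG compression.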
While SynthID is a step in the right direction, the pace at which AI is evolving means the industry will need stronger solutions to avert serious harm from AI-driven misinformation, especially where deepfakes are involved. With major government elections lined up in the next 12 months, it is imperative that the beta version of SynthID delivers on its promise and helps the world avoid the perils of AI being used for wrongful purposes.
Google has talked about regulating AI responsibly, but its push to lead the segment will need other players involved if products and tools like these are to become easier and faster to access.