Is this any different from Photoshop?
There was a time when, if you saw a photo, what it depicted must have happened. Then Photoshop came along, and for a short while some people could be fooled by a doctored image. Then everyone learned, and Photoshop became an everyday tool.
I see no reason to think deepfakes will be any different. Once we learn that we cannot trust video, this will be no different from photoshopping someone's head into a sex scene.
I do not believe there is much future for freely available AI image detection software. The nature of generative adversarial networks means people will just incorporate whatever detection tool exists into the training loop.
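To make the point concrete, here is a minimal toy sketch (not a real deepfake pipeline; the detector weights and the feature vector are made up for illustration). If a released detector is differentiable, or even just queryable, a generator can add "fool the detector" to its objective and descend on the detector's own score, which is exactly the adversarial step a GAN generator takes against its discriminator:

```python
import numpy as np

# Hypothetical public detector: a logistic classifier over features x.
# The weights below are invented for this illustration.
w = np.array([1.0, -0.5, 0.25, 0.75])
b = 0.5

def detector(x):
    """Detector's estimated probability that x is fake."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

x = 2.0 * w                  # a "fake" sample the detector flags confidently
before = detector(x)

# Adversarial optimization: step against the detector's score.
# The gradient of the logit (x @ w + b) with respect to x is just w.
for _ in range(200):
    x = x - 0.1 * w

after = detector(x)
print(f"before: {before:.3f}, after: {after:.2e}")
```

The detection score collapses after a couple hundred steps. Any detector whose behavior is observable becomes a training signal, which is why a freely distributed detector tends to defeat itself.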