Armdromeda, these are really cool findings, and from what I've seen on this forum, I think this is the closest to 'perfect sync', which I don't think is possible without a hardware solution.
I'm developing a control app which controls all 6 cams, monitors all their statuses, uses GPS, controls exposure in real time, and does many other things not related to this topic, but it also uses time sync. All cameras have static IPs and connect to a small portable router powered by a little power bank, so my solution is completely portable and internet-free.
I use JSON time sync. Every time I connect to the cameras for the first time, I send the current time from the control device (in my case a Windows tablet) to all cams using those JSON packets. As far as I know, this packet only allows setting the time with about 1-second accuracy. This was done to get correct capture-time metadata on photos and videos, not to act as a shutter trigger. Even this rough time sync is useful: a second piece of software later copies all files to disk using 6 small card readers, and before copying it fetches the file lists from all cameras and matches the closest capture times (plus or minus 4 seconds) across all SD cards. That gives you a 'take', i.e. a group of photos taken at the same moment, so you don't have to mess with copying and sorting hundreds of takes and hunting for 'groups' of photos later in post.
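The take-finding step above can be sketched as a simple timestamp clustering. This is just an illustrative sketch, not the actual software: the file tuples, the 4-second tolerance, and the `group_takes` name are all assumptions made up for the example.

```python
from datetime import datetime, timedelta

# Hypothetical input: (camera_id, filename, capture_time) tuples collected
# from the cameras' file lists. Tolerance matches the ±4 s from the text.
TOLERANCE = timedelta(seconds=4)

def group_takes(files):
    """Cluster files whose capture times fall within TOLERANCE of the first
    file in the current group. Assumes the per-camera clocks were already
    set via the JSON time sync, so they agree to within a few seconds."""
    files = sorted(files, key=lambda f: f[2])  # sort by capture time
    takes, current = [], []
    for f in files:
        if current and f[2] - current[0][2] > TOLERANCE:
            takes.append(current)   # gap is too big: close this take
            current = []
        current.append(f)
    if current:
        takes.append(current)
    return takes

files = [
    ("cam1", "YDXJ0001.jpg", datetime(2017, 5, 1, 12, 0, 0)),
    ("cam2", "YDXJ0002.jpg", datetime(2017, 5, 1, 12, 0, 3)),
    ("cam1", "YDXJ0003.jpg", datetime(2017, 5, 1, 12, 5, 10)),
    ("cam2", "YDXJ0004.jpg", datetime(2017, 5, 1, 12, 5, 11)),
]
print([len(t) for t in group_takes(files)])  # two takes of two files each
```

A real version would also want to flag takes that are missing a camera, but the grouping idea is the same.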
I saw your post and I think it's a great technique; I'll try to test syncing time with your method later. But it uses the internet to get the correct time from a server, which 'ruins' portability in my case. What I think may be possible is to install a local time server on the phone or on the router. I run OpenWrt on the router, and I think it should be possible to install something that provides an NTP-like service (or maybe it's already built into Linux?), and your finding would help improve that 'take finding' mechanism. And since it would be there anyway, maybe it would also be possible to build shutter-release functionality on top of this 'precise' time sync.
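For the local-time-server idea, one common approach on OpenWrt is the `chrony` package (installed with `opkg update && opkg install chrony`), since it can be told to serve the router's own clock even when there is no internet upstream. A minimal sketch, assuming the LAN subnet is 192.168.1.0/24 (adjust to your router's actual subnet):

```
# /etc/chrony/chrony.conf on the router (illustrative sketch)

# Serve the router's own clock as a fallback when no upstream
# internet NTP servers are reachable (stratum value is arbitrary).
local stratum 10

# Allow the cameras and the control tablet on the LAN to query us.
allow 192.168.1.0/24
```

The absolute time will only be as good as the router's clock, but for take-finding and shutter release what matters is that all six cams agree with each other, which a local server like this gives you.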
p.s.
Also, not related to this particular conversation, but related to the topic: I've read a lot about 360 over the last few years. I've had a particular interest in 360 video since the early 2000s. I developed a rig using the GoPro 2 for one of the first 360 music videos in the world, shot it and did the post when there was no video-stitching software at all, stitching it frame by frame, and beta-tested the KOLOR video-stitching software and their 360 web video player… so, a lot of reading and researching over the years.
And what I've seen is that a lot of 360 reviewers and enthusiasts misinterpret the term 'genlock' a bit. Most of the time 'genlock' is used to mean just simultaneous control of all settings on all cams (which the GoPro remote shutter does, for example), but not shutter sync. As far as I know, only a couple of 360 cams can do REAL genlock right now (the Nokia, for example, plus a couple of cams that are currently in beta or on Kickstarter, and a couple that are complete 'dark horses', since there are no demos or tech specs on them at all), and it's not possible on the first Yi model, at least today. Yes, I saw posts about the 'big Yi rig' which is also supposedly 'genlocked', but I think that just means you can control all the settings at once, not that the shutters are synced. I also saw a solution on some blog where a guy uses the GoPro stereo sync device and mini Arduinos to get perfect shutter sync, but unfortunately I couldn't find any demos, or any review by someone who actually tested it.
What I really enjoy is the enthusiasm of the community and the 'researcher itch', when people try everything possible to get 101% out of a device. It deserves respect.