Synchronisation of sound and vision
mkoob at rogers.com
Fri Feb 14 10:56:16 EST 2020
I have an application created with LiveCode that uses callbacks from the player to synchronize annotations to the video being played. I find the callbacks very reliable as far as sending the callback messages goes. Links are represented on a timeline by vertical lines, and various types of annotation data are attached to each link: a text label, a multi-line text comment, a linked video comment, the colour of the link, and actions such as stop main video, show linked video, and play linked video. There is also a start and end time for a selection of the main video. I have other types of annotation data on my planned feature list. One is what you are talking about: a scrolling text field that would scroll to a certain point as specified in the annotation data. I have some rough ideas of how I would implement it but haven't gotten to it yet.
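For anyone curious about the mechanism, here is a minimal sketch of wiring annotation times to player callbacks. The object names, message name and interval values are all hypothetical, and note that callback intervals are expressed in the player's timeScale units rather than milliseconds.

```
-- Hypothetical sketch: register callback times with a player.
-- "MainVideo" and "annotationReached" are illustrative names.
on setupAnnotationCallbacks
   local tCallbacks
   -- the callbacks property takes one "interval,message" pair per line;
   -- intervals are in the player's timeScale units, not milliseconds
   put "1800,annotationReached" & return & "5400,annotationReached" into tCallbacks
   set the callbacks of player "MainVideo" to tCallbacks
end setupAnnotationCallbacks
```

The player then sends "annotationReached" each time playback passes one of the listed intervals, which is where the annotation lookup happens.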
My application also captures the video and audio in .mov files using the new camera control. These are saved in a project file that also contains a text file listing all the callback times along with the annotation data associated with each callback time. The application is cloud based, so projects can be shared with other users.
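A project text file like the one described could be written with plain LiveCode text handling. This is only an illustrative sketch; the array name, file name and tab-delimited layout are assumptions, not the application's actual format.

```
-- Hypothetical sketch: save callback times and their annotation data
-- as tab-delimited lines in a project text file.
-- gAnnotations is assumed to be an array keyed by callback time.
on saveAnnotations pProjectFolder
   local tData
   repeat for each key tTime in gAnnotations
      put tTime & tab & gAnnotations[tTime] & return after tData
   end repeat
   put tData into URL ("file:" & pProjectFolder & "/annotations.txt")
end saveAnnotations
```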
My initial target market has been sign language interpreter training and testing. However, there is nothing preventing it being used for spoken languages; I just have not been targeting that market yet.
In the past I did some experimenting with opening audio files in the player, and it worked. I have not tried that with LC 9.x, so I can't say whether it still works. (I am on holiday so can't try it out. I will try it when I am back and let you know.)
With my application as it currently works, the workflow I could see for your case is that the student creates a project and records themselves reading the poem. Then they can open the project in annotation mode and create the links in the timeline at the points you want. The text for each section of the poem could be entered into the multi-line field in the link corresponding to each segment. When the video is played back, its playback can be automatically stopped each time a link is triggered by a callback firing, and the text from that section is shown. Once they are done, the student could share the project with you in the cloud, and you could then review the files from the students and add further comments.
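The stop-and-show-text step above might look something like the following handler. Again this is a sketch under assumptions: the handler name matches the hypothetical callback message, "MainVideo", "AnnotationText" and gAnnotations are illustrative names, and a real implementation would need to match the current time to the nearest stored callback time rather than assume an exact key.

```
-- Hypothetical sketch: when a callback fires, stop the main video
-- and display the annotation text for that point in the timeline.
on annotationReached
   local tTime
   put the currentTime of player "MainVideo" into tTime
   stop player "MainVideo"
   -- gAnnotations is assumed to hold the multi-line comment per time
   put gAnnotations[tTime] into field "AnnotationText"
end annotationReached
```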
This doesn’t do exactly what you want, but you could use it to see how well the callbacks work in an application. I would be interested in your thoughts on it after you give it a try.
You can try out the application at VideoLinkwell.com. Just put a note on the contact page at https://videolinkwell.com/contact/ and I will set up a free trial account for you so you can download the software and try it out.
(Note: I am in the midst of finishing an upgrade, so I am hoping to have a new version with new features and bug fixes out in the near future.)
Sent from my iPad
> On Feb 12, 2020, at 1:03 PM, Graham Samuel via use-livecode <use-livecode at lists.runrev.com> wrote:
> Thanks Tore, Devin, Peter and Alex! There is a lot to chew on here. I do in fact have one file per poem - the user of the program will see each poem as different object, as it were, so there would be no advantage to combining them. I will try to do some experiments shortly. Doubtless after that there will be more questions.
> The issue of user platform preferences (desktop or app etc) which is discussed by Peter must be a universal one. I have previously experienced the gotcha of school labs not wanting to install applications. But I am getting far ahead of myself, since there are so many other issues to consider before I get near to making a proper platform decision.