Synchronisation of sound and vision

Devin Asay devin_asay at byu.edu
Wed Feb 12 13:57:13 EST 2020


Tore,

I would agree if callbacks were 100% reliable. I have tried them in the past and found that in some cases they were missed. I never had any trouble when using time indices. But I should say that I haven’t needed to do this for several years, and the callbacks in the new player object might be completely reliable.

In other ways, however, creating time indices makes your application more flexible. It’s dead simple, for instance, to set up an application where you can click on a line of text and play just that line: set the startTime, set the endTime, set the playSelection to true, start playing. Done. That would be a little more challenging if all you had were callbacks.
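
Something like this, as a rough sketch (it assumes a player named "PoemAudio" and a custom property cLineTimes on the poem field holding one "start,end" pair per line, in the player’s timescale units; all the names are just placeholders):

-- in the script of the field showing the poem
-- the field needs its lockText set to true so it receives mouseUp
on mouseUp
   put word 2 of the clickLine into tLineNum        -- which line was clicked
   put line tLineNum of the cLineTimes of me into tTimes   -- e.g. "444,999"
   set the startTime of player "PoemAudio" to item 1 of tTimes
   set the endTime of player "PoemAudio" to item 2 of tTimes
   set the currentTime of player "PoemAudio" to item 1 of tTimes
   set the playSelection of player "PoemAudio" to true
   start player "PoemAudio"
end mouseUp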

One of the great things about LiveCode is that there is almost always more than one way to do what you want.

Regards,

Devin


On Feb 12, 2020, at 9:55 AM, Tore Nilsen via use-livecode <use-livecode at lists.runrev.com> wrote:

Using callbacks negates the need to fiddle with durations or timescales and start or stop times. It uses the sampling intervals as they are, regardless of time. In my opinion it is much easier than trying to calculate start and end times. You can easily handle large audio/video files using callbacks. I would recommend using one file per poem, though; this simplifies the handling of the messages sent from the player. You can basically use the same message for all files, resetting a counter variable each time you load a new file to keep track of which line you would like to act upon.

You could also store the callbacks for each audio file in a text file and set the callbacks as a part of the handler used to load each audio file.
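
Roughly like this, as a sketch (the player name, file layout and message name here are just placeholders):

local sCurrentLine  -- counter for which line of the poem we are on

-- load one poem's audio plus its callback list from a text file whose
-- lines look like "2400,lineReached" (interval in the player's timescale)
on loadPoem pAudioPath, pCallbackPath
   set the filename of player "PoemAudio" to pAudioPath
   set the callbacks of player "PoemAudio" to URL ("file:" & pCallbackPath)
   put 0 into sCurrentLine  -- reset the counter for the new poem
end loadPoem

-- sent each time the player passes one of the callback intervals
on lineReached
   add 1 to sCurrentLine
   -- highlight line sCurrentLine of the poem field here
end lineReached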

Regards
Tore

On 12 Feb 2020, at 16:49, Devin Asay via use-livecode <use-livecode at lists.runrev.com> wrote:

Graham,

Take a look at the duration and the timeScale properties of player objects. By dividing duration by timeScale you get the length of the video in seconds.


put the duration of player "foo" / the timescale of player "foo" into totalSeconds

What you are contemplating is very doable, but you’ll have to do a fair amount of work to get the synching right. You can take one of several approaches:

- Calculate times as above to predict when to show/highlight the next line. Can be tricky with long video files and rounding errors.

- Check the currentTime property of the player to determine the startTime and endTime of each spoken line, and set the playSelection of the player to true. When the played segment ends, immediately load the following start and end times and play again. Something like this, from memory:

set the startTime of player "foo" to 444
set the endTime of player "foo" to 999
set the currentTime of player "foo" to the startTime of player "foo"
set the playSelection of player "foo" to true
start player "foo"

- Break up the video or audio file into separate files, one line per file, then play each succeeding file when the previous one reaches its end. The playStopped message is your friend here.
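
A bare-bones sketch of that last approach (with invented object and file names) might look something like this:

local sLineNum, sFolder

-- start reciting a poem whose lines are stored as line1.wav, line2.wav, ...
on startPoem pFolder
   put pFolder into sFolder
   put 1 into sLineNum
   playCurrentLine
end startPoem

on playCurrentLine
   set the filename of player "PoemAudio" to (sFolder & "/line" & sLineNum & ".wav")
   -- highlight line sLineNum of the poem field here
   start player "PoemAudio"
end playCurrentLine

-- sent by the player when the current file finishes playing
on playStopped
   add 1 to sLineNum
   if there is a file (sFolder & "/line" & sLineNum & ".wav") then
      playCurrentLine
   end if
end playStopped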

Like I said, it’s doable, but takes a bit of thought and planning, creating segment indexes, that sort of thing.

Hope this helps.

Devin


On Feb 12, 2020, at 5:28 AM, Graham Samuel via use-livecode <use-livecode at lists.runrev.com> wrote:

Thanks, that’s a start - I will look at the dictionary. I suppose the callbacks rely on one analysing how long each line/word takes the performer to say. It’s a lot of work, but there’s no way around it since potentially every line takes a different length of time to recite. If it’s too much work, I guess I can just display the whole text and have one callback at the end of each recording. Maybe that is really the practical solution for a large body of work (say all the Shakespeare sonnets, for example).

Anyway thanks for the hint.

Graham

On 12 Feb 2020, at 12:16, Tore Nilsen via use-livecode <use-livecode at lists.runrev.com> wrote:

You will have to use the callbacks property of the player to do what you want to do. The callbacks list would be your cues. From the dictionary:

The callbacks of a player is a list of callbacks, one per line. Each callback consists of an interval number, a comma, and a message name.
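
For instance, an invented callbacks list for a player named "foo" might be set like this (the intervals are in the player’s timescale units):

put "0,showLine1" & return & "2400,showLine2" & return & "5150,showLine3" into tCallbacks
set the callbacks of player "foo" to tCallbacks
-- each message is sent to the player when its interval is reached,
-- then passes up the message path to your card or stack script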


Regards
Tore Nilsen


On 12 Feb 2020, at 11:25, Graham Samuel via use-livecode <use-livecode at lists.runrev.com> wrote:

Folks, forgive my ignorance, but it’s a long time since I considered the following and wondered what pitfalls there are.

I have in mind a project where a recording of someone reading a poetry text (“old fashioned” poetry in metrical lines) needs to be synchronised to the display of the text itself on the screen, ideally so that a cursor or highlight would move from word to word with the speaker (although that would almost certainly involve too much work for the developer, i.e. me), or at least so that lines are highlighted as they are being spoken. I see that one would inevitably have to add cues to the spoken text file to fire off the highlighting, which is indeed an unavoidable amount of work, but can it be done at all in LC? For example, what form would the cues take?

TIA

Graham


Devin Asay
Director
Office of Digital Humanities
Brigham Young University


