Dan Carew-Jones at Arrow International Media explains how Covid restrictions led to previously untapped opportunities using AI
We have been discussing and testing AI systems at Arrow International Media for many years. I’ve always been impressed by the results and realised that AI offered intriguing solutions. The issue was always identifying the problem – productions already had logs that were rigorously prepared on the shoots, so locating footage was never an issue.
The Covid pandemic presented us with the problem. We were in the midst of production on several series and needed a way to finish the shows that were at various stages of completion. The answer for us lay in the library of unused GVs and drama reconstruction that we had accumulated since our inception. We had thousands of hours of footage across hundreds of thousands of clips.
The clips hadn’t been logged with generic use in mind, so we needed a way to know what was contained in those hundreds of thousands of clips. Step forward GrayMeta, who introduced us to their Curio platform, offering object tagging of video at the scale and in the timeframe we needed.
Within two weeks we had the footage analysed and were able to search for specific ‘needles’ in our metaphorical haystack.
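To give a flavour of what that searching looks like once footage has been tagged, here is a minimal sketch in Python. It assumes the tag data has been exported as a simple JSON list of clips – a hypothetical export format for illustration, not the Curio API itself.

```python
import json

# Hypothetical export of per-clip tag data, e.g.
# [{"clip_id": "A001_C003", "proxy_url": "https://...", "tags": ["police car", "desert", "night"]}, ...]
def load_tag_index(path):
    with open(path) as f:
        return json.load(f)

def find_clips(index, *wanted_tags):
    """Return clips whose tags contain every requested term (case-insensitive)."""
    wanted = {t.lower() for t in wanted_tags}
    return [
        clip for clip in index
        if wanted.issubset({t.lower() for t in clip["tags"]})
    ]

if __name__ == "__main__":
    index = load_tag_index("curio_tag_export.json")  # assumed export file name
    for clip in find_clips(index, "police car", "night"):
        print(clip["clip_id"], clip["proxy_url"])
```

The point is less the code than the shape of the data: once every clip carries machine-generated tags, finding the ‘needles’ becomes a query rather than a viewing session.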
It became apparent that Arrow’s decision to embrace cloud storage for native rushes as well as viewing proxies opened up a world of ‘Big Data’ analytical possibilities because it removed the initial barrier, namely delivering the data to the analysis tools.
As we delved deeper into the possibilities, the previously unappreciated benefits of the various services became apparent. Optical Character Recognition is a case in point. Not particularly revolutionary in principle (it has been around for decades), but the quality of the results was a revelation. With the data available, we were able to identify police departments by the insignia officers wore, or by the badges on their vehicles. This in turn gave us a library of locations to search. Anyone wearing a name tag of any sort could be identified by name and located. Specific locations could be pinpointed by their signage.
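For readers curious about the mechanics, the sketch below shows frame-level OCR of the kind described above, using the open-source OpenCV and Tesseract (via pytesseract) libraries. The sampling rate and preprocessing are illustrative choices rather than the exact pipeline a commercial service would run.

```python
import cv2
import pytesseract

def ocr_video(path, sample_every_s=5):
    """Sample a frame every few seconds and return any text Tesseract finds."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back if the FPS is unreported
    step = int(fps * sample_every_s)
    hits, frame_no = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_no % step == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            text = pytesseract.image_to_string(gray).strip()
            if text:
                hits.append((frame_no / fps, text))  # (time in seconds, recognised text)
        frame_no += 1
    cap.release()
    return hits

if __name__ == "__main__":
    for t, text in ocr_video("proxy_clip.mp4"):  # hypothetical proxy file
        print(f"{t:7.1f}s  {text!r}")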
Now that we are emerging from the pandemic, we are still leveraging the content of our archive, but we are also extending the integration of AI services into our production workflows to help manage the ever-increasing amounts of data generated on a shoot. A recent production was shooting upwards of 20 hours per day in the US. Overnight, the rushes were uploaded, processed for edit and transcoded to web proxy. This triggered an automatic transcription service and content analysis of the footage. Within 24 hours of being shot in the US, the footage was available in the edit with a transcription and was searchable in our cloud viewing platform.
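A heavily simplified sketch of that kind of overnight automation is shown below: watch for newly uploaded rushes, make a web proxy with ffmpeg, then hand the proxy to transcription and content-analysis services. The watch folder, proxy settings and the transcribe/analyse hooks are all placeholders for whatever storage and AI services a production actually uses.

```python
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/mnt/rushes/incoming")   # assumed upload location
PROXY_DIR = Path("/mnt/rushes/proxies")

def make_proxy(src: Path) -> Path:
    """Transcode a camera original to a small H.264 web proxy with ffmpeg."""
    dst = PROXY_DIR / (src.stem + "_proxy.mp4")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-vf", "scale=-2:720", "-c:v", "libx264", "-crf", "28",
         "-c:a", "aac", str(dst)],
        check=True,
    )
    return dst

def transcribe(proxy: Path):
    """Placeholder: submit the proxy to a speech-to-text service."""
    print(f"queued transcription for {proxy.name}")

def analyse(proxy: Path):
    """Placeholder: submit the proxy for object tagging and OCR."""
    print(f"queued content analysis for {proxy.name}")

def main(poll_seconds=60):
    PROXY_DIR.mkdir(parents=True, exist_ok=True)
    seen = set()
    while True:
        # Simplified polling loop; a real system would also check that
        # each upload has finished before processing it.
        for src in WATCH_DIR.glob("*.mov"):
            if src not in seen:
                seen.add(src)
                proxy = make_proxy(src)
                transcribe(proxy)
                analyse(proxy)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    main()
```

In practice the triggering is handled by the cloud platform itself rather than a polling script, but the chain of events – upload, proxy, transcription, analysis – is the same.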
There are, of course, limitations. It is an algorithm doing the work and, as we have learnt, algorithms alone can produce spurious results. Mistakes are to be expected and the results still require human interpretation, but the material to interpret is reduced to a realistic and manageable level. It is a triage process, not microsurgery.
Arrow continues to embrace the possibilities of AI and ML, with services proliferating and improving with dizzying speed. Footage restoration, colourisation and categorisation, facial recognition, and metadata enhancement and integration are all areas that we are studying. Some will prove to have a practical application, some may not. It depends on the nature of the next ‘problem’.
Dan Carew-Jones is post production consultant at Arrow International Media