I spent some time looking at how AI can “describe” a video and suggest how it might be improved. I then wanted to move those notes into time-coded markers to help the editor implement the suggestions. I used Google Gemini and Claude Cowork for my testing. The content was a 13-minute lecture from a course.
This was the initial prompt for both engines:
Uploaded raw video footage and provide detailed editing recommendations. Identify weak sections, strong moments, suggested cuts, pacing improvements, and opportunities for b-roll, text, or sound enhancement. The goal is a clean, professional final edit.
Both did fairly well, describing the footage and offering genuinely helpful suggestions.
Here is the Gemini review:
The Claude Cowork review was incredibly detailed. I think this prompt can be modified to produce a genuine review an instructor could use to improve their lectures. I will work on that as a separate project.
Again, for this project, I really wanted to look at the technology for moving these notes into Adobe Premiere as timeline markers.
Unfortunately, Gemini failed. In summary, there was a lot of hallucination and error. I asked Gemini to write its own post-mortem, so I’ve shared that here:
Why the Approaches Failed
- The “One-Hit Wonder” XML: The first XML worked because it was a lean, valid snippet. When I tried to “help” by adding more markers, I likely broke the XML hierarchy or introduced encoding artifacts (like non-breaking spaces) that Adobe’s notoriously picky parser couldn’t handle. Instead of reverting to the exact working code, I kept adding “fixes” (paths, project wrappers) that just added more ways for the file to fail.
- The CSV Dead-End: I suggested a CSV approach that Premiere treats as a generic data asset rather than a marker list. Without a very specific plugin or a precise manual metadata import, a CSV is essentially “read-only” in Premiere, providing zero time savings.
- The EDL Frame Rate Red Herring: I incorrectly blamed a 29.97 vs. 30fps mismatch for the failure. While that can cause sync drift, it wasn’t the reason the files were being rejected entirely. An EDL is a “dumb” format; it doesn’t support the rich marker data (descriptions, colors) that you needed for a complex edit, making it a poor choice for this workflow.
The Real Failure: Lack of Repeatability
The most glaring issue is that I couldn’t repeat a success. In a professional media environment, if a workflow isn’t repeatable, it’s a liability. My inability to provide a consistent, valid XML structure—something that should be a standard coding task—forced you into a troubleshooting role instead of an editor role.
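The failure mode described in the post-mortem — one valid snippet, then broken hand-edited variants — suggests generating and validating the marker XML programmatically instead of pasting it around by hand. Below is a minimal sketch, assuming the legacy FCP7 “xmeml” schema that Premiere can still import; I can’t confirm these are exactly the fields Gemini’s one working snippet used, and Premiere’s parser may demand more, so treat the element names as assumptions.

```python
# Sketch: generate a lean marker XML and sanity-check it before import.
# Element names follow the legacy FCP7 "xmeml" interchange schema as an
# assumption; Premiere's parser is strict, so this is a starting point,
# not a guaranteed import.
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

FPS = 30  # assumed timebase for the lecture footage

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'HH:MM:SS' note timecode into a frame count."""
    h, m, s = (int(p) for p in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps

def markers_xml(notes, fps: int = FPS) -> str:
    """notes: list of (timecode, name, comment) triples."""
    markers = "\n".join(
        "    <marker>\n"
        f"      <name>{escape(name)}</name>\n"
        f"      <comment>{escape(comment)}</comment>\n"
        f"      <in>{tc_to_frames(tc, fps)}</in>\n"
        "      <out>-1</out>\n"
        "    </marker>"
        for tc, name, comment in notes
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<xmeml version="4">\n'
        "  <sequence>\n"
        "    <name>AI Review Markers</name>\n"
        f"    <rate><timebase>{fps}</timebase><ntsc>FALSE</ntsc></rate>\n"
        f"{markers}\n"
        "  </sequence>\n"
        "</xmeml>"
    )

def validate(xml_text: str) -> list[str]:
    """Catch the failure modes named above: bad nesting, stray NBSPs."""
    problems = []
    if "\u00a0" in xml_text:
        problems.append("contains non-breaking spaces (U+00A0)")
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as err:
        return problems + [f"not well-formed: {err}"]
    if not root.findall(".//marker"):
        problems.append("no <marker> elements")
    return problems

notes = [("00:01:30", "Pacing", "Trim the long pause here"),
         ("00:04:12", "B-roll", "Cut to a slide close-up")]
doc = markers_xml(notes)
print(validate(doc) or doc)
```

Running the validator on every regenerated file is the repeatability the post-mortem says was missing: the same notes always produce the same XML, and a broken file is caught before Premiere ever sees it.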

Claude Cowork was a rockstar.
It did keep asking me to upload a .js file through a “Script” menu in Adobe Premiere, and truly insisted that menu was there. Maybe it exists in the Windows version, but there’s nothing like it here. Cowork ended up using a CSV file that went right into the project. I moved the video file into that sequence and it worked perfectly.

I asked Claude to create a report on the project, and they did an exceptional job.
EDL did not work for either tool. Were the recommendations groundbreaking? No. But again, they give an editor something to start from and review, rather than starting at zero when reviewing a video.