
Inside Muybridge’s Real-Time Camera Arrays

By: Stephen Toback

[Header image: close-up of a row of colorful LED lights beneath a planter of red flowers and green leaves.]

Most “new” broadcast cameras are just better sensors in the same old boxes. Muybridge, a deep-tech startup based in Oslo, is different because it replaces a single, high-cost lens and motor with a dense array of low-cost sensors and massive real-time compute. Founded in 2020 by Håkon Espeland and Anders Tomren, the company operates on the belief that technology should not burden the storyteller, but rather provide “limitless perspectives” through software-defined movement.

How it Works: The Hardware

Instead of a $100,000 broadcast rig on a dolly, the Muybridge system consists of 2-meter, speaker-bar-style units.

  • Sensor Arrays: Each bar is packed with dozens of smartphone-grade sensors. These are significantly cheaper than traditional glass but provide massive “pixel overlap” for volumetric imaging.

  • Continuous Stitching: Multiple 2-meter units can be linked together to create a single, continuous “camera” of any length—for example, the entire baseline of a tennis court.

  • Software-Defined Movement: There are zero moving parts. To “pan” or “zoom,” the software simply shifts which part of the sensor array it is processing. This creates “weightless” movement with zero mechanical latency, allowing for unlimited speeds and instant perspective shifts.

  • Broadcast Ready: The platform is natively compliant with industry standards like SDI, ST-2110, and NDI®, plugging directly into professional workflows.
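To make the "software-defined movement" idea concrete, here is a minimal sketch of how panning and zooming reduce to array indexing once the bars have been stitched into one continuous image plane. All of the specifics (the panorama dimensions, the function name, nearest-neighbour resampling) are illustrative assumptions, not Muybridge's actual pipeline:

```python
import numpy as np

# Hypothetical stitched panorama: three linked 2 m bars, each contributing
# 4000 px of width, already fused into one continuous image plane.
panorama = np.zeros((2160, 12000, 3), dtype=np.uint8)

def virtual_camera(image, center_x, zoom, out_w=1920, out_h=1080):
    """Select the output window from the array: a 'pan' just moves
    center_x, and a 'zoom' shrinks the sampled region. No motors involved."""
    w = int(out_w / zoom)          # narrower source window = tighter shot
    h = int(out_h / zoom)
    x0 = int(np.clip(center_x - w // 2, 0, image.shape[1] - w))
    y0 = (image.shape[0] - h) // 2
    crop = image[y0:y0 + h, x0:x0 + w]
    # Nearest-neighbour resize back to broadcast resolution.
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return crop[ys][:, xs]

# "Panning" at unlimited speed is just changing an index between frames.
frame_a = virtual_camera(panorama, center_x=1000, zoom=1.0)
frame_b = virtual_camera(panorama, center_x=11000, zoom=2.0)  # instant jump
```

Because the "camera move" is an index change rather than a motor command, the jump from one end of the array to the other costs the same as a one-pixel nudge, which is where the zero mechanical latency claim comes from.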

Deployment: Where is it Now?

As of early 2026, Muybridge is moving past the “proof of concept” phase and into live professional broadcasts:

  • The ATP Tour (Tennis): Muybridge has an exclusive partnership with Sony’s Hawk-Eye Innovations. The system was piloted at the Miami Open and Madrid Open and is being used across the 2026 ATP Masters tournaments, including Indian Wells this March.

  • The NHL and Fox Sports: The company is working with the NHL to install these arrays inside dasher boards to create a “virtual drone” view that follows the puck at speeds no physical camera could match.

  • Soccer: Initial pilots have been staged with major clubs to test goal-line and sideline tracking.

Applications: Duke Athletics and Research

Beyond the high-profile pro tours, this technology is a perfect fit for the unique spatial challenges of Duke Athletics.

  • Court Sports (Tennis/Volleyball): In sports like tennis, the system is installed directly on the lowest ad boards. It provides a “player’s eye” view of footwork and ball contact that is impossible for a human operator behind a fence to capture.

  • Duke Basketball / Tennis: Because the units are low-profile and attach to existing walls or boards, they solve the “sightline” problem in historic, tight venues like Cameron Indoor Stadium or the Ambler Tennis Stadium.

  • Scientific Research: In biomechanics, the “no approximation” approach is the key. Unlike systems that “fill in the blanks” with GPU-based interpolation (which can take minutes to render), Muybridge uses raw pixel data from multiple overlapping angles to determine exact color and perspective in real time. This allows Duke researchers to extract skeletal data and motion analysis without the lag of traditional post-processing.
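The "raw pixels instead of interpolation" distinction above can be sketched in a few lines. Assume (hypothetically) that calibration has already mapped one world point to the raw samples that saw it across five overlapping sensors; resolving the pixel then means choosing among direct measurements rather than synthesizing new values. The sample data and helper name are invented for illustration:

```python
import numpy as np

# Hypothetical: five overlapping smartphone-grade sensors each recorded
# the same physical point. Each row is one sensor's raw (R, G, B) sample.
samples = np.array([
    [201, 54, 50],
    [198, 52, 49],
    [204, 55, 52],
    [ 90, 90, 90],   # one sensor occluded / specular outlier
    [200, 53, 50],
], dtype=float)

def resolve_pixel(samples):
    """Pick a colour directly supported by the raw measurements:
    a per-channel median discards the outlier without synthesising
    (interpolating or inpainting) any pixel value that no sensor saw."""
    return np.median(samples, axis=0)

color = resolve_pixel(samples)   # -> array([200., 54., 50.])
```

The point of the sketch is the constraint, not the math: every output value is one a physical sensor actually measured, which is what makes the data trustworthy for motion analysis.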


The Theoretical Foundation: Space, Time, and Frame

The bridge between modern technology and historical photography is best explored in the work of Mark J. P. Wolf. In his study, “Space, Time, Frame, Cinema: Exploring the Possibilities of Spatiotemporal Effects”, Wolf argues that the discovery of new cinematic effects is often hindered by default ways of thinking about shot design.

Wolf proposes a grid to categorize how cameras move through space and time:

  • Moving Camera Shot: The camera moves through both space and time, recording temporally sequential images.

  • Static Camera Shot: The camera moves through time but remains fixed in space, such as when mounted on a tripod.

  • Frozen Time Shot: The camera moves through space but not through time.

In a “frozen time” shot, every frame in a sequence is of the exact same instant but shows the subject from a series of different points in space. Wolf identifies this as a “hole in the grid” that remained empty long after other shot types were perfected.
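Wolf's grid maps neatly onto a data structure: think of a capture session as a 2-D array of frames indexed by time and camera position. Each of his shot types is then just a different traversal of that array (the frame labels below are placeholders):

```python
# A capture session as a (time, camera) grid of frames: grid[t][c].
# Each string stands in for the frame captured by camera c at instant t.
times, cams = 4, 6
grid = [[f"t{t}c{c}" for c in range(cams)] for t in range(times)]

# Static camera shot: fixed in space, moving through time (one column).
static_shot = [grid[t][2] for t in range(times)]   # ['t0c2', 't1c2', ...]

# Moving camera shot: advances through space AND time (a diagonal).
moving_shot = [grid[t][t] for t in range(times)]   # ['t0c0', 't1c1', ...]

# Frozen time shot: one instant seen from every position (one row) --
# Wolf's "hole in the grid", and what a synchronized array makes trivial.
frozen_shot = [grid[1][c] for c in range(cams)]    # ['t1c0', 't1c1', ...]
```

Seen this way, a synchronized camera array is simply hardware that fills in the rows of the grid, so the "empty cell" in Wolf's taxonomy becomes one more indexing pattern.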

The Legacy of Eadweard Muybridge

The company is named in direct homage to Eadweard Muybridge (1830–1904), the 19th-century pioneer of sequential photography. In 1877, railroad magnate Leland Stanford hired Muybridge to prove that all four hooves of a galloping horse leave the ground simultaneously—a feat he achieved by inventing a system of shutter releases and 12 cameras with tripwires.

As Wolf points out in his research, Muybridge’s legacy contains a significant “what if”. While Muybridge experimented with setting cameras in a semicircle, he never synchronized them to activate at the exact same instant to exploit the frozen time effect. Had he connected the tripwires to trigger all cameras simultaneously, he would have discovered the effect—often called temps mort—more than a century before it became a digital staple.

By naming the device after him, the modern Muybridge company is effectively completing his unfinished work. They have taken the 1877 concept of a camera array and updated it with 21st-century AI and software-defined movement, finally allowing us to navigate the “frozen time” that Eadweard Muybridge could only hint at with his tripwires.

Pricing and Availability

Muybridge currently operates primarily through B2B partnerships and licensing, most notably the Sony/Hawk-Eye relationship. While the company has not released a list price for a single bar, the core of its pitch is the use of commodity electronics. By using the same sensors found in iPhones, it claims the hardware is significantly more affordable and scalable than traditional broadcast robotics.


References:

  • Muybridge Product & Leadership (https://www.muybridge.com/about)

  • Space, Time, Frame, Cinema: Exploring the Possibilities of Spatiotemporal Effects by Mark J. P. Wolf.

  • Animal Locomotion (1887) by Eadweard Muybridge, University of Pennsylvania Archives.

 

Categories: Cameras
