I’ve borrowed and rented a couple of slide scanners over the years but they are black boxes when it comes to color management. I grew frustrated so I designed and built my own. Now I control everything from the spectral emission of the backlight to the color science, debayer, and color transforms on the sensor side. I’m currently using an old slide viewer from the 50s as the temporary light source until I can build my own custom setup. I just thought it would be cool to use this little guy for the prototype. I found it in an antique barn in Idaho for five bucks. Since I live in Brooklyn, I thought it was kinda neat to bring this little contraption home and give it new life.
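To give a flavor of the sensor-side work, here is a minimal sketch of a debayer plus a 3x3 color transform. It assumes a raw still from the scanner camera and uses rawpy for the demosaic; the filename and the camera-to-working-space matrix are purely illustrative stand-ins, not a measured profile from my rig.

```python
# Sketch only: debayer a raw capture and apply a 3x3 camera-to-working-space
# matrix. The matrix below is a placeholder, not a profiled transform.
import numpy as np
import rawpy

CAM_TO_WORKING = np.array([
    [ 1.80, -0.60, -0.20],
    [-0.25,  1.45, -0.20],
    [ 0.05, -0.55,  1.50],
])

with rawpy.imread("slide_0001.dng") as raw:  # example filename
    # Demosaic to linear 16-bit RGB with no auto exposure or gamma baked in.
    rgb = raw.postprocess(gamma=(1, 1), no_auto_bright=True, output_bps=16)

linear = rgb.astype(np.float64) / 65535.0
working = np.clip(linear @ CAM_TO_WORKING.T, 0.0, 1.0)
```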
Computer Vision + Drone Imagery = Smarter Cities
In 2017 I co-founded a startup called Quantifly and worked for several years as a manager and technical consultant. We were accepted into Genius NY, the world’s largest accelerator for drone technology, took second place at the finals, and received $600,000 in funding along with several other grants and support. I helped Quantifly design, build, and implement a sophisticated aerial data collection and analysis pipeline. I hired a team of engineers and researchers to develop a computer vision algorithm for identifying cars in aerial imagery, one that could also detect change between subsequent temporal layers. I was in on the ground floor and followed the entire process through to successful completion: I flew multirotor and fixed-wing drone systems, built and tested custom camera payloads, installed and tested extremely precise real-time geolocation and tagging computers, and created the image processing pipeline that fed the CV algorithm. It was the most technical and challenging opportunity of my life and I’m always thankful for the experience.
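The production algorithm is Quantifly’s own and isn’t reproduced here. Purely as a toy illustration of the temporal change-detection idea, here is a hedged OpenCV sketch that flags regions that differ between two co-registered aerial captures of the same block; the filenames and thresholds are made up.

```python
# Toy illustration only: flag changed regions between two aligned aerial
# captures. The real pipeline used a trained detector, not raw differencing.
import cv2

before = cv2.imread("block_0900.jpg", cv2.IMREAD_GRAYSCALE)  # example files
after = cv2.imread("block_1000.jpg", cv2.IMREAD_GRAYSCALE)

diff = cv2.absdiff(before, after)             # pixel-wise change
diff = cv2.GaussianBlur(diff, (9, 9), 0)      # suppress sensor noise
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
changed = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
print(f"{len(changed)} candidate change regions (e.g. cars arriving/leaving)")
```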
When I began experimenting with drones back in 2011, I quickly realized their market potential extended far beyond the film industry. Drones (or UAS, as the FAA calls them) are machines that can move through three-dimensional space, carry sophisticated payloads, and be automated. To a drone pilot, the world is full of potential. (I have more to say about the regulatory environment and its relationship to free enterprise within public airspace, but you’ll have to buy me a beer to hear that.)
I participated as a co-principal investigator on several research projects at Wayne State University, including a paper titled “Channel Fading Statistics For Real-Time Data Transmission In Emergency Call Systems And Unmanned Aerial Systems” with Dr. Ynrui Li for his doctoral dissertation. I modified one of my commercial off-the-shelf drones to provide a robust platform for cellular signal path testing in sub-zero temperatures in an urban environment. Our submission to IEEE was rejected, but the dissertation was a win! I also collaborated with another team of engineers and doctoral candidates at Wayne State to build and test an aerial platform for water quality testing. The concept involved flying the drone in an automated pattern, dipping an IoT instrument into the water, and transmitting the readings back to the server as an SMS using an Arduino. We submitted our idea at a TechTown Hack Lake Erie event hosted by the City of Detroit and NASA but ultimately did not win the competition.
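The exact SMS payload we settled on isn’t documented here, so the field names and example values below are hypothetical; this is just a sketch of the server side parsing a comma-separated reading sent up from the Arduino.

```python
# Hypothetical payload layout -- the real field order/units may have differed.
# Example SMS body: "2019-06-14T15:02:11Z,41.7325,-83.2810,21.4,7.9,3.2"
def parse_reading(sms_body: str) -> dict:
    timestamp, lat, lon, temp_c, ph, turbidity_ntu = sms_body.strip().split(",")
    return {
        "timestamp": timestamp,
        "lat": float(lat),
        "lon": float(lon),
        "water_temp_c": float(temp_c),
        "ph": float(ph),
        "turbidity_ntu": float(turbidity_ntu),
    }

print(parse_reading("2019-06-14T15:02:11Z,41.7325,-83.2810,21.4,7.9,3.2"))
```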
Unmanned Aerial Systems Projects
Aerial Photogrammetry for Parking/Traffic Analysis
I designed, built, and tested several unmanned aerial imaging systems for Quantifly LLC to provide the novel dataset necessary for parking and traffic analysis. I used these systems to map the downtown areas of several major American cities; one of our missions was to map the entirety of the new Detroit Riverwalk north of the GM headquarters. We then used a proprietary process to prepare the images for high-precision photogrammetry and automated analysis.
Aerial Real-Time Kinematic Geolocation Image Tagging
Quantifly wanted to overlay its orthorectified 2D aerial maps of urban areas on top of existing GIS data provided by municipalities, to serve as an additional reference layer. That meant I had to figure out a way to embed calibrated geolocation tags into the metadata of every image taken by the drone during the mapping process. I took a course in basic surveying and went to work with an RTK system.
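The tagging tooling itself isn’t named above, so treat this as one possible approach rather than what we shipped: a piexif-based sketch that writes an RTK-corrected fix into a JPEG’s GPS EXIF block. The filename and coordinates are just example values.

```python
# Sketch using piexif (one possible tool; not necessarily what we used).
from fractions import Fraction
import piexif

def to_dms_rationals(deg_float):
    """Convert decimal degrees to EXIF ((d,1),(m,1),(s*1e4,1e4)) rationals."""
    deg_float = abs(deg_float)
    d = int(deg_float)
    m = int((deg_float - d) * 60)
    s = Fraction(round((deg_float - d - m / 60) * 3600 * 10000), 10000)
    return ((d, 1), (m, 1), (s.numerator, s.denominator))

def tag_image(path, lat, lon, alt_m):
    exif = piexif.load(path)
    exif["GPS"] = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(lat),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(lon),
        piexif.GPSIFD.GPSAltitudeRef: 0,  # 0 = above sea level
        piexif.GPSIFD.GPSAltitude: (int(alt_m * 100), 100),
    }
    piexif.insert(piexif.dump(exif), path)

tag_image("DJI_0451.JPG", 42.3297, -83.0425, 198.6)  # example fix, example file
```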
Aerial Telephoto System for Tower Inspection
I modified a Micro 4/3 gimbaled sensor system to accommodate a 300mm zoom lens for aerial RF equipment and structural integrity inspections. I also developed the quality control protocol for remote drone operators working in different regions.
UAV Water Quality Testing With Sweetly
After several years of experimenting with drones on my own, I decided to get my commercial UAS pilot license from the FAA. I began flying drones for television series and movies in NYC and Detroit. It’s good money but can be super stressful at times; it involves a lot of travel and requires many skill sets. I loved it. Then fate fell into my lap. I was in Detroit attending a hackathon and met a data scientist who specializes in water quality. Dr. Javad Roostaei had formed a small team of researchers at Wayne State University to connect IoT devices that monitored water quality in Lake Erie to a database the public could access. I pitched the idea of using a drone, and the rest is history.
AUVSI and SwissNext Conferences
I was fortunate enough to speak at both of these prestigious Unmanned Aerial Systems and Robotics conferences on behalf of Quantifly. I was also given the opportunity to host a career and education conference with several engineering firms in upstate New York. The program paired hosts from the tech industry with students from Syracuse University and Rochester Institute of Technology in order to facilitate discussion about privacy and security in a future with aerial robots and urban data collection.
The Rock - The Physiological Simulation of Dining in Zanzibar
I was hired by MCM Creative Studios in NYC to adapt a 6K 360° panorama of the famous Tanzanian restaurant The Rock for a unique dining experience at Spring Studios in Manhattan. Mastercard and Spring Studios wanted to bring the incredible dining experience of The Rock to New Yorkers by simulating the actual restaurant using projectors and a purpose-built stage. A near-exact replica of the restaurant was built on a sound stage at Spring Studios. Projectors and highly reflective projection screens were placed outside the “windows,” each showing a compass-accurate depiction of its respective vantage point. The ultra-high-resolution video was recorded using RED cameras over a period of twelve hours so that as customers dined, they would see the sun move across the sky and fishing boats come and go. Speakers placed throughout the restaurant played the ambience recorded on location in the tropical environment, and there were even machines that emitted the smell of the sea.
My job was to color manage and match all compass directions, initially in the color lab, and then oversee the matching and the physiological response to the levels on site at the restaurant. It was a great example of how subjective the experience of luminance is relative to the environment we’re in. Simulating a real-life environment in person is not the same as simulating real life on the companded proscenium of an electronic display; marrying the two presentations in a post pipeline requires a lot of math and a strong grasp of high dynamic range technology and workflows.
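To give a flavor of the math involved (this is illustrative only, not the project’s actual pipeline), here is a sketch of the SMPTE ST 2084 (PQ) encode, which maps absolute luminance in nits to a display code value; it’s the kind of transform you lean on when reconciling a real room with a companded display.

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> signal.
# Illustrative only; the project's actual pipeline is not reproduced here.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2, 0-10000) to a PQ code value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

for nits in (0.1, 100, 1000, 4000, 10000):
    print(f"{nits:>7} nits -> PQ {pq_encode(nits):.4f}")
```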
You can read more about this ambitious project in Forbes Online.
VAPOURSYNTH - An Enigmatic and Wondrous Open Source Image Processing Tool
If you’ve ever worked in video restoration, you’ve likely noticed that most mainstream post-production software falls short of providing a collection of tools that addresses all of the weird and wonderfully novel problems that exist. Does Resolve provide a tool to reduce ringing? Does Nuke offer a solution for dialing in macroblocking reduction with delicate finesse? The answer is no. Most people slap a Neat Video OFX plugin onto a node and call it a day. However, for those of us restoration folks who are obsessed with the integrity of the original image, nuance is critical. Enter Vapoursynth.
I was frustrated that there weren’t Resolve plugins on the market for my specific restoration needs, so I decided to develop my own. I had a bit of software development experience from my consulting days at Quantifly, so I called up an attorney with experience in intellectual property within the software dev community to explore the idea. I sold some crypto and hired Vapoursynth’s creator, Fredrik Mellbin, to adapt the software into a package compatible with DaVinci Resolve’s OFX standard. I was then able to call up the Vapoursynth plugin within a node inside Fusion. This lets me simply drop my custom Vapoursynth scripts into my pipeline, bypassing the need to round-trip out of and back into Resolve for my restoration work.
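My actual restoration scripts are far more involved, but a minimal Vapoursynth script looks something like this. The source filter assumes the ffms2 plugin is installed and the filename is an example; the blur-and-merge is only a stand-in for a real deringing or deblocking chain.

```python
# Minimal Vapoursynth sketch -- stand-in filtering, not my production chain.
import vapoursynth as vs

core = vs.core

# Assumes the ffms2 source plugin is installed; example filename.
clip = core.ffms2.Source("scan_reel_04.mov")

# Gentle 3x3 blur blended back at low strength as a placeholder for a proper
# deringing/deblocking stage.
soft = core.std.Convolution(clip, matrix=[1, 2, 1, 2, 4, 2, 1, 2, 1])
out = core.std.Merge(clip, soft, weight=0.25)

out.set_output()
```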
I am currently working on launching the plugin as open source to give back to my restoration community. Feel free to message me to learn how you can add Vapoursynth into your own image processing pipeline.
HQ Remote Color Sessions with Nobe Display + Louper
They say necessity is the mother of invention. When Covid hit and the entire post industry went remote, colorists and editors were scrambling to provide their clients with safe solutions that met some basic but important requirements for creative intent. Companies like Streambox were already providing high-quality WAN-based solutions for remote viewing, but the initial buy-in was expensive, the monthly subscription was high, and you had to install a box on both ends for point-to-point streaming. I had some clients on a tight budget who didn’t want to deal with the box and the pricing, so we just went with Zoom. The Zoom image is TERRIBLE and not usable at all for color observation. Covid was also a strange time when clients didn’t want anyone coming into their home or office to do an installation.
After a couple of weeks of research, I discovered that there just wasn’t a middle-ground solution: Zoom was affordable but shite, and Streambox was pricey. I decided to create an ad-hoc solution for remote color sessions and was pleasantly surprised at the quality. If you’ve ever used Twitch to watch live gaming, you’ve probably noticed the impressive image fidelity: high quality at high frame rates with low latency. Why was no one using this kind of platform for remote color sessions? I just needed to validate the entire signal path for color accuracy, end to end, using quantitative methods.
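Quantitative validation here just means comparing known patch values before and after the trip through the stream. Below is a hedged sketch of the kind of check I mean, assuming you have sampled the same test-chart patches at both ends; the numbers are example values, not measurements from my setup.

```python
# Sketch: compare 8-bit RGB patch samples from the Resolve output against the
# same patches captured at the far end of the stream. Values are examples.
import numpy as np

reference = np.array([  # patch codes read off the color-managed output
    [235, 235, 235], [180, 180, 180], [128, 128, 128],
    [235,  16,  16], [ 16, 235,  16], [ 16,  16, 235],
])
received = np.array([   # same patches sampled from the receiving device
    [234, 235, 234], [179, 180, 180], [128, 127, 128],
    [233,  17,  16], [ 16, 234,  17], [ 17,  16, 234],
])

error = np.abs(reference.astype(int) - received.astype(int))
print("max per-channel error:", error.max(), "code values")
print("mean error:", round(error.mean(), 2), "code values")
```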
There is a small startup making genius apps for colorists called Time in Pixels. They created the False Colors app as well as the super popular OmniScope plugin for Resolve, and they also make a clever program called Nobe Display. The program comes with a Resolve plugin that siphons off an NDI signal from Resolve’s color-managed output and feeds it into your network. NDI is a newer video-over-IP protocol, similar in purpose to SDI or HDMI but designed to run over a LAN as a simple, lightweight IP workflow ecosystem. It resembles SMPTE ST 2110 in some high-level ways but is not as heavy duty; it was intended to live within a LAN rather than be transmitted over a custom high-bandwidth WAN. Live streaming studios and events, churches, courthouses, city councils, and the like have been built around IP-based video transmission and file sharing, so NDI is an exciting technology for these markets. You can read more about it on NewTek’s website; most of their applications are free and the protocol is royalty-free. The reason I fell in love with NDI is that it was so easy to set up and transmit over my own office network. Once I ran the Nobe Display app and turned on the NDI transmission, I could access the live video feed from anywhere in my home and office using another device with a free NDI monitoring app. The video just appears over the network. Amazing!
With the free NDI plugin installed, OBS Studio can receive this NDI feed as a separate source. You can then set up custom broadcast settings for the signal and send the live stream to the internet. Of course, nothing is quite that simple when it comes to color management. Initially I wanted to keep my workflow entirely free (save for buying the Nobe Display plugin for $75.99 US), but although OBS Studio is free, it had some specific limitations around signal encoding that did not meet my standards. That’s when I discovered Louper. They’re a sexy startup that reminds me of the early days of Frame.io: basically Twitch, but with a client-facing portal, the ability to host webcams from clients, and a live chat window optimized for post production. So now I have a high-quality stream going live to the WAN, with a configurable and testable signal, hosted within a professional, client-friendly platform. I’m getting less than three seconds of latency (I’m based in Brooklyn on Verizon FiOS at 1 Gbps up/down, though I had similar latency over Spectrum in Venice, CA on a 35 Mbps up/down pipe). I don’t have to use a separate machine for the broadcast; I color and stream from the same machine, hardlined over Ethernet directly to my FiOS router. The results are blowing my mind, and I’m only paying a few bucks a month for the Louper subscription.
I’ve created a diagram below to show my workflow. Note that the signal can be probed at pretty much any point along the path to validate accuracy. I have yet to verify with Louper exactly what happens to the signal inside their own black box; however, they have been very communicative with me and are excited about the prospect of this becoming another use case for their product. Time in Pixels has also been very helpful, providing technical details and feedback on my workflow design.
SkyRig - A Custom POV Camera Rig for Cinema
I partnered with AST Studios in NYC and filmmaker Andy Fortenbacher to design and build a custom POV camera system called the SkyRig. I had been hired as the cinematographer for the ad campaign for wireless earbud startup SkyBuds, and the client wanted a POV experience. I was reluctant to use a GoPro since those systems fall well short of the cinematic quality we were going for. The Blackmagic Pocket Cinema Camera HD had just launched at the time, and I believed it offered the best quality-to-weight ratio for the system. Below are a few of the technical specifications I drew up for the initial design:
Helmet
Adam Teninbaum from AST Studios ordered a rigid skydiving helmet designed for camera mounting, primarily GoPros for extreme sports. The helmet was constructed of rigid plastic about 1/8” thick, which we knew we could build on. The rigid surface covered the sides, top, and back of the operator’s head, so there were many mounting options. It would also protect the operator’s noggin during operation, since visibility and situational awareness were reduced.
Camera Mount
I wanted a solid platform with a standard 3/8” screw mount that would remain rigid and safely in place during a strenuous and kinetic shoot. Our camera operator (and hand/arm model) was going to be doing everything from riding a bicycle in Manhattan traffic to riding a giant swing at a carnival. We ended up using Wooden Camera accessories since they were solid aircraft-grade aluminum, rigid but not so heavy as to break the operator’s neck! I also requested rosettes so that once the wing nuts were tightened down there would be no slippage. The discretized nature of the rosettes also provided a way to align the two primary mounting points on either side of the helmet.
Power Plant
One feature I believed was critical from the get-go was a central power source with enough juice to run the entire system from a single battery, without constant swaps for fresh bricks. When the operator was on the aerial carnival ride shooting a POV, I didn’t want him to run out of power in the middle of the shot. I also didn’t want to mount multiple battery types with various plates and cables onto the same rig. I wanted one power source that could also act as a counterbalance to the weight of the rest of the rig. We ended up using large V-lock batteries from IDX, which slid and locked into a female V-mount plate hard-mounted onto the back of the helmet. We then mounted a D-Tap 12-volt distributor that could route 12-volt rails to the camera and accessories.
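As a rough sanity check on “enough juice to run the entire system,” the runtime math is simple. The draw figures below are ballpark assumptions for illustration, not measured numbers from the rig.

```python
# Back-of-envelope runtime check. Draw figures are assumptions, not specs.
battery_wh = 96  # a common V-lock capacity

draw_w = {
    "camera": 12,        # assumed
    "teradek_tx": 10,    # assumed
    "follow_focus": 3,   # assumed
    "accessories": 3,    # assumed
}

total_w = sum(draw_w.values())
print(f"total draw ~{total_w} W -> roughly {battery_wh / total_w:.1f} hours per brick")
```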
Teradek Remote HD Monitoring
It was critical for our director, Ross Thomas, to be able to see a live HD feed from the camera rig at all times, so we decided to go big and mounted a Teradek Pro HD wireless video transmitter to the top of the helmet. I ordered a portable handheld director’s monitor with a Teradek receiver so that Ross could walk around and talk with the talent while seeing a live HD signal from the camera. We also had another receiver feeding a larger HD monitor at video village so the other departments could keep eyes on the image.
Remote Follow Focus
Since we were using cinema glass and focusing on objects in the foreground as well as the deep background, I needed my camera assistant to be able to pull focus without touching the rig. I wanted to open my iris enough to soften the background, so Andy Fortenbacher designed a tiny rail system onto which his ARRI remote follow focus could be mounted. The system needed to be responsive down to tenths of a second, so we had to do some testing.
Special Thanks
Although I designed the initial layout for the system as per the requirements for the job, Andy Fortenbacher and Adam Teninbaum custom ordered all of the parts and supervised the machining when needed. It was a truly collaborative technical project which addressed a need for a system which was unavailable on the market at the time. There were one or two other companies which had built similar systems but fell well short of the production value required to meet our needs.
You can watch the final commercial for SkyBuds here:
RALPH LAUREN VIRTUAL RUNWAY
I was hired by Splashlight Studios in NYC as the Technical Director and Visual Effects Supervisor for Ralph Lauren’s virtual runway campaign, which premiered at NYC’s Fashion Week and in The New York Times. The video was also screened in Times Square on the Jumbotron! I designed and built a chroma key stage as well as a modified chroma key treadmill for the models’ virtual catwalk. We used a motion control camera system to repeat steady camera movements around the models, which were then tracked using markers on the studio wall in the background to marry the footage with the virtual background in 3D space. We recorded the z-space vectors and imported them into Maya to create rough keyframes for the movement, which later had to be hand-refined. I interfaced with Ralph Lauren’s creative team to ensure we met their branding guide throughout the entire process.
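The exact handoff into Maya isn’t spelled out above, so this is only a hypothetical sketch of that step: reading per-frame camera positions from a CSV export and keying them onto a Maya camera with maya.cmds (it runs inside Maya’s Python interpreter). The node name, filename, and CSV columns are all assumptions.

```python
# Hypothetical sketch (runs inside Maya): key exported motion-control camera
# positions onto a Maya camera. CSV columns assumed: frame, tx, ty, tz.
import csv
import maya.cmds as cmds

CAMERA = "shotCamera1"  # assumed node name

with open("moco_track.csv", newline="") as f:  # assumed export file
    for row in csv.DictReader(f):
        frame = int(row["frame"])
        for attr in ("tx", "ty", "tz"):
            cmds.setKeyframe(CAMERA, attribute=attr, time=frame,
                             value=float(row[attr]))
```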
The project required a large VFX team of rotoscoping artists, compositors, 3D asset artists, graphic artists, motion control technicians, camera operators, lighting and grip technicians, and an army of assistants. It was the largest technical project I’ve ever supervised for the film industry and I enjoyed it immensely.
Skillshare Course - 3,968 Students and Counting!
I co-created and co-hosted a Skillshare course with documentary filmmaker Danya Abt called How to Make a Character Documentary. We wrote the scripts and lesson plans and supported our students in their coursework. The project was a ton of work but a great introduction to creating and teaching an online course, with a diverse set of lessons on how to create a film from scratch. I loved making the course and plan on creating and hosting more in the near future.
Lenny Cooke - A VFX Experiment in Time Travel
I was hired by AST Studios and once again had the opportunity to collaborate with VFX artist and supervisor Adam Teninbaum on Benny and Josh Safdie’s documentary Lenny Cooke. The directing duo wanted to place the contemporary Lenny Cooke into a segment of miniDV archival footage, in a scene where he has a conversation with his younger self. We shot the sequence of Lenny on a chroma green stage in Brooklyn, where I used over sixty thousand watts of lighting to simulate the hard sun and a further ten thousand watts color-corrected to 9,000 kelvin to simulate the blue spill of the sky. Although we used a RED camera to capture Lenny in detail, the image was then decimated to marry it with the miniDV archival.