Virtual Sydney meeting day 4

There were only two technical sessions in the ISO meeting today. A free slot opened up during the agenda planning meeting, and it was decided to place it at the end of today so we could finish early. Finish the day early, that is – we still have one more day of meetings tomorrow.

Today’s first session was on autofocus, measuring the speed and the accuracy of camera autofocus mechanisms. The project leader has been experimenting with methods to measure these things. It’s not as straightforward as you might think. We care about digital SLR cameras, where you have manual control over things like triggering the autofocus, but most cameras these days are also phones, and we have to be able to measure those too. When was the last time you manually triggered autofocus on your phone?

Right… you don’t need to, because it’s continuously refocusing. So you can’t just place a phone camera in front of a test calibration chart, defocus it, and then time how long it takes to focus, repeating 100 times to get good statistics – there’s no way to force it to defocus. The only way to change the focus is to point the camera at something a different distance away. So the project leader was running experiments in which the camera would be pointed at a distant wall, and a mechanical arm would swing a test chart in at a near distance, forcing the camera to refocus. The camera would then take a photo, and the image of the test chart would be measured to see how good the focus was – repeated hundreds of times.

He reported that while this method seemed like it should work, there was a problem. The camera typically refocuses in a fraction of a second and then takes the photo, which we should be able to analyse for defocus blur to see how good the autofocus is. The problem is that by the time the camera takes the photo, the test chart is still vibrating from its sudden swing into the field of view… so the photo has significant motion blur (several pixels) in it! This makes it very hard to isolate the defocus blur. He wasn’t sure what to do about this. I suggested changing the experimental setup to keep the nearby chart fixed, but to put an angled mirror between it and the camera, reflecting the image of a distant wall into the camera. Now the camera can focus on the wall, and the mirror can be removed quickly, forcing the camera to refocus on the near chart, which hasn’t moved – no vibration! He said that was a good idea, and he’ll try it out. There were a bunch of other technical details reported as well, which I won’t go into further.
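For a rough sense of what “measure the image to see how good the focus is” might involve, here’s a minimal sketch of one common focus metric – the variance of the image Laplacian. This is purely illustrative; it is not the metric the actual test method specifies.

```python
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of the Laplacian - a common focus metric.

    Higher values mean stronger edges, i.e. better focus.
    Illustrative sketch only, not the standard's metric.
    """
    g = gray.astype(float)
    # Discrete Laplacian via the 4-neighbour stencil on interior pixels.
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return float(lap.var())

# A crisp step edge should score higher than the same edge blurred
# into a gradual ramp.
sharp = np.zeros((64, 64))
sharp[:, 32:] = 255.0
ramp = np.cumsum(sharp, axis=1)     # smear the step into a linear ramp
ramp = ramp / ramp.max() * 255.0
```

Motion blur confounds this kind of metric in exactly the way described above: a vibrating chart smears edges, lowering the score even when the lens is perfectly focused.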

The second session was about measuring the accuracy of depth cameras – which produce images telling you the distance to points in the scene. This is a preliminary exploratory stage of what will be a new standard. The main difficulty here is that there are several very different technical approaches to making a depth camera, and a test method that will work for one of them won’t work for another. So we’re compiling a survey of what we want to measure and how we can do it for all the different types of camera. We seem to have put together an agreed list of things, and the project leader is planning to write up a first draft in time for comments and discussion at the next meeting.
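As a hedged sketch of what per-frame accuracy numbers for a depth camera could look like – assuming a flat target at a known distance, with hypothetical function and field names not taken from the draft survey:

```python
import numpy as np

def depth_error_stats(depth_mm: np.ndarray, truth_mm: float) -> dict:
    """Accuracy stats for one depth frame of a flat target at a known
    distance. Names and metrics are illustrative, not from the draft."""
    err = depth_mm.astype(float) - truth_mm
    valid = np.isfinite(err)            # many depth cameras emit NaN holes
    e = err[valid]
    return {
        "bias_mm": float(e.mean()),                  # systematic offset
        "rmse_mm": float(np.sqrt((e ** 2).mean())),  # overall error
        "fill_ratio": float(valid.mean()),           # fraction of valid pixels
    }

# Simulated frame: target at 1000 mm, 5 mm noise, a few dropout holes.
rng = np.random.default_rng(0)
frame = 1000.0 + rng.normal(0.0, 5.0, size=(240, 320))
frame[::50, ::50] = np.nan
stats = depth_error_stats(frame, 1000.0)
```

The hard part the session grappled with isn’t the statistics – it’s that producing comparable frames across time-of-flight, stereo, and structured-light cameras requires different target setups in the first place.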

This afternoon I started planning for my market stall on Sunday. I did three market days at a small local suburban market last year, but this one is a bigger market in the inner city, with many more stalls and hopefully many more customers. It’s going to be a jump up in complexity and experience level, and I have to figure out how to get all my stock and gear there in our car with my wife’s help, without hiring a larger vehicle to put it all in. This market is much closer to home, so we can make two trips, which, after test-packing the car today, I’m pretty sure we can manage. I was a bit worried about one of the items being too big to fit, but by turning it a certain way I managed to get it into the car boot and close the door.

I’m going to be having early mornings both Saturday and Sunday this weekend… for the final day of the ISO meeting, and then getting up super early on Sunday to haul gear to the market and be set up and ready to go before 8:30!

6 thoughts on “Virtual Sydney meeting day 4”

  1. “So now the camera can focus on the wall, and the mirror can be removed quickly, forcing the camera to refocus on the near chart, which hasn’t moved – no vibration!”

    So, why can’t you just *remove* the test chart and measure how the camera refocuses on the wall behind it? No vibration. Why can’t you use an electrochromic material between the distant wall and camera as your near object? Camera focuses on distant wall. Throw a switch: now there’s an electrochromic test chart in between, with zero vibration! Throw it the other way: now there isn’t!

    1. Removing the chart to force focus from near to far is also part of the test, but it also has to be tested going the other way. Performance can vary between the two directions, so you need to measure both.

      Maybe we could use an electrochromic chart… we hadn’t thought of that. It would need to be built and tested. It’s also more complicated and expensive, and given we’re developing standards to be accessible and usable as broadly as possible, that’s an important factor. We shouldn’t specify measuring equipment that is prohibitively complex or expensive.

  2. Electrochromic materials aren’t absurdly expensive these days. My quick search just now finds that a 15 cm square element costs US$25.

    Sorry, I’m a science fiction fan. We’re trained to treat all technical challenges as puzzles we’re supposed to solve.

    1. Well, there are other issues too. The test chart needs to be a certain contrast level and resolution. It pretty much needs to be *at least* 300dpi, and it has to have high contrast similar to a photographically reproduced test target. I don’t know if electrochromic materials can meet either of those requirements.

  3. Are you familiar with how haunted houses make things appear and disappear, using partially-one-way mirrors and sudden changes in lighting? I wonder if you could get the autofocus to work using a trick like that: an angled mirror like you described, but instead of moving the mirror, you simultaneously shut off the light in the perpendicular corridor and turn off the light in the parallel corridor or vice versa.

    1. Hmmm, potentially that could work, I suppose. There may be issues with rapidly switching lights on and off, and time lags of various sorts. The rig could end up quite complicated too.
