Wednesday, 13 August 2008


Epson flatbed film scanners (why improving them is not as simple as it looks)

I've got a couple of these scanners (4870 and 3200) which I use for scanning 4x5 film and, more recently, 6x9 and 6x12 120 film. While they're quite good up to (somewhere around) 2400 dpi, I don't think many people would argue that they perform beyond this. The wise (in my opinion) are content with 2400 or less.

An area which is hotly contested is the benefit of adjusting the height of the film holders to obtain better focus, and of course I've tried that too. I performed some careful measurements and found that a small raise in my film holder position improved my focus. Importantly (we'll see why in a sec), I mainly use black and white 4x5 negatives, with the occasional use of colour.

It was when scanning a 4x5 colour slide recently that I made an interesting discovery, and I've come to the (more or less final) conclusion that while there may be gains to be had in focus, other problems will be introduced along the way.

Firstly, I discovered this issue by observing a scan of this image, which I took earlier this year at the beginning of spring. It's taken with my 4x5 camera, with a coated Fujinon 90mm f8 lens. The exposure was made on Fuji Provia RDP-III.

Looking around the image I noticed that there was some strong chromatic aberration appearing on the scans when magnified, as you can see to the left here. This is what I was seeing when magnifying a 1200dpi scan (it requires less magnification when scanning at 1600, 2400 or 3200). Certainly it was sticking out like "prawn's eyes" against the high contrast background of snow.

My first thought was "shit, is my lens this bad?"

So I whipped out my 10x loupe and couldn't see a trace of it on the film. I split the channels to see what was there for each of the R, G and B components. And, well, I found something I was not expecting.

There is distinct spatial movement between the pixels for R G and B!


Above is an animated GIF of each of the colour channels separated. If you wish to look at one of your own images more easily, try cycling through the colour channels (in Photoshop, for instance, just press Ctrl and cycle through the numbers 1, 2 and 3 on your keyboard).
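For those who'd rather quantify the offset than eyeball it, here's a minimal sketch of my own (nothing to do with the scanner software) that brute-forces the horizontal shift between two channels with NumPy:

```python
import numpy as np

def channel_shift(ref, ch, max_shift=5):
    """Estimate the horizontal offset (in pixels) of channel `ch`
    relative to `ref` by brute-force search over small shifts."""
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        # crop the borders so np.roll's wrap-around doesn't pollute the error
        err = np.abs(ref[:, max_shift:-max_shift].astype(float)
                     - np.roll(ch, s, axis=1)[:, max_shift:-max_shift]).mean()
        if err < best_err:
            best, best_err = s, err
    return best

# Synthetic example: a vertical edge, with the "blue" channel displaced 2 px
edge = np.zeros((50, 50)); edge[:, 25:] = 255
blue = np.roll(edge, 2, axis=1)
print(channel_shift(edge, blue))   # -2: blue sits 2 px to the right of green
```

On a real scan you'd run this over small crops from the left, center and right of the frame; per the observations above, each region should report a different shift.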

Looking around the scan I also found that the effect was not evenly distributed: the left side migrates left towards the outside, the center just seems to grow mildly around the axis, and the right side moves to the right.




What is causing this?

Well, really I don't know, but my working theory is that the difficulty is created by two physical aspects of the scanner:
  1. the scanner head is actually not exactly the same size as the image that is scanned, meaning that light from the film travels at different angles to the sensor (see figure to left)
  2. the sensor is an evenly spaced repeating array of RGB points.
The scanner in fact has an array of 3 distinct sensors placed in separate (although adjacent) locations: a red, a green and a blue sensor. So, the data from these adjacent physical positions must then be mapped into a single physical space to create a single RGB pixel. This is like the demosaicing that is done on Bayer arrays in digital cameras.

Now the next part of the equation comes from the fact that the light from the light source does not reach the scan head at the same angle across its width. This creates a complex problem for remapping these samples as pixels, because if you alter the distance between the film and the sensor, the angle will alter at the edges. So while you might perhaps be able to improve the focus, you will alter the physical angles, and thus the physical locations of where R, G and B land.
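A toy calculation shows why raising the film moves off-axis points but leaves the center alone. This is only a simple pinhole-style reduction model with made-up distances (the scanner's real optics are unknown to me), but the geometry is the point:

```python
# Pinhole approximation: a point x mm off-axis, at film-to-lens distance d,
# projects onto the sensor at x * f / d. Raising the film changes d, which
# moves every off-axis point -- but a point on the axis (x = 0) never moves.
def sensor_position(x_mm, film_distance_mm, focal_mm=50.0):
    # focal_mm and the distances below are assumed, illustrative values
    return x_mm * focal_mm / film_distance_mm

nominal, raised = 100.0, 101.0   # film holder raised by an assumed 1 mm
for x in (0.0, 20.0, 60.0):      # center, mid-field, near the film edge
    shift = sensor_position(x, nominal) - sensor_position(x, raised)
    print(f"point at {x:4.0f} mm moves {shift:.3f} mm on the sensor")
```

The printed shifts grow linearly with distance from the axis (the edge point moves three times as far as the mid-field point), which matches the observation that the aberration worsens towards the sides of the frame.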

I'd have thought that Epson would have accounted for this in their spatial remapping, but they must have decided that the calculation was too complex. I scanned the following to demonstrate just how much angle we're talking about here. Notice how the inside of these knobs (three identical three-dimensional objects) shows that the scanner is viewing them side-on at the edge of the scan area, and from directly below in the center.

Normally you'd not see this when scanning a flat (two-dimensional) object like a piece of paper. I think it's pretty obvious here that the angle of view changes as the distance from the center increases, and, importantly, the effect is not uniform: it differs from location to location.

This means it's not easy to fix by just nudging pixels, because you'd need to nudge the left side differently to the right side, and alter the amount as you move outward in the image. Worse, if you then change the height of the film from the standard location (where perhaps the design is intended to converge the points) to any 'focus sweet spot', you'll be increasing your aberration effects.
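To make the "nudging" problem concrete, any correction would have to look something like the sketch below. The linear shift profile is purely my assumption for illustration; a real profile would have to be calibrated from a test target, and per the observations above it wouldn't even be symmetrical:

```python
import numpy as np

def variable_shift(channel, max_shift_px):
    """Shift each column of a channel horizontally by an amount growing
    linearly from 0 at the center to +/- max_shift_px at the edges.
    Nearest-pixel only; a real correction would interpolate sub-pixel."""
    h, w = channel.shape
    out = np.zeros_like(channel)
    for col in range(w):
        frac = (col - w / 2) / (w / 2)   # -1 at left edge, +1 at right
        src = int(round(col - frac * max_shift_px))
        if 0 <= src < w:
            out[:, col] = channel[:, src]
    return out
```

You'd apply something like this with opposite signs to the red and blue channels and leave green alone; and note that a single max_shift_px still can't capture the non-uniformity described above, which is rather the point.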

I have actually adjusted my focus with stand-offs attached under the film holders to raise them. I tried removing them and found that focus was generally worse, while the above RGB effect was worse in some areas and better in others ... you win some, you lose some.

Luckily for me, none of this applies when using only black and white. I've found that putting the scanner into black and white mode uses only the green layer, so no remapping of RGB pixel locations is needed. So this scanner makes a nice, well priced black and white film scanner!
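If you want the same behaviour by hand from a colour scan you've already made, pulling out the green channel yourself is trivial. This assumes the scan has been loaded as an H x W x 3 NumPy array (e.g. via Pillow):

```python
import numpy as np

def green_as_grayscale(rgb):
    """Return the green channel of an H x W x 3 scan as a grayscale
    image, sidestepping the RGB registration problem entirely."""
    return rgb[:, :, 1].copy()
```

Green is the usual choice because it's the middle of the three sensor lines and (as above) the layer the scanner itself uses for black and white.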

But in terms of post-construction adjustments to attempt to make this scanner better, it seems to me that each individual case needs to be examined carefully for focus AND this RGB effect. Perhaps we're stuck with the system as it is for colour, and altering film height to find a focus sweet spot remains mainly a benefit for black and white film.

Anyway, I would love to hear from owners of later models (like the V750) to see if this issue remains there too!


Tuesday, 12 August 2008

Provia v 160ProS

Comparisons are often sought between slide and negative film, perhaps especially by landscape photographers. Personally, for quite some years I'd mainly tended towards slide for making my "good work" and used negative for snapshots. Just recently I decided to stop speculating and just load both into my film holders and toddle out to see what I could see. I went to a local waterfall and took this image set.

I decided to make the experiment "even" by having the exposures set the same and altering nothing except which film I exposed. For interest, the lens was a Fujinon 90mm f8 and I set it to f16 for the images. I exposed both at the same shutter speed (which means that the negative got a little more exposure than rated, as it's 160 and the Provia is 100). The Provia looked beautiful as soon as I saw it at the lab (although, as I expected, the shadow in the lower left was annoying). The day was generally fine with some fast moving small clouds (I'm on the edge of a plateau here, at an altitude of 900 meters or so).
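For the record, the extra exposure the 160 negative received at those settings works out like this:

```python
from math import log2

# Exposure was metered for ISO 100 (the Provia), so the ISO 160 Pro160S
# receives more light than its rating asks for, by log2(160/100) stops:
print(f"{log2(160 / 100):.2f} stops over rated exposure")   # 0.68 stops
```

About two-thirds of a stop, well within a colour negative's latitude.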

Anyway, to the images. Clearly the two films scan differently, as you can see from the image at left. I scanned both with my scanner software (Epson's scanning software) with all settings linear, both for slide and negative (clearly the scanner is doing some mask removal there for me). Anyway, after setting black and white points, I've left them basically as they came out of the scanner.

Now, looking at the images to the left, the first thing that strikes me (and struck me at the time) is that there are some differences in colour cast between them. Given the nature of the two media (slide being clear and negative having many and varied shades of orange mask to remove) I think they're remarkably similar.

Below is a detail segment screen snapshot of an area in direct sunlight, with some (not very deep) shadows to see if there was much difference.

Looking at these images on my scanner when I first scanned them, I found that while there was grain visible on the negative, it seemed sharper to me. I was certainly not expecting that! So I scanned the Provia again and it was the same. I've put two segments below. Check out the definition of the railing and the tree tops against the sky.

First the overview showing where the sample was taken:

then the samples:

Provia 1200 dpi segment

Pro160S 1200dpi segment

Now, what's going on here? I don't know, but I'd sure like to. So I started looking at other scans I'd made on this and other scanners (also looking at 35mm film, which has sharper lenses (in theory) per mm of film), and I found the same "look" appearing on all of the slides: a sort of softness at the limits of the scanner's ability. I didn't have other side-by-sides to make clear cut comparisons, but it seemed to me that this same 'difference' above was appearing in my images.

So, looking at other images I've scanned it seems to back up this finding.

So this leads me to ask why?

I've got a few theories which I'm tossing around, but I don't know why the negative just looks sharper! One thing which comes to mind is something called the Callier effect. Now, normally this is something associated with enlargers. My thoughts are something like this: perhaps something is causing scattering of the light, and the scattering is greater in the slide than in the negative. If that were so, it might look like this. This would also perhaps be observably greater with flatbeds (where the sensor is further away and influenced more by side-scattered light) than on a drum scanner (where there is only one sensor). I sure don't know, but it's an interesting observation for me. When I get the chance I'd love to get both of these drum scanned and make the comparison that way.

Anyway, the result of this (and me looking at my scans) was enough to tip me more towards using negative than positive from now on.

Moving on to the 'colour rendition', I thought that I'd put up my digital reference. This was taken with a 10D using RAW, and provided the light metering that I used for all the exposures. The image below is a straight conversion from the RAW with no gamma or curves applied.

At first glance it looks to me that it sits between both of these. Sadly there was not a cloud in sight in this image but, forgetting the sky, the rocks, water and greens of the foliage look more like the negative to me than the slide. Your interpretation (and monitor) may vary. I've placed below a version of the above with a bit more contrast added.

So, three photographs, and three colour versions. People say that DSLRs are very neutral in colour response; if that's true then both the films above are "out" by some margin. This means that if you're using a slide as a colour reference for your negative, you'd better have a well colour balanced light table to view it, and have your monitor set up well for your Photoshop work.

Personally I feel that from here it all comes down to taste; this is, after all, a pleasing art rather than the scientific application of colorimetry.


Friday, 8 August 2008

are Nikon crazy?

I was just reading of the release of Nikon's newest Coolpix camera, the P6000, on DPReview. What a disappointment; it leaves me wondering if Nikon are going nuts, or just punishing themselves. I mean, really: from the company which brought out such amazing prosumer cameras in the past, so much R&D clearly invested in this camera, yet they cripple it with a tiny sensor.

Now, it's true that many folk wouldn't know the difference between a sensor and a lens coating, but the folks who buy cameras like the compact S series normally do.

Nikon are trying to market this camera to the "prosumer" market, and indeed it comes from a long line of excellent cameras like the Coolpix 950, 990 and 5000, which were highly awarded and recognized as top quality photographic tools back in a time when digital had to struggle to be taken seriously.

These early cameras provided useful features to the semi-professional and advanced amateur user, like Nikon flash system integration, fully controllable exposure, manual focus control, and attachment of accessories such as remote shutter releases with interval capture controls and microscope adapters; the 5000 even introduced hot shoe compatibility with the Nikon Speedlight system.

These were serious photographic tools aimed at professionals or advanced photographic users who have an understanding of photography or a specific application for the camera.

From the first Coolpix 900 the series improved with every release, until the 5000, from which point Nikon have not only dropped the ball, they seem to have given up the game entirely.

For example, let's look at the sensors (the most critical part of the camera aside from the lens). The size of each photosite on the sensor chip is critical to giving a good result. The smaller each "pixel" is, the less light it can receive, and therefore the worse the signal to noise ratio.

Looking at the graph to the left, you can see that pixel sizes remained reasonably steady up to the CP5000, but the pixel size in this new model is about 1/4 of the size of the pixels in the older models.

Why is this so?

Perhaps because they are cutting costs by putting smaller and smaller sensors in the camera and then just subdividing that sensor into more pixels. It's probably cheaper to etch more photosites into a smaller chip than to give you a bigger chip.

To make it clearer what I mean, I've drawn this phenomenon to scale here. You can see that with previous models they've increased the sensor chip size as they've increased the number of pixels, thus the pixel size in the above diagram remained more or less constant until the time of the Coolpix 5000.

However, you can see here that the latest offering is a reduced sensor chip size with a massively increased megapixel value.
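To put rough numbers on this, here's a back-of-envelope calculation. The sensor widths are my approximations of the standard 1/2" and 1/1.7" formats, and the megapixel counts are rounded, so treat the figures as illustrative:

```python
def pixel_pitch_um(sensor_width_mm, megapixels, aspect=4 / 3):
    """Approximate pixel pitch in microns for a 4:3 sensor."""
    pixels_high = (megapixels * 1e6 / aspect) ** 0.5
    pixels_wide = aspect * pixels_high
    return sensor_width_mm * 1000 / pixels_wide

# Approximate format widths: 1/2" ~ 6.4 mm, 1/1.7" ~ 7.6 mm
pitch_950 = pixel_pitch_um(6.4, 2.1)     # Coolpix 950, ~2.1 MP
pitch_p6000 = pixel_pitch_um(7.6, 13.5)  # P6000, ~13.5 MP
print(f"950: {pitch_950:.1f} um/pixel, P6000: {pitch_p6000:.1f} um/pixel")
# each 950 photosite gathers roughly 4-5x the light of a P6000 photosite
print(f"area ratio ~{(pitch_950 / pitch_p6000) ** 2:.1f}x")
```

Even with the P6000's slightly larger chip, each of its pixels has only around a quarter of the light-gathering area of the old 950's, which is exactly the "1/4 of the size" relationship in the diagram.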

Will this affect images?

Most certainly it will!

For example, on my page "megapixel madness" I compare my old 1999 Coolpix with a later model Canon which has a sensor about the same size as that of the P6000 above (which I'm saying will not be up to the standard of the older Coolpix cameras). I could take relatively long night exposures like this:

long exposure on a bigger pixel sensor

which were very clear and tidy images. Have a look at the differences below; the image from the 1999 camera is much clearer than the segment from the small sensor camera.

1999 Coolpix 950 camera
segment. Note how clear the dark areas are around the bushes ... apart from what appears up in the sky (where there is only blue signal) there is almost no noise at all. This was also an 8 second night time exposure, with the camera pushed to its highest ISO (320 in this case).

2005 model small sensor camera
segment, scaled. Yet this camera's image sucks: obvious peppery looking grain everywhere, and colour blotchiness, despite a much shorter exposure time and a much lower sensitivity of ISO 100.

There is every reason to expect that hands-on testing of the new Nikon camera will show it to be just as bad as the small sensor image above. Nikon seem happy to offer big sensors in their bottom range DSLRs (like the D40), but offer up small chips in similarly priced compacts like the P6000 and call it "prosumer".

Frankly it's an insult to any experienced camera user, and no professional would be interested in touching it. It seems that Nikon have accepted that Canon's G9 is the serious choice, and they've just given up.

The light at the end of the tunnel seems to be coming from Olympus / Panasonic, with their announcement of a new series of compact cameras based around the bigger 4/3 chips. Very exciting. These cameras may well match the image quality offered by the Sigma DP-1 (which uses a bigger sensor chip too). That camera (despite being slammed by some critics) seems to be proving that it can make professionals (who also use DSLRs) happy with its image quality, as found for instance here.

So, I hope that Olympus make a good go of their 4/3 sensor Micro cameras and make a killing in what was once the leading market in digital cameras, and arguably the market which brought digital cameras to prominence: the prosumer compact camera.

references and notes

Clark, R.N., Does Pixel Size Matter
Clark, R.N., Digital Signal to Noise
DPReview, Compact Camera High ISO Modes
DPReview, Noise
Farrell, J., Xiao, F., Kavusi, S., Resolution and Light Sensitivity Tradeoff with Pixel Size