
Subject: "d40 vs d90 vs d7000 sensor question"
Rick_Smith (US), registered since 08th Jan 2011, 88 posts
Fri 14-Jan-11 03:01 PM

I see that the sensors in all 3 of these cameras are different, with the biggest difference seemingly coming between the D90 and the D7000.

It looks as if the sensor in the D7000 can handle almost 3 times the amount of colors of the D90 and D40. I just wanted to see if anyone had any idea how this translates to real-world picture taking / image quality.

Thanks.

  


Replies to this topic

JosephK (Seattle, WA, US), Silver Member, Nikonian since 17th Apr 2006, 4108 posts
Sat 15-Jan-11 02:07 AM

#1. "RE: d40 vs d90 vs d7000 sensor question"
In response to Reply # 0

I am not quite sure what you mean by "almost 3 times the amount of colors".

---------+---------+---------+---------+
Joseph K
Seattle, WA, USA

D200, 17-55mm f/2.8 DX, 70-200mm f/2.8 VR, 50mm f/1.4 D
18-70mm f/3.5-4.5 DX, 70-300mm f/4-5.6 ED, D70S

  


luckyphoto (Port Charlotte, US), Silver Member, Nikonian since 27th Dec 2010, 739 posts
Sun 16-Jan-11 02:03 PM

#2. "RE: d40 vs d90 vs d7000 sensor question"
In response to Reply # 0

I believe you're referring to the 14-bit versus 12-bit color depth of the D7000 (and by the way, it's a whole lot more than 3 times the color: two extra bits mean 4 times the tonal levels per channel, and 64 times the possible RGB combinations). Here's an article that shows 12-bit and 14-bit images side by side; Google "14-bit color" and you'll find other excellent articles as well.

http://photomatter.com/Reviews/NikonD300d.html
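
The arithmetic is easy to check for yourself. Here's a minimal sketch in Python, just counting levels, nothing camera-specific:

    # Levels per color channel double with every added bit.
    levels_12 = 2 ** 12   # 4,096 levels per channel (D40, D90)
    levels_14 = 2 ** 14   # 16,384 levels per channel (D7000)

    print(levels_14 / levels_12)          # 4.0  -> 4x the levels per channel
    print((levels_14 / levels_12) ** 3)   # 64.0 -> 64x the RGB combinations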

Larry

"Red is gray and yellow white, but we decide which is right
....and which is an illusion"

Moody Blues - Nights in White Satin


  


cochrun (Parker, US), registered since 16th Jan 2011, 40 posts
Wed 26-Jan-11 01:47 PM

#3. "RE: d40 vs d90 vs d7000 sensor question"
In response to Reply # 0

I'm not sure that I can explain this well, or that I should even try, but...
First, imagine going in the opposite direction: fewer bits rather than more. Take it to an extreme and imagine that you have only 2 bits to play with. What would your image look like? Not very impressive, is it? So I think we can agree that more bits are better than fewer. (You can try this in most software packages; a quick sketch is below.)
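
If you'd rather script it, here's a minimal sketch of the experiment in Python, assuming Pillow and NumPy are installed and using a made-up filename "photo.jpg" as a stand-in for any image of yours:

    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("photo.jpg").convert("RGB"))

    bits = 2
    step = 256 // (2 ** bits)          # 64: the width of each quantization bucket
    posterized = (img // step) * step  # snap every pixel to its bucket floor

    # Only 4 values per channel survive instead of 256.
    Image.fromarray(posterized.astype(np.uint8)).save("photo_2bit.jpg")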

It is also important to note that the human eye has limited capability to distinguish between differing intensities and colors. Most cameras can capture way more information than the human eye can "see". So, you don't need to display all of the information you have because your viewer can't see it anyway. So why more bits?

Now the question becomes, is there a point at which adding more bits does not improve my image? My answer is that it depends! No surprise there as that seems to be the answer to a lot of questions.

If you are always taking photos of brightly lit scenes you may never see the difference. If there is always a lot of contrast in your images you may never see the difference. But suppose you photograph a subject that has very little contrast. A histogram would show a flat line with a narrow spike. The spike represents all of the information in your image. If you have a low bit depth (fewer bits), all of your low-contrast image may be represented by a single value: a spike in the histogram with no slope or bell curve on the edges.

Now increase the bit depth (more bits) and take the same image. Without post-processing it may look the same. But now stretch the histogram and you will find that your narrow spike has some width to it. It may actually present some shape other than just a sharp vertical spike. With the increased bit depth you have been able to capture the subtle differences between very nearly identical colors or intensities. By spreading the histogram you can display these subtle differences where the human eye can see them.
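
You can simulate this with nothing but NumPy. A rough sketch: quantize a synthetic low-contrast "scene" spanning just 1% of the sensor's range at 12 and 14 bits, and count how many distinct levels are left inside the spike:

    import numpy as np

    # A low-contrast subject: values spanning only 1% of the full range.
    scene = np.linspace(0.50, 0.51, 10_000)

    def quantize(signal, bits):
        # Round to the nearest of 2**bits evenly spaced levels.
        levels = 2 ** bits
        return np.round(signal * (levels - 1)) / (levels - 1)

    for bits in (12, 14):
        q = quantize(scene, bits)
        # Stretching the histogram spreads this spike over the full range;
        # the distinct values are the tonal steps you have left to display.
        print(bits, "bits ->", np.unique(q).size, "distinct levels in the spike")

    # Roughly 42 distinct levels at 12 bits versus about 165 at 14 bits.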

You might ask where this becomes important. I submit that it becomes very important any time the information in your image is concentrated in a very narrow range of values, such as in an image of a distant galaxy.

Perhaps someone can explain this better than I can. If I have made wrong assumptions (very possible) please point them out.

As to the link to the article that showed photos of the egg and golf ball, both images showed a very wide range from black to white. If you represented all of black through medium grey by one bit and used the remaining bits to display the upper half of the histogram, I think you might begin to see the difference between the two sensors (a rough sketch of that idea is below).
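
Continuing the NumPy sketch from above, here's one illustrative way to spend the whole bit budget on the upper half of the range; the mid-grey cutoff is mine, not anything from the article:

    import numpy as np

    def quantize_upper_half(signal, bits):
        # One value for everything below medium grey...
        out = np.zeros_like(signal)
        hi = signal >= 0.5
        # ...and the full 2**bits budget for [0.5, 1.0] alone,
        # i.e. twice the tonal density of a straight linear encoding.
        levels = 2 ** bits
        t = (signal[hi] - 0.5) / 0.5
        out[hi] = 0.5 + 0.5 * np.round(t * (levels - 1)) / (levels - 1)
        return out

    # Nearly identical highlight tones separate into more distinct steps,
    # and the 12-bit / 14-bit gap becomes easier to see.
    highlights = np.linspace(0.90, 0.91, 5_000)
    for bits in (12, 14):
        print(bits, "bits ->",
              np.unique(quantize_upper_half(highlights, bits)).size)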

Dave... KD0IRS QRZ...
Castle Rock & Pacific; www.craprail.com - Parker, CO
s/v Indigo Moon; www.flickr.com/photos/cochrun/ - Kemah, TX

  


    
Crowndog (US), Silver Member, Nikonian since 17th Jun 2011, 4 posts
Tue 21-Jun-11 06:29 PM (edited Tue 21-Jun-11 06:31 PM by Crowndog)

#4. "RE: d40 vs d90 vs d7000 sensor question"
In response to Reply # 3

If I may chime in. In a perfect world, if we had a perfect lens with a transfer characteristic where "what went in came out," we might get true 14-bit information depth. This would be important in cases where we play with color temperature and the like, editing away at the color saturation and hue levels with our software; in other words, we would need the data in order to explore those realms. In the REAL world, however, our lenses are far from perfect, though they are getting better all the time. So we might imagine that, behold, Nikon comes out with an AF-RDCTP (Real Damn Close To Perfect) transfer function.

With all that being said, these camera bodies are not used only with our lenses. In medical and laboratory use (not always with lenses, by the way) and in (ssshhh, D.O.D.) applications, filtering and analysis software will use this info, trust me. Like pulling information from beneath cloud cover in a photo taken from far, far away. To sum up, there are those who would wish for 16 or 18 bit depth.


  
