KenGS wrote: ↑
Tue Sep 29, 2020 3:15 am
There are three sets of numbers involved in the imaging system. First is the well depth, which is how many electrons each pixel can store - 16000 in this case.
Then there is the Analogue-to-Digital Converter (ADC) bit depth, which for the ASI533 is 14 bits and therefore has a maximum value of 16383. The output of the ADC is the number of electrons in the pixel divided by the gain (in e-/ADU), plus the offset; strictly speaking, this is the ADU value. However, the driver usually normalizes the ADU value to a 16-bit number, multiplying by 4 in the case of a 14-bit ADC. That is where the 65000 value you are seeing comes from, since 16 bits has a maximum value of 65535. This normalized number is what the imaging software sees and is also called the ADU value.
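In code form, that arithmetic looks roughly like this (the gain and offset values here are assumptions purely for illustration; your actual driver offset depends on your settings):

```python
# Sketch of the ADU arithmetic described above. Gain (e-/ADU) and offset
# are assumed example values, not the ASI533's actual settings.
def adu_from_electrons(electrons, gain_e_per_adu, offset):
    """Raw ADC output: electrons divided by gain, plus offset."""
    return int(electrons / gain_e_per_adu + offset)

def normalize_to_16bit(adu_14bit):
    """Driver scales a 14-bit value to the 16-bit range by multiplying by 4."""
    return adu_14bit * 4

raw = adu_from_electrons(electrons=17000, gain_e_per_adu=1.0, offset=0)
raw = min(raw, 16383)           # the 14-bit ADC clips at 2**14 - 1
print(normalize_to_16bit(raw))  # 65532 -- near the ~65000 a saturated star shows
```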
In terms of how long you should expose for, that depends on what you are trying to achieve. But assuming you are going to be stacking, you want the subs to be shot-noise dominant rather than read-noise dominant for the stacking to be most effective at reducing noise. That's where the various spreadsheets come in. You want the sky background to be 3 to 10 times the read noise squared, which, depending on the gain, could cause some stars to saturate. If that is not desirable, the first option is to reduce the gain and recalculate the exposure time.
There are two places where saturation can occur. First is on the pixel itself, when it captures more electrons than it can store. The other is in the ADC when the gain is above unity, so that when the electrons on the pixel are multiplied by the gain, they exceed the maximum value of the ADC. This limits how much of the well depth of the pixel is usable.
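A quick sketch of which limit bites first (the full-well and bit-depth numbers are taken from the discussion above; the gain values are assumed examples):

```python
# Sketch: electrons usable before either the pixel or the ADC clips.
# full_well_e = 16000 and adc_bits = 14 come from the post; gains are examples.
def usable_well_depth(full_well_e, adc_bits, gain_e_per_adu, offset_adu=0):
    adc_max = 2 ** adc_bits - 1
    adc_limit_e = (adc_max - offset_adu) * gain_e_per_adu  # e- that fill the ADC
    return min(full_well_e, adc_limit_e)

print(usable_well_depth(16000, 14, 1.0))  # 16000: the pixel clips first
print(usable_well_depth(16000, 14, 0.5))  # 8191.5: above unity, the ADC clips,
                                          # so only about half the well is usable
```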
Hi Ken and thanks for the explanation. You answered some questions that I had but couldn't word very well. One was the driver normalizing the ADU from 14 bits to 16 bits. Makes sense for sure, and I had read somewhere that the conversion was going on, but I didn't know where. That's where the 65000 single-pixel star ADU comes from. Thanks.
I played around with the spreadsheet that JT sent me and, using the APT Pixel Aid, I entered values from the rocky session the other night. I used the single-pixel mode to get a rough idea of the sky background from a light frame (6000 ADU), the dark signal from a gain-mismatched dark frame (2800 ADU), and the bias signal from a dark flat frame (2800 ADU), with an exposure duration of 120 seconds. The spreadsheet calculates my sub-exposure time at a gain of 100 to be 0.23 sec with an N of 1022. Clearly I am way off somewhere, but that is another issue.
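For reference, the "sky background of 3 to 10 times the read noise squared" rule from Ken's post can be sketched in a few lines. Every number below is an assumption for illustration (the read noise and sky rate are made-up examples, not measured ASI533 values):

```python
# Rough sketch of the minimum sub length from the shot-noise-dominance rule:
# sky signal (e-) should reach factor * read_noise^2. All inputs are example
# assumptions; substitute your own measured sky rate and read noise.
def min_sub_exposure(read_noise_e, sky_rate_e_per_s, factor=10):
    target_e = factor * read_noise_e ** 2   # sky electrons needed per pixel
    return target_e / sky_rate_e_per_s      # seconds to accumulate them

# e.g. an assumed read noise of 1.5 e- and an assumed sky rate of 2 e-/pixel/s:
print(min_sub_exposure(read_noise_e=1.5, sky_rate_e_per_s=2.0))  # 11.25 s
```

The point of sanity-checking this way is that a result like 0.23 s usually means one of the inputs (often the gain or the bias subtraction) is off by a large factor.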
Is the histogram useful in any way to get a rough idea of exposure time? I know that is very rough, but in the image in the above post, the histogram range is roughly 4000-65000 and the peaks are in the first 1/4 of the left side of the graph. To me, the range suggests that the sky background is around 4000 ADU and the saturated stars are at 65000 ADU.
I know that this is a super simplistic approach, but the spreadsheet has me a bit twisted.
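One way the histogram can feed into the spreadsheet is by converting its 16-bit ADU readings back to electrons, reversing the driver's x4 normalization described above. The gain and offset below are assumed example values, not the camera's actual settings:

```python
# Sketch: turn a 16-bit histogram ADU reading back into electrons.
# gain_e_per_adu and offset_adu are illustrative assumptions.
def electrons_from_adu16(adu16, gain_e_per_adu, offset_adu):
    adu14 = adu16 / 4                        # undo the 14 -> 16 bit scaling
    return (adu14 - offset_adu) * gain_e_per_adu

# e.g. a ~4000 ADU sky background with an assumed 50 ADU offset at 1 e-/ADU:
print(electrons_from_adu16(4000, 1.0, 50))  # 950.0 e-
```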
Thanks again for the informative response!
Scopes: Explore Scientific ED102 Triplet APO, Celestron Nexstar 130 SLT.
Mounts: Celestron AVX with Orion MM Autoguider, SLT;
Binoculars: Bushnell 10X50
Stuff: ASI EAF Focus Motor, Stellarview FF/FR
Camera / Software: ASI 533 mc pro, Nikon D5300 (Ha mod), IDAS LPS D-1, Optolong L-Enhance, Astrophotography Tool, PHD2, SharpCap v3.2, StarTools 1.7.4xx alpha, Adobe Photoshop CC
Sky: Bortle 7-8
Astro Photos https://flickr.com/photos/157183480@N07 ... 7681236785