Commit 3ccbc979 authored by Mohammad Akhlaghi's avatar Mohammad Akhlaghi

Segmentation maps from degraded HST image

Until now, the segmentation maps were created from the original
resolution HST image, then warped to the grid of MUSE. However, the
mixed pixels that were created would bias the position measurements of
the objects, so now, we are using NoiseChisel on the degraded HST
image. But since there is practically no noise left in those images,
we use very lax NoiseChisel parameters. Also, the Sky value of the HST
images is subtracted before convolution.
parent 7b6f75d4
@@ -24,72 +24,56 @@
# One pixel kernel
# ----------------
# This kernel is created because convolving with it is, in practice,
# equivalent to no convolution in NoiseChisel.
segdir = $(BDIR)/seg-maps
onepkernel = $(segdir)/one-pix-kernel.fits
$(onepkernel): | $(segdir)
echo "1" > $(segdir)/one-pix-kernel.txt
astconvertt $(segdir)/one-pix-kernel.txt -o$@
rm $(segdir)/one-pix-kernel.txt
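The reason a single-pixel kernel disables convolution can be checked with a toy example. This is a minimal plain-Python sketch (not Gnuastro), assuming a naive valid-mode 2D convolution:

```python
# Toy check that convolving with the single-pixel kernel [[1]] is the
# identity, which is why feeding this kernel to NoiseChisel
# effectively disables its convolution step. Plain Python, not
# Gnuastro; convolve2d is a naive "valid"-mode convolution.
def convolve2d(img, ker):
    kh, kw = len(ker), len(ker[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(ker[i][j] * img[y + i][x + j]
                 for i in range(kh) for j in range(kw))
             for x in range(ow)] for y in range(oh)]

img = [[1.5, 2.0], [3.0, 4.5]]
print(convolve2d(img, [[1.0]]) == img)  # True
```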
# Create the segmentation map
# ---------------------------
# The first thing we need to do is to create a segmentation map that
# will be fed into Gnuastro's MakeCatalog to generate the final
# catalog. Unfortunately as it currently stands, the MUSE-generated
# broad-band image has too many artifacts. So if we want to do
# detection over it, we are going to miss a lot of the fainter
# objects. The convolved and scaled HST image also has practically no
# noise left (because of the huge kernel it was convolved with).
# Thus, the only image we can use is the initial HST image. We will
# run NoiseChisel over it, resample the object labels to the MUSE
# pixel grid, remove all pixels with a floating point (where the
# object did not fully cover a resampled pixel) and use that as the
# object labels to generate a catalog.
# Some pixels on the circumference of the resampled labels will still
# remain with the wrong label, but they are identical for the HST and
# MUSE images, and since they are on the circumference, they do not
# have any major flux in them. Ideally these segmentation maps should
# also be convolved with the MUSE PSF, but we aren't doing that here
# since it is mostly irrelevant for this study as discussed below:
# broad-band image has too many artifacts at low surface
# brightnesses. So if we want to do detection over it, we are going to
# miss a lot of the fainter objects. Thus, the convolved and scaled
# HST image is our only choice.
# In the end, the exact object detection is not a major issue here: we
# want a fixed set of pixels to measure/calibrate the magnitudes, so
# the issues above cause no harm in our analysis. By using a higher
# resolution, we do miss the wings of large objects, but on the other
# hand, we are able to check the calibration of fainter objects that
# are completely lost in the lower resolution.
# The cutouts and catalogs will end with `-m.fits' or `-h.fits' for
# MUSE and HST data respectively. To make the dependency management
# easier, we will follow the same convention here, but since they are
# identical in the case of segmentation maps, we will use symbolic
# links. The actual segmentation map will be the `-h.fits' file, but
# the `-m.fits' file will be a link to it.
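The naming convention above (one real `-h.fits' file, with `-m.fits' as a symbolic link to it) can be sketched in plain Python; the file name here is made up for illustration:

```python
# Sketch of the segmentation-map naming convention: the actual map is
# the `-h.fits' file and the `-m.fits' name is only a symbolic link.
# File names are hypothetical; no real FITS data is involved.
import os
import tempfile

d = tempfile.mkdtemp()
h = os.path.join(d, "udf1-f606w-h.fits")   # the real segmentation map
m = os.path.join(d, "udf1-f606w-m.fits")   # link for dependency management
open(h, "wb").close()                      # empty stand-in for the map
os.symlink(os.path.basename(h), m)         # relative link inside the directory
print(os.path.islink(m) and os.path.samefile(m, h))  # True
```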
segdir = $(BDIR)/seg-maps
# However, the HST image doesn't have much noise left (because of the
# huge kernel). So we will be convolving it with a 1 pixel kernel
# (effectively no convolution), and then using much looser
# NoiseChisel parameters to give a reasonable result.
segments = $(foreach uid, 1 2 3 4 5 6 7 8 9 10, \
$(foreach f, $(filters), $(segdir)/udf$(uid)-$(f).fits) )
$(segments): $(segdir)/%.fits: $(hdegdir)/%-o.fits \
$(hdegdir)/pix-res-scale.txt $(noisechisel) $(imgwarp) \
$(arithmetic) | $(segdir)
# Generate a segmentation map in the high resolution HST image.
astnoisechisel $< -o$(@D)/$*-nc.fits
# Warp it into the MUSE pixel grid in double type, then
# multiply the pixels by the second power of the scale factor
# so they get the same original values.
sf=$$(cat $(hdegdir)/pix-res-scale.txt); \
astimgwarp $(@D)/$*-nc.fits -h1 --scale=$$sf --doubletype \
-o$(@D)/$*-wdi.fits; \
astarithmetic $(@D)/$*-wdi.fits $$sf $$sf "*" "*" -o$(@D)/$*-wd.fits
rm $(@D)/$*-wdi.fits
# Convert the warped image to long type for easy comparison,
# then remove all pixels that differ between the double and
# long type images.
astarithmetic $(@D)/$*-wd.fits 0 + --type=long -o$(@D)/$*-wl.fits
astarithmetic $(@D)/$*-wl.fits \
$(@D)/$*-wl.fits $(@D)/$*-wd.fits neq \
0 where --type=long -o$@
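The double-vs-long comparison above can be illustrated with a toy stand-in for the astarithmetic calls (plain Python, not Gnuastro): a warped label pixel that was only partially covered by an object ends up with a fractional value, so truncating to integer and comparing flags exactly the mixed pixels, which are then zeroed.

```python
# Toy illustration of the mixed-pixel removal step: after warping, a
# label pixel not fully covered by one object has a fractional value;
# casting to integer (like --type=long) and comparing against the
# double image identifies those pixels, which are set to zero.
def remove_mixed_pixels(warped_double):
    """warped_double: 2D list of floats (warped label image)."""
    out = []
    for row in warped_double:
        out_row = []
        for v in row:
            as_long = int(v)  # truncation, like the long-type conversion
            # keep only pixels where double == long (fully covered)
            out_row.append(as_long if as_long == v else 0)
        out.append(out_row)
    return out

warped = [[1.0, 1.0, 0.6],   # 0.6: pixel only partly covered by label 1
          [1.0, 2.0, 0.0]]
print(remove_mixed_pixels(warped))  # [[1, 1, 0], [1, 2, 0]]
```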
$(segments): $(segdir)/%.fits: $(cutdir)/%-h.fits $(onepkernel) \
$(hdegdir)/pix-res-scale.txt $(noisechisel) | $(segdir)
# Generate a segmentation map on the convolved and rescaled
# HST image.
astnoisechisel $< -o$(@D)/$*-nc.fits --kernel=$(onepkernel) \
--minbfrac=0.0 --minmodeq=0.3 --qthresh=0.4 \
--dthresh=0.8 --detsnminarea=5 --minnumfalse=50 \
--segquant=0.5 --gthresh=1e10 --objbordersn=1e10
# Make the clumps image an objects image. Note that because we
# disabled growth and object separation, each "object" only
# has one clump. So simply setting all the non-1 valued pixels
# in the objects image to zero, will do the job for us.
astarithmetic $(@D)/$*-nc.fits $(@D)/$*-nc.fits 1 neq 0 where \
-h1 -h2 -o$@ --type=long
# Clean up.
rm $(@D)/$*-nc.fits $(@D)/$*-wd.fits $(@D)/$*-wl.fits
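The clumps-to-objects conversion in the astarithmetic call above can be sketched in plain Python (a conceptual stand-in, not Gnuastro): with growth and object separation disabled each object hosts exactly one clump, so zeroing every object pixel whose clump label is not 1 yields a usable object label image.

```python
# Sketch of the clumps-to-objects step: `objects' and `clumps' stand
# in for the two extensions of the NoiseChisel output; wherever the
# clump label differs from 1, the object label is set to zero.
def clumps_to_objects(objects, clumps):
    return [[o if c == 1 else 0 for o, c in zip(orow, crow)]
            for orow, crow in zip(objects, clumps)]

objects = [[5, 5, 5], [7, 7, 0]]
clumps  = [[1, 1, 0], [1, 0, 0]]
print(clumps_to_objects(objects, clumps))  # [[5, 5, 0], [7, 0, 0]]
```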
@@ -373,4 +373,16 @@ $(kernels): $(kerneldir)/udf%.fits: $(mpsfdir)/udf%.fits \
hst-convolved = $(foreach uid, 1 2 3 4 5 6 7 8 9 10, \
$(foreach f, $(filters), $(hdegdir)/udf$(uid)-$(f)-c.fits) )
$(hst-convolved): $(hdegdir)/%-c.fits: $(hdegdir)/%-o.fits $(kerneldir)/%.fits
# We want to subtract the Sky value, so run NoiseChisel first.
astnoisechisel $< -o$(hdegdir)/$*-nc.fits
# Subtract the Sky value.
astarithmetic $(hdegdir)/$*-nc.fits $(hdegdir)/$*-nc.fits - \
-h0 -h3 -o$(hdegdir)/$*-ss.fits
# Convolve the Sky subtracted image.
astconvolve --spatial $(hdegdir)/$*-ss.fits \
            --kernel=$(kerneldir)/$*.fits -o$@ -N1
# Clean up
rm $(hdegdir)/$*-nc.fits $(hdegdir)/$*-ss.fits
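The order of operations in the recipe above (estimate the Sky, subtract it, only then convolve) matters because the huge kernel would otherwise smear the background offset into the object wings. A minimal plain-Python sketch with made-up data, using a median as a crude stand-in for NoiseChisel's Sky estimate:

```python
# Toy sketch of Sky subtraction before convolution (not Gnuastro):
# estimate the background level, remove it, and only afterwards
# convolve, so the offset is not mixed into the convolved values.
import statistics

image = [10.2, 10.1, 9.9, 10.0, 55.0, 10.1]  # one "row": sky ~10, one source
sky = statistics.median(image)               # crude stand-in for the Sky value
subtracted = [v - sky for v in image]        # source now sits on a ~0 background
print(round(sky, 6), round(subtracted[4], 6))
```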