Updates for photutils (#322)
* Update photutils imports

* Update photutils citation link

* Fix photutils 2.0 deprecation removals

* Remove photutils version pins where drizzlepac is required

* Trigger execution step to verify the updates

---------

Co-authored-by: Hatice Karatay <[email protected]>
larrybradley and haticekaratay authored Nov 12, 2024
1 parent 31bdcb0 commit 9d24e65
Showing 37 changed files with 92 additions and 114 deletions.
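The core of this change set is the move from the removed `photutils.datasets.make_random_gaussians_table` / `make_gaussian_sources_image` helpers to the photutils 2.x `make_model_params` / `make_model_image` pair. What that pipeline produces can be sketched with numpy alone; the helper name below is hypothetical, the parameter ranges are copied from the notebook, and the image shape is shrunk for speed:

```python
import numpy as np

def make_gaussian_star_field(shape, n_sources, seed=12345):
    """Draw random 2D-Gaussian source parameters and render them into an image.

    A plain-numpy stand-in for photutils' make_model_params + make_model_image,
    not the photutils implementation itself.
    """
    rng = np.random.default_rng(seed)
    params = {
        "amplitude": rng.uniform(500, 30000, n_sources),
        "x_mean": rng.uniform(0, shape[1] - 1, n_sources),
        "y_mean": rng.uniform(0, shape[0] - 1, n_sources),
        "x_stddev": rng.uniform(1.05, 1.07, n_sources),
        "y_stddev": rng.uniform(1.05, 1.07, n_sources),
    }
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    image = np.zeros(shape)
    for i in range(n_sources):
        # Evaluate each elliptical Gaussian on the full pixel grid and co-add.
        image += params["amplitude"][i] * np.exp(
            -(xx - params["x_mean"][i]) ** 2 / (2 * params["x_stddev"][i] ** 2)
            - (yy - params["y_mean"][i]) ** 2 / (2 * params["y_stddev"][i] ** 2)
        )
    return params, image

params, stars = make_gaussian_star_field((128, 256), n_sources=20)
print(stars.shape)
```

photutils' real `make_model_image` additionally supports rendering each source over a small cutout (`model_shape`) and a `progress_bar` option, which keeps the notebook's 300-source, full-chip case fast.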
Original file line number Diff line number Diff line change
@@ -89,7 +89,6 @@
"- *os* for setting environment variables\n",
"- *shutil* for managing directories\n",
"- *numpy* for math and array calculations\n",
-"- *collections OrderedDict* for making dictionaries easily\n",
"- *matplotlib pyplot* for plotting\n",
"- *matplotlib.colors LogNorm* for scaling images\n",
"- *astropy.io fits* for working with FITS files\n",
@@ -110,10 +109,10 @@
"import os\n",
"import shutil\n",
"import numpy as np\n",
-"from collections import OrderedDict\n",
"import matplotlib.pyplot as plt\n",
"from astropy.io import fits\n",
-"from photutils import datasets\n",
+"from astropy.modeling.models import Gaussian2D\n",
+"from photutils.datasets import make_noise_image, make_model_params, make_model_image\n",
"from astroquery.mast import Observations\n",
"from acstools import acsccd\n",
"from acstools import acscte\n",
@@ -298,7 +297,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"First, we generate a table of random Gaussian sources of typical brightness for our 47 Tuc field with $\\mathrm{FWHM}\\sim2.5$ pixels. Because $\\mathrm{FWHM} = 2.355\\sigma$, we will generate Gaussian sources with $\\sigma \\sim 1.06$ pixels in both $x$ and $y$. "
+"First, we generate a table of random Gaussian sources of typical brightness for our 47 Tuc field with $\\mathrm{FWHM}\\sim2.5$ pixels. Because $\\mathrm{FWHM} = 2.355\\sigma$, we will generate Gaussian sources with $\\sigma \\sim 1.06$ pixels in both $x$ and $y$. We use the shape of one of the flc image SCI extensions to create the (x, y) coordinates of the sources."
]
},
{
@@ -307,27 +306,31 @@
"metadata": {},
"outputs": [],
"source": [
-"n_sources = 300\n",
-"param_ranges = [('amplitude', [500, 30000]),\n",
-"                ('x_mean', [0, 4095]),\n",
-"                ('y_mean', [0, 2047]),\n",
-"                ('x_stddev', [1.05, 1.07]),\n",
-"                ('y_stddev', [1.05, 1.07]),\n",
-"                ('theta', [0, np.pi])]\n",
+"wfc2 = fits.getdata('jd0q14ctq_flc.fits', ext=1)\n",
"\n",
-"param_ranges = OrderedDict(param_ranges)\n",
+"shape = wfc2.shape"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
+"n_sources = 300\n",
"\n",
-"sources = datasets.make_random_gaussians_table(n_sources, param_ranges, \n",
-"                                               seed=12345)\n",
+"sources = make_model_params(shape, n_sources, x_name='x_mean', y_name='y_mean',\n",
+"                            amplitude=(500, 30000), x_stddev=(1.05, 1.07), \n",
+"                            y_stddev=(1.05, 1.07), theta=(0, np.pi), seed=12345)\n",
"\n",
-"print(sources)"
+"sources"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
-"Next, we get the shape of one of the `flc` image `SCI` extensions and make an image from the table of Gaussian sources. Note that this step may take a few minutes. Finally, we run the synthetic image through a Poisson sampler in order to simulate the Poisson noise of the scene."
+"Next, we make an image from the table of Gaussian sources. Finally, we run the synthetic image through a Poisson sampler in order to simulate the Poisson noise of the scene."
]
},
{
@@ -336,11 +339,9 @@
"metadata": {},
"outputs": [],
"source": [
-"wfc2 = fits.getdata('jd0q14ctq_flc.fits', ext=1)\n",
-"\n",
-"shape = wfc2.shape\n",
-"\n",
-"synth_stars_image = datasets.make_gaussian_sources_image(shape, sources)\n",
+"model = Gaussian2D()\n",
+"synth_stars_image = make_model_image(shape, model, sources, \n",
+"                                     x_name='x_mean', y_name='y_mean', progress_bar=True)\n",
"\n",
"synth_stars_image = np.random.poisson(synth_stars_image)"
]
@@ -382,7 +383,7 @@
"ax.imshow(flc, vmin=0, vmax=200, interpolation='nearest', cmap='Greys_r', origin='lower')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -442,7 +443,7 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -722,9 +723,7 @@
{
"cell_type": "code",
"execution_count": null,
-"metadata": {
-"scrolled": true
-},
+"metadata": {},
"outputs": [],
"source": [
"flt_stars = fits.getdata('jd0q14ctq_stars_ctefmod_flt.fits', ext=1)\n",
@@ -737,15 +736,13 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
"cell_type": "code",
"execution_count": null,
-"metadata": {
-"scrolled": true
-},
+"metadata": {},
"outputs": [],
"source": [
"flc_stars = fits.getdata('jd0q14ctq_stars_ctefmod_flc.fits', ext=1)\n",
@@ -758,7 +755,7 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -829,7 +826,7 @@
"metadata": {},
"outputs": [],
"source": [
-"noise_image = datasets.make_noise_image(shape, distribution='poisson', mean=40, seed=12345)\n",
+"noise_image = make_noise_image(shape, distribution='poisson', mean=40, seed=12345)\n",
"\n",
"wfc1 += noise_image + synth_stars_image\n",
"wfc2 += noise_image + synth_stars_image\n",
@@ -860,7 +857,7 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -986,14 +983,14 @@
"rn_C = hdr['READNSEC']\n",
"rn_D = hdr['READNSED']\n",
"\n",
-"img_rn_A = datasets.make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
-"                                     mean=0., stddev=rn_A)\n",
-"img_rn_B = datasets.make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
-"                                     mean=0., stddev=rn_B)\n",
-"img_rn_C = datasets.make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
-"                                     mean=0., stddev=rn_C)\n",
-"img_rn_D = datasets.make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
-"                                     mean=0., stddev=rn_D)\n",
+"img_rn_A = make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
+"                            mean=0., stddev=rn_A)\n",
+"img_rn_B = make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
+"                            mean=0., stddev=rn_B)\n",
+"img_rn_C = make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
+"                            mean=0., stddev=rn_C)\n",
+"img_rn_D = make_noise_image((shape[0], int(shape[1]/2)), distribution='gaussian', \n",
+"                            mean=0., stddev=rn_D)\n",
"\n",
"wfc1_rn = np.hstack((img_rn_A, img_rn_B))\n",
"wfc2_rn = np.hstack((img_rn_C, img_rn_D))"
@@ -1290,7 +1287,7 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -1309,7 +1306,7 @@
" markerfacecolor='none', markeredgecolor='red', linestyle='none')\n",
"\n",
"ax.set_xlim(2000, 2800)\n",
-"ax.set_ylim(1200, 1700)"
+"ax.set_ylim(800, 1300)"
]
},
{
@@ -1365,7 +1362,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.8.12"
+"version": "3.12.7"
}
},
"nbformat": 4,
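The read-noise cells in the notebook above build one zero-mean Gaussian noise panel per WFC amplifier with `make_noise_image(..., distribution='gaussian', mean=0., stddev=rn_X)` and stitch the panels with `np.hstack`. A numpy-only sketch of that construction — the helper name and read-noise values below are placeholders, not the real `READNSEA`–`READNSED` header values:

```python
import numpy as np

def make_chip_read_noise(shape, rn_left, rn_right, seed=0):
    """Simulate a two-amplifier readout: independent zero-mean Gaussian
    noise in each half of the chip, stacked side by side."""
    rng = np.random.default_rng(seed)
    half = (shape[0], shape[1] // 2)
    img_left = rng.normal(0.0, rn_left, size=half)    # left amplifier panel
    img_right = rng.normal(0.0, rn_right, size=half)  # right amplifier panel
    return np.hstack((img_left, img_right))

# Placeholder read-noise values in electrons; the notebook reads the real
# ones from the READNSE* keywords in the image header.
wfc2_rn = make_chip_read_noise((512, 1024), rn_left=4.0, rn_right=4.5)
print(wfc2_rn.shape)
```

The full WFC chip case in the notebook is identical apart from the `(2048, 4096)` shape and the four-amplifier split across two chips.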
4 changes: 2 additions & 2 deletions notebooks/ACS/acs_cte_forward_model/requirements.txt
@@ -3,6 +3,6 @@ astropy>=5.3.3
astroquery>=0.4.6
matplotlib>=3.7.0
numpy>=1.23.4
-photutils>=1.6.0
+photutils>=2.0.2
crds>=11.17.7
-stsci.tools>=4.1.0
+stsci.tools>=4.1.0
@@ -762,7 +762,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.11.7"
+"version": "3.12.7"
}
},
"nbformat": 4,
1 change: 0 additions & 1 deletion notebooks/ACS/acs_sbc_dark_analysis/requirements.txt
@@ -3,4 +3,3 @@ astroquery>=0.4.6
drizzlepac>=3.5.1
matplotlib>=3.7.0
numpy>=1.23.4
-photutils>=1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
4 changes: 2 additions & 2 deletions notebooks/DrizzlePac/align_mosaics/align_mosaics.ipynb
@@ -17,7 +17,7 @@
"\n",
"## Learning Goals\n",
"\n",
-"By the end of this notebook tutorial, you will:\n",
+"By the end of this notebook tutorial, you will: \n",
"\n",
"- Download WFC3 UVIS & IR images with `astroquery`\n",
"- Check the active WCS (world coordinate system) solution in the FITS images\n",
@@ -1244,7 +1244,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.12.4"
+"version": "3.12.5"
}
},
"nbformat": 4,
1 change: 0 additions & 1 deletion notebooks/DrizzlePac/align_mosaics/requirements.txt
@@ -4,4 +4,3 @@ drizzlepac>=3.5.1
ipython>=8.11.0
matplotlib>=3.7.0
numpy>=1.23.4
-photutils==1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
@@ -6,4 +6,3 @@ ipython
matplotlib
numpy
jupyter
-photutils==1.12.0
@@ -63,7 +63,7 @@
"## Introduction <a id=\"intro\"></a>\n",
"[Table of Contents](#toc)\n",
"\n",
-"This notebook demonstrates aligning long exposures which have relatively few stars and a large number of cosmic rays. It is based on the example described in the ISR linked here ([ACS ISR 2015-04: Basic Use of SExtractor Catalogs With TweakReg - I](https://ui.adsabs.harvard.edu/abs/2015acs..rept....4L/abstract)), but uses a much simpler methodology.\n",
+"This notebook demonstrates aligning long exposures which have relatively few stars and a large number of cosmic rays. It is based on the example described in the ISR linked here ([ACS ISR 2015-04: Basic Use of SExtractor Catalogs With TweakReg - I](https://ui.adsabs.harvard.edu/abs/2015acs..rept....4L/abstract)), but uses a much simpler methodology. \n",
"\n",
"Rather than making use of external software (e.g. [SExtractor](http://www.astromatic.net/software/sextractor)) and going through the extra steps to create 'cosmic-ray cleaned' images for each visit, this notebook demonstrates new features in `TweakReg` designed to mitigate false detections.\n",
"\n",
@@ -1013,7 +1013,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.12.6"
+"version": "3.12.5"
}
},
"nbformat": 4,
3 changes: 1 addition & 2 deletions notebooks/DrizzlePac/align_sparse_fields/requirements.txt
@@ -5,5 +5,4 @@ drizzlepac>=3.6.2
ipython>=8.21.0
matplotlib>=3.8.2
numpy>=1.26.3
-photutils==1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
-jupyter>=1.0.0
+jupyter>=1.0.0
@@ -49,7 +49,7 @@
"source": [
"<a id=\"intro\"></a>\n",
"## Introduction\n",
-"The alignment of HST exposures is a critical step in image stacking or combination performed with software such as `AstroDrizzle`. Generally, a *relative* alignment is performed to align one or multiple images to another image that is designated as the reference image. The reference image is generally the deepest exposure and/or that covering the largest area of all the exposures. This process aligns the images to each other, but the pointing error of the observatory can still cause the images to have incorrect *absolute* astrometry. When absolute astrometry is desired, the images can be aligned to an external catalog with an absolute world coordinate system (WCS). In this example, we will provide a workflow to query catalogs such as SDSS and Gaia using the astroquery package, and then align the images to that catalog using TweakReg. \n",
+"The alignment of HST exposures is a critical step in image stacking or combination performed with software such as `AstroDrizzle`. Generally, a *relative* alignment is performed to align one or multiple images to another image that is designated as the reference image. The reference image is generally the deepest exposure and/or that covering the largest area of all the exposures. This process aligns the images to each other, but the pointing error of the observatory can still cause the images to have incorrect *absolute* astrometry. When absolute astrometry is desired, the images can be aligned to an external catalog with an absolute world coordinate system (WCS). In this example, we will provide a workflow to query catalogs such as SDSS and Gaia using the astroquery package, and then align the images to that catalog using TweakReg.\n",
"\n",
"The workflow in this notebook for aligning images to [Gaia](https://www.cosmos.esa.int/web/gaia/home) is based on [WFC3 ISR 2017-19: Aligning HST Images to Gaia: a Faster Mosaicking Workflow](https://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/wfc3/documentation/instrument-science-reports-isrs/_documents/2017/WFC3-2017-19.pdf) and contains a subset of the information and code found in [this repository](https://github.com/spacetelescope/gaia_alignment). For more information, see the notebook in that repository titled [Gaia_alignment.ipynb](https://github.com/spacetelescope/gaia_alignment/blob/master/Gaia_alignment.ipynb).\n",
"\n",
@@ -965,7 +965,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.12.6"
+"version": "3.12.5"
}
},
"nbformat": 4,
3 changes: 1 addition & 2 deletions notebooks/DrizzlePac/align_to_catalogs/requirements.txt
@@ -5,5 +5,4 @@ drizzlepac>=3.6.2
ipython>=8.21.0
matplotlib>=3.8.2
numpy>=1.26.3
-photutils==1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
-jupyter>=1.0.0
+jupyter>=1.0.0
4 changes: 2 additions & 2 deletions notebooks/DrizzlePac/drizzle_wfpc2/drizzle_wfpc2.ipynb
@@ -41,7 +41,7 @@
"source": [
"## Introduction <a id=\"intro\"></a>\n",
"\n",
-"Extra care must be taken when using `AstroDrizzle` to combine observations from detectors comprised of multiple chips of varying sensitivity. `AstroDrizzle` works with calibrated images in units of counts (electrons or Data Numbers) or count rates and not in units of flux. It assumes that all input frames can be converted to physical flux units using a single inverse-sensitivity factor, recorded in the FITS image headers as `PHOTFLAM`, and the output drizzled product simply copies the `PHOTFLAM` keyword value from the first input image. When this occurs, the inverse-sensitivity will vary across the final drizzled product, and users will need to keep track of which sources fell on which chip when doing photometry. Moreover, varying detector sensitivities will affect the cosmic-ray rejection algorithm used by `AstroDrizzle`, and this may result in the misidentification of some good pixels as cosmic rays.\n",
+"Extra care must be taken when using `AstroDrizzle` to combine observations from detectors comprised of multiple chips of varying sensitivity. `AstroDrizzle` works with calibrated images in units of counts (electrons or Data Numbers) or count rates and not in units of flux. It assumes that all input frames can be converted to physical flux units using a single inverse-sensitivity factor, recorded in the FITS image headers as `PHOTFLAM`, and the output drizzled product simply copies the `PHOTFLAM` keyword value from the first input image. When this occurs, the inverse-sensitivity will vary across the final drizzled product, and users will need to keep track of which sources fell on which chip when doing photometry. Moreover, varying detector sensitivities will affect the cosmic-ray rejection algorithm used by `AstroDrizzle`, and this may result in the misidentification of some good pixels as cosmic rays. \n",
"\n",
"This is a typical situation when drizzle-combining images from HST instruments with different chip sensitivities, e.g. Wide Field and Planetary Camera 2 (WFPC2). For more detail, see the section on [Gain Variation](http://www.stsci.edu/instruments/wfpc2/Wfpc2_dhb/wfpc2_ch53.html) under 'Position-Dependent Photometric Corrections' in the WFPC2 Data Handbook. As a result, each of the four chips requires a [unique PHOTFLAM](http://www.stsci.edu/instruments/wfpc2/Wfpc2_dhb/wfpc2_ch52.html#1933986) header keyword value. A similar situation may occur when drizzle-combining observations taken over a span of several years as detector's sensitivity declines over time, see e.g. [ACS ISR 2016-03](https://doi.org/10.3847/0004-6256/152/3/60).\n",
"\n",
@@ -530,7 +530,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.12.4"
+"version": "3.12.5"
},
"varInspector": {
"cols": {
3 changes: 1 addition & 2 deletions notebooks/DrizzlePac/drizzle_wfpc2/requirements.txt
@@ -3,11 +3,10 @@ astroquery>=0.4.6
drizzlepac>=3.5.1
matplotlib>=3.7.0
numpy>=1.23.4
-photutils==1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
stsci.image>=2.3.5
stsci.imagestats>=1.6.3
stsci.skypac>=1.0.9
stsci.stimage>=0.2.6
stsci.tools>=4.0.1
stwcs>=1.7.2
-crds
+crds
4 changes: 2 additions & 2 deletions notebooks/DrizzlePac/mask_satellite/mask_satellite.ipynb
@@ -53,7 +53,7 @@
"## Introduction <a id=\"intro_ID\"></a>\n",
"[Table of Contents](#toc)\n",
"\n",
-"Even though Hubble has a small field of view, satellites are commonly captured in images. The cosmic ray rejection algorithm in Astrodrizzle is not well suited to eliminate satellite trails, and the affected adjacent pixels that make up their wings leave ugly blemishes in stacked images. \n",
+"Even though Hubble has a small field of view, satellites are commonly captured in images. The cosmic ray rejection algorithm in Astrodrizzle is not well suited to eliminate satellite trails, and the affected adjacent pixels that make up their wings leave ugly blemishes in stacked images.\n",
"\n",
"To fix this problem, the pixels around satellite trails need to be marked as bad in the affected images. There are several ways to accomplish this goal. The ACS Team developed multiple algorithms to automatically detect and mask satellite trails. The newest is a module called `findsat_mrt` and is decribed in [ISR ACS 2022-08](https://www.stsci.edu/files/live/sites/www/files/home/hst/instrumentation/acs/documentation/instrument-science-reports-isrs/_documents/isr2208.pdf). The 'readthedocs' page can be found here: [MRT-based Satellite Trail Detection](https://acstools.readthedocs.io/en/latest/findsat_mrt.html). The second module is called `satdet` and is described in [ISR ACS 2016-01](http://www.stsci.edu/hst/acs/documents/isrs/isr1601.pdf). The 'readthedocs' page for the software can be found here: [Satellite Trails Detection](https://acstools.readthedocs.io/en/stable/satdet.html). `findsat_mrt` has the benefit of significantly improved sensitivity over `satdet` but it is more computationally demanding. \n",
"\n",
@@ -1158,7 +1158,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.12.6"
+"version": "3.12.5"
}
},
"nbformat": 4,
1 change: 0 additions & 1 deletion notebooks/DrizzlePac/mask_satellite/requirements.txt
@@ -5,7 +5,6 @@ crds>=11.17.15
drizzlepac>=3.6.2
ipython>=8.21.0
matplotlib>=3.8.2
-photutils==1.12.0 # The drizzlepac needs deprecated methods such as DAOGroup.
jupyter>=1.0.0
acstools>=3.7.1
scikit-image>=0.20.0

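For completeness, the scene noise simulation that the notebook performs around these calls — a flat Poisson sky from `make_noise_image(shape, distribution='poisson', mean=40, seed=...)` plus Poisson resampling of the noiseless star image — can also be sketched with numpy alone. The image size and star placement below are illustrative, not values from the notebook:

```python
import numpy as np

rng = np.random.default_rng(12345)
shape = (256, 256)

# Flat Poisson sky background, mean 40 counts per pixel.
sky = rng.poisson(40, size=shape).astype(float)

# Toy noiseless "star": a bright square patch standing in for the Gaussians.
stars = np.zeros(shape)
stars[100:110, 100:110] = 500.0

# Shot noise: each pixel's counts are Poisson-distributed about the model,
# mirroring the notebook's np.random.poisson(synth_stars_image) step.
stars_noisy = rng.poisson(stars).astype(float)

scene = sky + stars_noisy
print(scene.shape)
```

Pixels where the model is zero draw Poisson counts of zero, so the star-free sky statistics come entirely from the background term.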