=Paper=
{{Paper
|id=Vol-2893/paper_2
|storemode=property
|title=Removal of Complex Image Distortions via Solving Integral Equations Using the "Spectral Method"
|pdfUrl=https://ceur-ws.org/Vol-2893/paper_2.pdf
|volume=Vol-2893
|authors=Valery Sizikov,Polina Loseva,Egor Medvedev,Daniil Sharifullin,Aleksandra Dovgan,Nina Rushchenko
|dblpUrl=https://dblp.org/rec/conf/micsecs/SizikovLMSDR20
}}
==Removal of Complex Image Distortions via Solving Integral Equations Using the "Spectral Method"==
Valery Sizikov, Polina Loseva, Egor Medvedev, Daniil Sharifullin, Aleksandra Dovgan, Nina Rushchenko
ITMO University, Kronverksky pr., 49, Saint-Petersburg, 197101, Russia
Abstract
In the paper, the mathematical problem of removing distortions (smearing, defocusing, noising) from object images is considered. The type of distortion (smear or defocus) is determined with the help of the developed "spectral method (rule)", and the values of the distortion parameters are estimated according to the derived original formulas. On the example of optical images of the Black Sea, the problem of determining the distortion types with subsequent estimation of the distortion parameters is considered. In addition, on the example of photographs of people, the complex case is considered for the first time when the image is simultaneously noisy, smeared, and defocused ("triple distortion"). Only after the types and parameters of the distortions have been determined are the distortions removed by solving integral equations. This reduces the error of image restoration in the inverse (ill-posed) problem. The proposed image processing technologies can enhance the resolution of optical devices (video cameras, tracking devices, and others). The results of processing the images of the Black Sea and of removing the triple distortion of kids' images with previously unknown types and parameters of distortion are presented.
Keywords
Spectral method for determining the type and parameters of distortions, complex distortion of optical images (smear, defocus, noising), point spread function (PSF), integral equations, distortion removal, MatLab
1. Introduction
One of the actual problems of optics is to obtain clear images of various objects: people, cars, the
Earth's surface, etc. with the help of optical devices for image recording (ODIR) – cameras, shooting
cameras, tracking systems, tomographs, telescopes, microscopes, and others. Clear images provide rich
information about objects and processes. ODIRs can be installed, for example, on a production con-
veyor with details, on satellites, airplanes, unmanned flying apparatus and can monitor the physical
environments with remote access using various automatic devices [1, p. 39–47].
However, the obtained images often have distortions: smear due to the movement of the object during the exposure, defocus due to an improper setting of the ODIR focus, noising by external (atmospheric) and internal (instrumental) noise, and others. Distortions can be largely removed by mathematical and computer image processing. For this purpose, a number of processing methods have been developed: image restoration via solving integral equations by stable methods, removal of noise from images by filtering methods, etc. [1–6]. However, the methods for determining the type and parameters of a distortion, as well as for treating simultaneous smear, defocus, and noising of an image (complex distortion), are not well developed.
Proceedings of the 12th Majorov International Conference on Software Engineering and Computer Systems,
December 10-11, 2020, Online & Saint Petersburg, Russia
EMAIL: sizikov2000@mail.ru (V. Sizikov); poloska97pl@gmail.com (P. Loseva); egor9721@gmail.com (E. Medvedev); dmsharifullin@gmail.com (D. Sharifullin); aleksandra-dv@yandex.ru (A. Dovgan); rushchenko@itmo.ru (N. Rushchenko)
ORCID: 0000-0002-4618-8753 (V. Sizikov); 0000-0003-2301-167X (P. Loseva); 0000-0003-4255-3095 (E. Medvedev);
0000-0002-7662-2493 (D. Sharifullin); 0000-0002-9971-8753 (A. Dovgan); 0000-0003-1230-5410 (N. Rushchenko)
© 2020 Copyright for this paper by its authors.
Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)
In this paper, the further development of the spectral method (rule) for determining the type and
parameters of a distortion is given. Moreover, the triple image distortion is considered for the first time.
Figure 1 presents three images (photographs) 434×700 pixels of the Black Sea with different PSFs
(point spread functions). This situation can occur when a survey of the Earth's surface is performed
from near space so that details about the size of a person are visible. However, such details are not
visible due to smear, defocus and noise. Mathematical processing of images is required.
Figure 1: Distorted (smeared, defocused, noisy) images of the Black Sea obtained from a satellite
From the images, it is almost impossible to determine the type of distortion (which image is smeared and which one is defocused), as well as to determine the presence of noise and the type of noise (impulsive, Gaussian, etc.). In addition, an important task is to estimate the distortion parameters: the value Δ and the angle θ of smear in the case of image smearing, as well as the size of the defocusing spot ρ or σ in the case of image defocusing. The photographs in Figure 1 were specially selected with weak (visually imperceptible) distortions to show that it is impossible to visually determine the types and parameters of distortions. However, this can be done by mathematical and computer processing of images using the spectral method.
The aim of this work is to show that it is possible to determine the type and parameters of image distortions (i.e., to estimate the PSF) based on the Fourier transform of a distorted image (by the spectral method). This can be done even in the case of complex (for example, triple) distortion; the image can then be restored by solving integral equations using the estimated PSF, which plays the role of the integral equation kernel.
2. A list and comparison of methods for determining the type and parameters of distortion
In [2, 3], [4, p. 213–220], [5], the spectral method was proposed to determine the distortion type and
the values of its parameters based on the Fourier transform (Fourier spectrum) of the distorted image.
Papers [2–5] were preceded by papers [6–8], where the Fourier spectra of distorted images were also
used, but formulas for the distortion parameters were not derived.
This paper provides a further development of the spectral method for determining the type and parameters of distortion in relation to conventional and triple distortions.
There are the following methods to determine the type of distortion (smear or defocus) and ways to
estimate the values of its parameters:
– A method for estimating the parameters Δ and θ based on the strokes in the image at rectilinear and uniform smear [1, p. 387, 394], [9, p. 100], [10]. For this method to be effective, it is necessary that the object has at least one clear point, which is transformed into a stroke with the parameters Δ and θ on the smeared image. However, such a point is often not present on the object.
– A method for evaluating the point spread function (PSF) at defocusing [1, 9, 10]. In this case, a point of the object is transformed into a diffusion circle (spot) [8, p. 100] on the defocused image, and this spot is the PSF. However, such a (bright) point is usually not present in the object and image.
– Methods of "blind" [1], [4, p. 133], [11, p. 193], [12–14] and "semi-blind" [15] deconvolution, where the PSF is determined, the true image is then computed (with regularization), and the calculation of this pair is iteratively repeated. In these methods, a functional is minimized with restrictions on the solution; a good initial approximation, a suitable number of iterations, etc., are required. These are actively developed methods, but they do not guarantee, for example, convergence to an image close to the true one.
We see that the methods listed above either lack information or are difficult to implement, and an error in determining, for example, the smear angle of only 1–2 degrees can lead to a significant error in the reconstructed image [2, 3], [4, p. 213], [5] even with a stable method (Tikhonov regularization, Wiener filtering, etc.) due to the ill-posedness of the image restoration problem [1, 4, 6, 16–18].
Let us also refer to the works [19–22] devoted to methods of determining the PSF.
In this work, in order to determine the distortion type (smear or defocus) and its parameters, we use a modification of the spectral method (rule), new in comparison with [2–5], focused on weak (visually imperceptible) distortions, and aimed not only at simple but also at triple distortions.
3. Using the spectral method to determine the image distortion type
Let g(x,y) be the intensity of the distorted image (e.g., a photograph), where x is the horizontal coordinate on the image, and y is the vertical coordinate directed downward.
Let us perform a two-dimensional Fourier transform (FT) of the distorted image g(x,y):

G(ωx, ωy) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) e^{i(ωx x + ωy y)} dx dy ,   (1)
where ωx and ωy are Fourier frequencies directed horizontally and vertically, respectively. We assume that the Fourier spectrum (1) of the image is calculated through the Discrete/Fast Fourier Transform (DFT/FFT), for example, via the m-function fftshift.m with centering within the MatLab system [11, 23]. We obtain the complex Fourier spectrum G(ωx, ωy), of which we take the modulus |G(ωx, ωy)|. Figure 2 presents the moduli of the Fourier spectra |G(ωx, ωy)| of the three images g(x,y) shown in Figure 1. The spectra in Figure 2 are significantly different, and this helps to define the distortion types. The spectra can also be used to estimate the parameters of the distortion by the spectral method (rule).
Figure 2: Moduli of spectra |G(ωx, ωy)| of the distorted images g(x,y) presented in Figure 1
The theory and practice of spectral processing of distorted images using the FT [2–9, 24] clearly show the following criteria:
• the Fourier spectrum of a smeared image has the form of almost parallel lines with an angle of inclination depending on the smear angle θ;
• the Fourier spectrum of a defocused image is a set of ellipses in the case when the PSF is a homogeneous disk; and in the case when the PSF is a Gaussian, the spectrum is also a two-dimensional Gaussian.
As a result, we clearly determine by the form of the spectra in Figure 2 that Figure 1a presents a smeared image, Figure 1b a defocused image whose PSF is a homogeneous disk, and Figure 1c a defocused image whose PSF is a Gaussian.
Let us estimate the parameters of smearing by the spectral method [2–5].
3.1. Estimates of smear parameters
On the spectrum of the smeared image (Figure 2d), we draw an axis between the central parallel lines, and also, perpendicularly to it, the ω axis; horizontally, the ωx axis; and vertically downwards, the ωy axis. Then the smear angle of the image θ, as shown in [3, 5], is determined by the formula θ = 90° − ψ, where

ψ = arctan( tan φ / r ) ,   (2)

and ψ and φ are true angles; φ = 90° − φ′, where φ′ is the measured angle between the horizontal and the ω axis. Here r = M/N, where M is the number of rows and N is the number of columns in the image g. The difference of ψ from φ is due to the difference of r from 1 (if an image is square, then ψ = φ).
According to Figure 2d, we made several geometrical measurements of the angle φ′. On average, we obtained the following values: φ′ = 61.47° and φ = 90° − φ′ = 28.53°. The values M and N are M = 434 and N = 700; therefore, r = M/N = 0.620. Using formula (2), we find on average ψ = 41.24° ± 0.35° and θ = 48.75° ± 0.40°, which is close to the exact value of the smear angle θ = 49°.
To estimate the parameter Δ, we mark on the ωx axis in Figure 2d the values of the frequencies ω1 and ωmax, the first and the last zeros of the function |G(ωx, ωy)|. The parameter Δ is equal to [5]

Δ = 2 ωmax / ω1 .   (3)

Based on several measurements of the dimensionless ratio ωmax/ω1 and formula (3), we obtain on average the value Δ = 21.4 ± 0.3, which is close to the exact value of smear Δ = 21 pixels.
We can see that the spectral method estimates rather accurately the parameters of image smear in
Figure 1a. The image reconstruction using the found parameters θ and Δ is presented below.
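Formulas (2) and (3) are straightforward to evaluate numerically. Below is a hypothetical Python sketch (ours, not the authors' MatLab code), checked against the measured values φ′ = 61.47°, M = 434, N = 700, and ωmax/ω1 ≈ 10.7 quoted above:

```python
import math

def smear_angle(phi_meas_deg, M, N):
    """Smear angle theta (degrees) from the measured spectrum angle phi',
    via formula (2): phi = 90 - phi', psi = arctan(tan(phi)/r), r = M/N,
    and then theta = 90 - psi."""
    r = M / N
    phi = math.radians(90.0 - phi_meas_deg)
    psi = math.degrees(math.atan(math.tan(phi) / r))
    return 90.0 - psi

def smear_length(omega_max_over_omega1):
    """Smear length Delta (pixels) from formula (3): Delta = 2*omega_max/omega_1."""
    return 2.0 * omega_max_over_omega1

# Numbers from the Black Sea example above
theta = smear_angle(61.47, 434, 700)   # close to the exact 49 degrees
delta = smear_length(10.7)             # close to the exact 21 pixels
```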
3.2. Estimates of defocus parameter (2 variants)
We determine by the form of the spectra in Figures 2b and 2c that the defocused images are presented
in Figures 1b and 1c. Let us estimate the defocus parameters in Figures 1b and 1c by the spectral method
(rule) [2–5].
Variant 1. The Fourier spectrum |G(ωx, ωy)| in Figure 2b in the form of ellipses indicates that the image in Figure 1b is defocused, and the PSF is a homogeneous disk of a certain radius ρ. In optics, this corresponds to the transmission of light through a thin lens [8, p. 100], [25, p. 264]. In this case, the spectrum |G(ωx, ωy)| for r ≠ 1 is a set of ellipses (see Figures 2b and 2e in the pixel region). For r = 1, as well as in the Nyquist frequency range, it is a set of circles [8], since the maximum Nyquist frequencies along the ωx and ωy axes are equal: ωmax = ωx max = ωy max = π (since Δx = Δy = 1 pixel).
The image spectrum G(ωx, ωy) is proportional to the transfer function of the optical system H(ωx, ωy), and [5, 8, 26]

H(ωx, ωy) = H(ω) = 2 J1(ρω) / (ρω) ,   ω = √(ωx² + ωy²) ,

where J1 is a first-order Bessel function of the first kind [3–5, 8]. The zeros of the Bessel function J1(ρω) are at

ρω = 0, 3.84, 7.02, 10.16, 13.32, …   (4)

These zeros correspond to the semiaxes of the ellipses in Figure 2e. We have from (4):

ρ = 3.84/ωx1 = 7.02/ωx2 = 10.16/ωx3 = 13.32/ωx4 = … ,   (5)

where ωx1, … are the values of the frequencies ωx corresponding to each zero.
Upon discretization, ωmax = π is the Nyquist frequency along both the horizontal and the vertical axes in Figure 2e. Then the frequencies ωxi, i = 1, 2, 3, 4, …, in units of ωmax (but not in pixels) are equal to

ωxi = ωxi,rel · ωmax = π (ω′xi / ω′x max) ,   i = 1, 2, 3, 4, … ,   (6)

where ω′xi and ω′x max are measured in pixels. We measure the ratio ω′x1/ω′x max in Figure 2e, calculate the frequency ω1 = π (ω′x1/ω′x max) (see (6)) and obtain, in Nyquist frequencies, ω1 = 0.553; then we calculate the defocusing parameter ρ according to (5): ρ = 3.84/ω1, and obtain finally ρ = 6.94 ± 0.2, which is close to the exact value ρ = 7 pixels.
The image reconstruction using the found parameter ρ is given below.
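This estimate is a one-liner; the sketch below (Python rather than the authors' MatLab, and the function name is ours) reproduces ρ = 3.84/ω1 for the measured ω1 = 0.553:

```python
import math

def defocus_radius(omega_x1_px, omega_xmax_px):
    """Defocus-spot radius rho (pixels) from the first spectrum zero,
    formulas (5)-(6): omega_1 = pi * (measured pixel frequency / Nyquist
    pixel frequency), then rho = 3.84 / omega_1."""
    omega1 = math.pi * (omega_x1_px / omega_xmax_px)
    return 3.84 / omega1

# Black Sea example: the measured pixel ratio corresponds to
# omega_1 = 0.553 in Nyquist units, i.e. omega_x1/omega_xmax = 0.553/pi
rho = defocus_radius(0.553 / math.pi, 1.0)  # close to the exact 7 pixels
```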
Variant 2. If during defocusing each point of the object turns into a two-dimensional Gaussian in the distorted image, the PSF will also be a Gaussian (see Figures 2c and 2f):

h(r) = (1/(2π σr²)) exp( −r²/(2σr²) ) ,   r = √(x² + y²) ,   (7)

where σr is the standard deviation (SD) of the PSF-Gaussian. The Fourier spectrum H of such a PSF and the spectrum G(ωx, ωy) of the defocused image also take the Gaussian form:

G(ωx, ωy) = G(ω) ∼ H(ω) ∼ exp( −σr² ω²/2 ) ∼ exp( −ω²/(2σω²) ) ,   ω = √(ωx² + ωy²) .   (8)
The spectrum G has the form of a spot with a monotonic decrease of brightness from the center (Figures 2c and 2f). Note that in Variant 1 (Figures 2b and 2e), there are zeros in the spectrum G, and this contributes to the determination of the parameter ρ, while in Variant 2 (Figures 2c and 2f), there are no such zeros. However, the Gaussian (8) decreases rapidly and practically vanishes at ω ≈ 3σω. Therefore, we propose the "three-sigma" rule. As follows from (8), σr = 1/σω. Thus, according to the "three-sigma" rule,

σr = 3 / (3σω) .   (9)
Let us take ωmax = π. We estimate from Figure 2f the value 3σω ≈ 1–1.15. Then, using (9), we obtain on average over several measurements σr = 2.8 ± 0.3, which is close to the exact value σr = 3 pixels. In this variant, due to the indistinctness of the boundary (where G ≈ 0), the error in determining σr is rather large, namely, (0.3/2.8) · 100% ≈ 10.7%. Nevertheless, the use of several measurements of the value 3σω allows bringing the average value of σr closer to the exact one.
We see that the spectral method (rule) allows estimating the defocusing image parameters with an
acceptable error.
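The "three-sigma" rule (9) in code, as a minimal Python sketch (our naming), using the measured value 3σω ≈ 1.07 from the example above:

```python
def gaussian_psf_sd(three_sigma_omega):
    """PSF standard deviation sigma_r from the 'three-sigma' rule (9):
    sigma_r = 3 / (3*sigma_omega), which follows from sigma_r = 1/sigma_omega
    in (8) when 3*sigma_omega is what is actually measured on the spectrum."""
    return 3.0 / three_sigma_omega

# Measured 3*sigma_omega ~ 1..1.15 on Figure 2f; the midpoint gives sigma_r ~ 2.8
sigma_r = gaussian_psf_sd(1.07)
```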
3.3. Removal of image smearing and defocusing
After determining the distortion type (smear or defocus) by the spectral method and estimating the
distortion parameters, we solve the problem of stable removal of image distortions by mathematical and
software means.
Removal of image smearing. In the case of smearing, the x-axis is directed along the smear, and the y-axis is perpendicular to it. The inverse problem of removing a uniform and rectilinear image smear is reduced to solving a one-dimensional Fredholm integral equation (IE) of the first kind of convolution type for each value of y, which plays the role of a parameter [27]:

∫_{−∞}^{∞} h(x − ξ) w_y(ξ) dξ = g_y(x) ,   (10)

where w is the true (undistorted, desired) image, g is the smeared (measured) image, and h is the kernel of the IE (or PSF), equal to [4, p. 111]

h(x) = { 1/Δ,  −Δ ≤ x ≤ 0;  0, otherwise.   (11)
A stable solution of IE (10) by the Tikhonov regularization (TR) method and the Fourier transform (FT) has the form [27]:

w_y(ξ) = (1/(2π)) ∫_{−∞}^{∞} [ H(−ω) G_y(ω) / ( |H(ω)|² + α ω^{2p} ) ] e^{−iξω} dω ,   (12)

where H(ω) and G_y(ω) are the one-dimensional FTs of the functions h(x) and g_y(x), α > 0 is the regularization parameter, and p ≥ 0 is the regularization order (usually p = 1 or 2).
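A discrete analogue of (11)–(12) fits in a few lines. The following Python/NumPy sketch (ours, not the published desmearingf.m; circular convolution on a model row is assumed) smears a synthetic signal and then restores it:

```python
import numpy as np

def tr_desmear_row(g_row, delta, alpha=1e-4, p=2):
    """Tikhonov-regularized Fourier deconvolution of one image row,
    a discrete analogue of (12) with the uniform smear PSF (11)."""
    n = g_row.size
    h = np.zeros(n)
    h[:delta] = 1.0 / delta              # discrete PSF of a smear of length delta
    H = np.fft.fft(h)
    G = np.fft.fft(g_row)
    w = np.fft.fftfreq(n) * 2 * np.pi    # discrete frequencies omega
    W = np.conj(H) * G / (np.abs(H)**2 + alpha * np.abs(w)**(2 * p))
    return np.fft.ifft(W).real

# Model example: smear a known smooth row circularly, then restore it
rng = np.random.default_rng(1)
w_true = np.convolve(rng.random(64), np.ones(5) / 5, mode="same")
delta = 7
h = np.zeros(64); h[:delta] = 1.0 / delta
g = np.fft.ifft(np.fft.fft(h) * np.fft.fft(w_true)).real  # smeared row
w_rec = tr_desmear_row(g, delta)
```

On real images the boundary conditions matter (the paper's IE is non-circular), so this sketch only illustrates the regularized division in (12).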
Figure 3a presents the result of removing the smear in the image shown in Figure 1a. The set of IEs (10) was solved by the TR–FT method according to (12) and (11) using the developed m-function desmearingf.m at α = 10⁻⁴, p = 2 (the values of α and p were chosen by matching).
Figure 3. Removal of smear/defocus. (a) image in Figure 1a after removing the smear; (b) image in
Figure 1b after removing the defocus for PSF (14); (c) image in Figure 1c after removing the defocus
for PSF (15)
In this case, the amount of smear Δ and its direction θ were estimated by the spectral method (rule) described above: Δ = 21, θ = 49° (Figure 1a). We can see that the smear is removed².
Removal of defocusing. In this case, it is necessary to solve a two-dimensional IE of convolution type:

∫_{−∞}^{∞} ∫_{−∞}^{∞} h(x − ξ, y − η) w(ξ, η) dξ dη = g(x, y)   (13)

by the TR–FT method [27]. Moreover, if the PSF is a uniform disk of radius ρ, then the kernel of the IE (PSF) is equal to

h(x, y) = { 1/(πρ²),  x² + y² ≤ ρ²;  0, otherwise,   (14)

and if the PSF is a Gaussian, then

h(x, y) = (1/(2π σr²)) e^{ −(x² + y²)/(2σr²) } .   (15)
The solution of IE (13) by the TR method and the two-dimensional FT is equal to (cf. (12)) [27]

w(ξ, η) = (1/(4π²)) ∫_{−∞}^{∞} ∫_{−∞}^{∞} [ H*(ω1, ω2) G(ω1, ω2) / ( |H(ω1, ω2)|² + α (ω1² + ω2²)^p ) ] e^{−i(ξω1 + ηω2)} dω1 dω2 .   (16)
The following algorithm describes the solution to the inverse problem of image restoration.
Algorithm. Restoration of a smeared or defocused image
Input: distorted image g(x, y)
(1) Calculate the Fourier spectrum G(ω1, ω2) = F(g(x, y)), where F is the FT
(2) Determine the distortion type (smear or defocus) by the spectrum (spectral method)
(3) Calculate the smear value Δ and angle θ, or the defocus parameter ρ or σr
(4) Construct the PSF h(x, y) as (11) in case of a smear, and as (14) or (15) in case of defocus
(5) Calculate the Fourier spectra H(ω) and G_y(ω) for smearing, or H(ω1, ω2) and G(ω1, ω2) for defocusing
(6) Choose the regularization parameter α and order p in some way
(7) Calculate the reconstructed image by the TR method: w_y(ξ) according to (12), or w(ξ, η) according to (16)
Output: the restored image w_y(ξ) in case of a smear, or w(ξ, η) in case of defocus
² IE (10) was also solved by the Wiener parametric filtering method, by the Lucy–Richardson maximum likelihood algorithm, as well as by the TR and quadrature method [1, 6, 9]. The solutions are close, except for the solution by the Lucy–Richardson algorithm (this algorithm is discussed in [4, p. 133]).
The parameters ρ in (14) and σr in (15) are determined by the spectral method according to (5) and (9), respectively. After that, we solve the two-dimensional IE (13) by the TR–FT method according to (16) using the developed m-function refocusingT.m. Figure 3b demonstrates the result of image reconstruction by solving IE (13) with PSF (14) at ρ = 7 pixels, found by the spectral method according to (5) (α = 1.2·10⁻⁵). Figure 3c presents the image reconstructed by solving IE (13) with PSF (15) at σr = 3 pixels, found by the spectral method according to (9) (α = 0.4·10⁻⁵).
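A discrete 2-D analogue of (14) and (16) can be sketched similarly. The Python/NumPy code below (ours, not the published refocusingT.m; circular convolution and a synthetic smooth test image are assumed) defocuses a model image with a disk PSF and restores it:

```python
import numpy as np

def disk_psf(shape, rho):
    """Uniform-disk PSF (14) of radius rho, origin-centered (with wraparound)
    so that FFT-based circular convolution needs no extra shift."""
    y = np.fft.fftfreq(shape[0])[:, None] * shape[0]
    x = np.fft.fftfreq(shape[1])[None, :] * shape[1]
    h = (x**2 + y**2 <= rho**2).astype(float)
    return h / h.sum()

def tr_refocus(g, h, alpha=1e-4, p=1):
    """2-D Tikhonov-regularized Fourier deconvolution, a discrete analogue of (16)."""
    H, G = np.fft.fft2(h), np.fft.fft2(g)
    w1 = np.fft.fftfreq(g.shape[0])[:, None] * 2 * np.pi
    w2 = np.fft.fftfreq(g.shape[1])[None, :] * 2 * np.pi
    W = np.conj(H) * G / (np.abs(H)**2 + alpha * (w1**2 + w2**2)**p)
    return np.fft.ifft2(W).real

# Model example: a smooth bump, defocused by a disk PSF of radius 3
iy = np.arange(32)[:, None]; ix = np.arange(32)[None, :]
w_true = np.exp(-((iy - 16)**2 + (ix - 16)**2) / 50.0)
h = disk_psf(w_true.shape, rho=3.0)
g = np.fft.ifft2(np.fft.fft2(h) * np.fft.fft2(w_true)).real  # defocused image
w_rec = tr_refocus(g, h)
```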
Figure 3 shows that smear and defocus were removed reliably enough. Furthermore, the noise was
also restored, and it can be seen that this is bipolar impulse noise [27]. Next, we perform the following
operation:
Removal of noise. Figure 4 presents the result of filtering the impulse noise by the median Tukey
filter (as shown in [10], impulse noise is best filtered by the median filter).
Figure 4. Results of removing the bipolar impulse noise by the Tukey median filter with a 3 × 3 sliding window
Figure 4 demonstrates a satisfactory result: the types and parameters of the image distortions were determined by the spectral method (see Figures 1 and 2), which allowed us to restore the distorted images with increased accuracy by solving the integral equations (Figures 3 and 4).
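The median filtering step can be illustrated with a simple model. The sketch below is a plain 3×3 sliding-window median in Python/NumPy (not the Tukey filter implementation used by the authors; for brevity, edge pixels are left unfiltered), applied to impulse-noised data:

```python
import numpy as np

def median3x3(img):
    """3x3 sliding-window median filter (edge pixels kept as-is),
    the kind of operation used here to suppress bipolar impulse noise."""
    out = img.copy()
    for j in range(1, img.shape[0] - 1):
        for i in range(1, img.shape[1] - 1):
            out[j, i] = np.median(img[j-1:j+2, i-1:i+2])
    return out

# Model example: corrupt a flat image with bipolar (salt-and-pepper) impulses
rng = np.random.default_rng(3)
clean = np.full((20, 20), 0.5)
noisy = clean.copy()
mask = rng.random(clean.shape)
noisy[mask < 0.02] = 0.0   # "pepper" impulses
noisy[mask > 0.98] = 1.0   # "salt" impulses
filtered = median3x3(noisy)
```

Isolated impulses are replaced by the local median (here, the background value), which is why the median filter suits impulse noise better than linear smoothing.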
Error estimation. In order to estimate a particular image processing operation not only qualitatively but also quantitatively, we use the following formula for the relative error of the image processing [4, p. 239]:

σrel(w̃) = ||w̃ − w||_{L2} / ||w||_{L2} = √( Σ_{j=1}^{M} Σ_{i=1}^{N} (w̃_{ji} − w_{ji})² / Σ_{j=1}^{M} Σ_{i=1}^{N} w_{ji}² ) ,   (17)
where w̃ is the calculated image, and w is the exact one. Formula (17), as applied to images, is more convenient and descriptive than the well-known formula for the PSNR error, which is widely used in radio engineering, acoustics, and other fields. The PSNR formula uses log intensities and is effective when the intensity range is large. If the range of intensities is small, then it is advisable to use the intensities directly, as in formula (17).
Note that formula (17) can be applied only when w is known, i.e., when solving model (not real) examples. Therefore, formula (17) cannot be used for the analysis of the Black Sea images, where one must be limited to visual analysis. However, formula (17) can be applied to the following model example of triple distortion.
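Formula (17) is one line of code. A Python sketch (our naming) with two sanity checks:

```python
import numpy as np

def sigma_rel(w_calc, w_exact):
    """Relative restoration error (17): ||w_calc - w_exact||_L2 / ||w_exact||_L2."""
    return np.linalg.norm(w_calc - w_exact) / np.linalg.norm(w_exact)

# A perfect restoration gives sigma_rel = 0; doubling the image gives sigma_rel = 1
w = np.array([[1.0, 2.0], [3.0, 4.0]])
err_same = sigma_rel(w, w)        # 0.0
err_double = sigma_rel(2 * w, w)  # 1.0
```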
4. Triple distortion case
Consider the case of complex distortion, when the image is smeared, defocused, and noised simultaneously. Let us call this the triple distortion. This case has not yet been practically considered by other authors. More specifically, let a stationary object be photographed in a noisy environment. At the same time, the camera moved and, moreover, its focus was incorrectly set. Figure 5 presents such an example.
Figure 5. Images of kids: (a) noised (σrel = 0.120), (b) smeared (σrel = 0.148), (c) defocused (Variant 1, σrel = 0.164); (d) after the defocus removal (σrel = 0.148), (e) after the smear removal (σrel = 0.131), (f) after the noise removal (σrel = 0.104)
In this example, the image is noised with 1% impulse noise (noise density d = 0.01, Figure 5a), smeared by Δ = 24 pixels at an angle of θ = 72° (Figure 5b), and defocused (ρ = 10, Figure 5c). In the direct problem, all the operations of noising, smearing, and defocusing are linear. Therefore, the total image (Figure 5c) does not depend on the order in which the distortion operations are performed.
We assume that we have only the total image in Figure 5c. To determine the types of distortions in Figure 5c, we derived the modulus of its Fourier spectrum |G(ωx, ωy)| in Figure 6.
Figure 6 shows that the total spectrum displays the spectra of smearing (parallel lines) and defocusing (ellipses), although not as detailed as in Figures 2a and 2b. Nevertheless, the main elements of the two spectra are present. In Figure 6b, we draw straight lines similar to those in Figures 2d and 2e. This allows us to measure the angle φ′, as well as the ratio ωmax/ω1 (see Figure 6b). Then, we estimate ψ according to (2), θ = 90° − ψ ≈ 72°, and Δ ≈ 24 pixels according to (3) (the smear parameters). Next, we measure the ratio ω′x1/ω′x max in Figure 6b, calculate the Nyquist frequency ωx1 = π (ω′x1/ω′x max) (see (6)), calculate the defocus parameter ρ according to (5): ρ = 3.84/ωx1, and obtain ρ ≈ 10 pixels.
Using the estimated parameter ρ, we remove the defocus in Figure 5c by the TR–FT method according to (13), (14), (16) at α = 3·10⁻³ and obtain Figure 5d. After that, using the estimated parameters θ and Δ, we remove the smear in Figure 5d by the TR–FT method according to (10)–(12) at α = 1.5·10⁻³ and obtain Figure 5e.
Finally, we remove the impulse noise in Figure 5e by the median Tukey filter [11, 23, 28] and obtain Figure 5f.
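The direct and inverse problems of this section can be put together in a compact model sketch. The Python/NumPy code below (ours, not the authors'; circular convolution and a synthetic 32×32 image are assumed, and the impulse-noise/median-filter step, illustrated earlier, is omitted for brevity) smears and then defocuses a model image, and removes the two distortions in reverse order, as described above:

```python
import numpy as np

def tr_deconv2(g, h, alpha=1e-4, p=1):
    """2-D Tikhonov-regularized Fourier deconvolution (discrete analogue of (16))."""
    H, G = np.fft.fft2(h), np.fft.fft2(g)
    w1 = np.fft.fftfreq(g.shape[0])[:, None] * 2 * np.pi
    w2 = np.fft.fftfreq(g.shape[1])[None, :] * 2 * np.pi
    D = np.abs(H)**2 + alpha * (w1**2 + w2**2)**p
    return np.fft.ifft2(np.conj(H) * G / D).real

def conv2(w, h):
    """Circular 2-D convolution via FFT (the direct problem (13))."""
    return np.fft.ifft2(np.fft.fft2(h) * np.fft.fft2(w)).real

def psf_smear_x(shape, delta):
    """Uniform horizontal smear PSF (11) of length delta pixels."""
    h = np.zeros(shape)
    h[0, :delta] = 1.0 / delta
    return h

def psf_gauss(shape, sigma):
    """Gaussian defocus PSF (15), origin-centered for circular convolution."""
    y = np.fft.fftfreq(shape[0])[:, None] * shape[0]
    x = np.fft.fftfreq(shape[1])[None, :] * shape[1]
    h = np.exp(-(y**2 + x**2) / (2 * sigma**2))
    return h / h.sum()

# Direct problem: a smooth test image is smeared, then defocused
rng = np.random.default_rng(4)
w_true = conv2(rng.random((32, 32)), psf_gauss((32, 32), 1.5))
hs = psf_smear_x((32, 32), delta=5)
hd = psf_gauss((32, 32), sigma=1.5)
g = conv2(conv2(w_true, hs), hd)

# Inverse problem, in reverse order: first remove the defocus, then the smear
# (a median noise filter would follow as the final step on real noisy data)
restored = tr_deconv2(tr_deconv2(g, hd), hs)
```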
Figure 6. The total Fourier spectrum |G(ωx, ωy)| of the image that is simultaneously noisy, smeared, and defocused, shown in Figure 5c: (a) raw spectrum, (b) processed spectrum
In the inverse problem, the operations of defocus and smear removal by solving linear integral equations are linear, while noise removal by the median filter is a nonlinear operation. Therefore, the final result in Figure 5f is not clear enough.
Note that the relative error σrel increases monotonically from Figure 5a to Figure 5c as distortions
accumulate. Then this error decreases monotonically from Figure 5c to Figure 5f as the distortions are
removed, as it should be for the complex distortion.
It can be concluded that in the case of triple distortion, the results of sequential image processing generally repeat the results of processing simple distortions, but in less detail (compare Figure 6 with Figure 2). In addition, the noise in Figure 5e is restored insufficiently, and the noise in Figure 5f is removed insufficiently.
5. Conclusion
1. The problem of mathematically removing image distortions (smearing, defocusing, and noising) is considered for the examples of the Black Sea and kids' images. The types and parameters of the distortions are determined by the original spectral method (rule). After their determination, the smear/defocus of the images is removed via solving integral equations by the Tikhonov regularization method, with increased accuracy due to the use of the spectral method. After that, the noise is removed by the median filter.
2. A new problem is considered, viz, the problem of "triple distortion", when the image is simulta-
neously noisy, smeared, and defocused. The Fourier transform (Fourier spectrum) of the total image is
obtained as an overlay of spectra. The spectral method allows determining the types and parameters of
distortion components, but with a lower accuracy than when processing the distortions separately.
3. The results of this paper can be used to improve the quality of image restoration under complex distortion, for example, "triple distortion", when the image contains noise, smear, and defocus. This will increase the resolution of optical image registration devices (shooting cameras, tracking systems, cameras, etc.).
This work was supported by the Government of the Russian Federation (grant 08-08), as well as by
a grant of MFKTU ITMO (project 718546).
6. References
[1] Gonzalez, R.C., Woods, R.E., Digital Image Processing, 2nd. ed., Prentice Hall, Upper Saddle
River, 2002.
[2] Sizikov, V.S., Estimating the point-spread function from the spectrum of a distorted tomographic
image, J. Optical Technology 82(10) (2015) 655–658. doi: 10.1364/JOT.82.000655.
[3] Sizikov, V.S., Spectral method for estimating the point-spread function in the task of eliminating
image distortions, J. Optical Technology 84(2) (2017) 95–101. doi: 10.1364/JOT.84.000095.
[4] Sizikov, V.S., Direct and Inverse Problems of Image Restoration, Spectroscopy and Tomography
with MatLab, Lan' Publ., St. Petersburg, 2017 (in Russian).
[5] Sizikov, V.S., Stepanov, A.V., Mezhenin, A.V., Burlov, D.I., Eksemplyarov, R.A., Determining image-distortion parameters by spectral means when processing pictures of the earth's surface obtained from satellites and aircraft, J. Optical Technology 85(4) (2018) 203–210. doi: 10.1364/JOT.85.000203.
[6] Vasilenko, G.I., Taratorin, A.M., Image Reconstruction, Radio i Svyaz', Moscow, 1986 (in Russian).
[7] Bates, R.H.T., McDonnell, M.J., Image Restoration and Reconstruction, Oxford U. Press, Oxford,
1986.
[8] Gruzman, I.S., Kirichuk, V.S., Kosykh, V.P., Peretyagin, G.I., Spektor, A.A., Digital Image Processing in Information Systems, NSTU Publ., Novosibirsk, 2002 (in Russian).
[9] Sizikov, V.S., Inverse Applied Problems and MatLab, Lan' Publ., St. Petersburg, 2011 (in Russian).
[10] Sizikov, V.S., Éksemplyarov, R.A., Operating sequence when noise is being filtered on distorted
images, J. Optical Technology 80(1) (2013) 28–34. doi: 10.1364/JOT.80.000028.
[11] Gonsales, R.C., Woods, R.E., Eddins, S.L., Digital Image Processing using MATLAB, Prentice
Hall, New Jersey, 2004.
[12] Fergus, R., Singh, B., Hertzmann, A., et al., Removing camera shake from a single photograph,
ACM Trans. Graphics (TOG) 25(3) (2006) 787–794. doi: 10.1145/1141911.1141956.
[13] Yushikov, V.S., Blind deconvolution – automatic restoration of blurred images, URL: https://habr.com/ru/post/175717/.
[14] Bishop, T.E., et al., Blind image deconvolution: Problem formulation and existing approaches. In:
Campisi, P., Egiazarian, K. (eds.), Blind Image Deconvolution: Theory and Applications, Boca
Raton, London–New York, 2007, pp. 1–41.
[15] Yan, L., Liu, H., Zhong, S., Fang, H., Semi-blind spectral deconvolution with adaptive Tikhonov
regularization. Applied Spectroscopy 66(11) (2012) 1334–1346. doi: 10.1366/11-06256.
[16] Tikhonov, A.N., Goncharsky, A.V., Stepanov, V.V., Inverse problems of photoimages processing.
In: Tikhonov, A.N., Goncharsky, A.V. (eds.). Ill-Posed Problems in Natural Science. MSU Publ.,
Moscow, 1987, pp. 185–195 (in Russian).
[17] Hansen, P.C., Discrete Inverse Problems: Insight and Algorithms, SIAM, Philadelphia, 2010.
[18] Leonov, A.S., Solving Ill-Posed Inverse Problems: A Sketch of the Theory, Practical Algorithms
and Demonstrations in MATLAB, Knizhny Dom “LIBROKOM”, Moscow, 2010 (in Russian).
[19] Protasov, K.T., Belov, V.V., Molchunov, N.V., Image reconstruction with pre-estimation of the
point-spread function, Optics Atmos. Okeana 13(2) (2000) 139–145.
[20] Voskoboinikov, Yu.E., A combined nonlinear contrast image reconstruction algorithm under inexact point-spread function, Optoel. Instrum. Data Proces. 43(6) (2007) 489–499. doi: 10.3103/S8756699007060015.
[21] Antonova, T.V., Methods of identifying a parameter in the kernel of the first kind equation of the
convolution type on the class of functions with discontinuities, Siberian J. Numer. Mathem. 18(2)
(2015) 107–120. doi: 10.15372/SJNM20150201.
[22] Sidorov, D., Integral Dynamical Models: Singularities, Signals and Control, World Sci. Publ., Singapore–London, 2014.
[23] D’yakonov, V., Abramenkova, I., MATLAB. Signal and Image Processing, Piter, St. Petersburg,
2002 (in Russian).
[24] Jähne, B., Digital Image Processing, Springer, Berlin, 2005.
[25] Landsberg, S.G., Optics, 6th ed., Fizmatlit, Moscow, 2006 (in Russian).
[26] Bracewell, R.N., The Hartley Transform, Oxford University Press, New York, 1986.
[27] Sizikov, V.S., Dovgan, A.N., Tsepeleva, A.D., Restoration of nonuniformly smeared images, J.
Optical Technology 87(2) (2020) 110-116. doi: 10.1364/JOT.87.000110.
[28] Lim, J.S., Two-dimensional Signal and Image Processing, Prentice Hall PTR, New Jersey, 1990.