<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>The Real-Time Rendering of Gradient Media</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Sergei Riabov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andrey Zhdanov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Dmitry Zhdanov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ildar Valiev</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>ITMO University</institution>
          ,
          <addr-line>Kronverksky Pr. 49, bldg., St. Petersburg, 197101</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Keldysh Institute of Applied Mathematics</institution>
          ,
          <addr-line>Miusskaya sq. 4, Moscow, 125047</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>In the real world, there are optical media that are difficult or impossible to simulate with classical real-time rendering methods. One such media type is a gradient medium, in which light rays propagate along a curve. However, with advances in the capabilities of real-time computer graphics rendering hardware, more sophisticated rendering algorithms, e.g., rendering based on ray tracing technologies that use a physically correct model of light propagation and transformation, are becoming more widely used to achieve more realistic images. In the scope of the current article, the authors research the capabilities of current accelerated ray tracing-based realistic rendering technologies to perform physically correct rendering of scenes containing a gradient medium. The presented method can operate in real time, which is proven by the test images and animations acquired from the implementation of the designed realistic rendering method for gradient media with NVIDIA OptiX.</p>
      </abstract>
      <kwd-group>
        <kwd>Rendering</kwd>
        <kwd>ray tracing</kwd>
        <kwd>gradient media</kwd>
        <kwd>NVIDIA OptiX</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>1. Introduction</title>
      <p>With advances in the capabilities of real-time computer graphics rendering hardware, sophisticated
rendering algorithms are becoming more widely used to achieve more realistic images. Nowadays, the
approach to real-time rendering based on ray tracing technologies that use a physical model of light
propagation and transformation is gaining popularity.</p>
      <p>In the real world, there are optical media that are difficult or impossible to simulate with classical
real-time rendering methods. One such media type is a gradient medium, in which light rays
propagate along a curve, which complicates their modeling. Gradient media are characterized by an
inhomogeneous distribution of the refractive index value. Examples of such media are:
the air heated by a hot body (e.g., the air above a fire, heated asphalt, or sand in the desert);
the areas where water currents of different temperature and/or salinity mix;
the water around a sugar cube dissolving in it;
the atmosphere of the earth, as the air pressure and density change with increasing altitude.</p>
      <p>Physically accurate rendering of such optical effects can be applied in various fields. First, it can
be used to generate realistic images in computer games and movies. Another application of a physically
accurate rendering model is data generation for machine learning algorithms. Often, machine learning
algorithms use data that is not obtained from the real world but instead is artificially generated. The
implementation of models of complex optical phenomena allows the creation of specific data that helps
systems using machine learning work correctly in special situations.</p>
      <p>Also, such models allow creating test environments for systems at early development stages, when
testing in real conditions is difficult. An example of such systems is the camera lens systems
designed to be used on satellites, since the earth's atmosphere is a gradient medium.</p>
      <p>
        When solving the problem of light propagation in an inhomogeneous medium, it is necessary to
focus on the target application. If physically correct modeling is required, then, depending on the problem,
one can use either the wave optics solution [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] or the eikonal equation solved by the well-known ray
methods [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. These solutions are used in computational optics, but they have a very limited
application area, e.g., gradient lenses, which usually have only one surface that can be intersected by
the beam of rays. These methods allow finding a physically correct solution; however, they are quite
ineffective when applied to scenes with thousands of geometric primitives that bound the gradient
medium.
      </p>
      <p>
        Other solutions are usually artificial in nature, or the non-rectilinearity of the ray path is
caused by completely different effects. In [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], a non-rectilinear ray tracing method was proposed. The
medium is defined by an equation of motion in the form of an ordinary differential equation
(ODE). This equation is set by the user and can be solved by any explicit numerical integration scheme.
For simplicity, first-order Euler integration is considered, which, in our opinion, is not entirely
adequate for a physically correct model of a gradient medium. The authors consider a number of models
to describe a medium, but these models have nothing in common with a model of a real medium and the eikonal
equation. In addition, an iterative approach to the calculation of the possible point of intersection with
the boundary surface is not considered.
      </p>
      <p>
        Works [
        <xref ref-type="bibr" rid="ref5 ref6">5, 6</xref>
        ] are mainly demonstrative in nature and are aimed at allowing the audience to
study cosmic phenomena that affect the non-rectilinear propagation of light and how this effect can be
visualized. The older work [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], which introduces the concept of non-rectilinear light propagation
due to the action of some external forces, such as gravity, can be attributed to a similar demonstrative
kind of paper. The author does not limit his solution to the problems of modeling light phenomena; in his
opinion, this solution can also be applied to modeling dynamic systems. The same holds for work [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ],
which shows that not only Euler's method but also the Runge-Kutta method can be used to simulate the
ray path in a medium defined by the equation of motion in the form of an ODE system. However,
neither of these works considers the physical nature of the non-rectilinear propagation of light,
determined by the eikonal equation in geometrical optics.
      </p>
      <p>
        The scattering of light in a medium can also be attributed to the problems of non-rectilinear light
propagation. From the software implementation point of view, [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] is a rather interesting and detailed
work; however, it does not consider continuous ray paths: the models are reduced to simplified
models of light scattering on particles, optimal sampling methods, and the construction of beautiful
visual effects that may be far from physical reality. Methods for determining the areas of rectilinear light
propagation can be correctly considered only for a scattering medium, and these approaches are not
applicable to a continuous gradient medium. The same remark can be applied to works [10, 11],
which consider the solution of the rendering equation in an inhomogeneous medium by the Monte Carlo
method. The methods for determining the optimal "rectilinear" distance are physically correct only for
media with volume scattering.
      </p>
      <p>The current work aims to design and implement a computer model of light propagation in a gradient
medium based on a physically correct solution and real-time ray tracing methods. In addition, the tasks
of the work include researching a mathematical model of light propagation, as well as calculating
illumination in a gradient medium.</p>
    </sec>
    <sec id="sec-3">
      <title>2. Gradient media model</title>
      <p>To design a computer model of light propagation in a gradient medium, it is necessary to solve
two problems: first, the calculation of the ray path in the gradient medium and, second, the
calculation of the luminance values in the scene containing the gradient media.</p>
      <p>
        In the vector form, geometrical optics approximates the light propagation in the gradient medium
with the eikonal equation [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ] that is expressed as:
d/ds [ n(r⃗(s)) dr⃗(s)/ds ] = ∇n(r⃗(s)) (1)
where n(r⃗(s)) is the medium refraction index at the r⃗(s) point, s⃗(r⃗(s)) = dr⃗(s)/ds is the unit direction
vector of the light propagation in the gradient medium, and s is the curved ray path length. In a
homogeneous medium, the refraction index does not depend on the position and, as a result, the ray path
r⃗(s) becomes a straight line.
      </p>
      <p>The simplest solution would be the discretization of the gradient medium into areas with a constant
refraction index. As a result, the ray path would become a set of straight segments with transformations
on the boundaries of the media layers. Despite the attractive simplicity of this approach, it might lead
either to a significant error in the resulting ray path or to a slowdown of the ray tracing because of the
increasing number of segments. In addition, if the gradient medium has complex properties, it
might pose the additional complex task of finding the internal boundaries between the layers of the gradient
medium. This approach is shown in Figure 1. Instead, equation (1) can be integrated directly. For this,
new variables are introduced:
{ T⃗(r⃗(t)) = n(r⃗(t)) s⃗(r⃗(t)) = n(r⃗(t)) dr⃗(s)/ds (2)
{ D⃗(r⃗(t)) = n(r⃗(t)) ∇n(r⃗(t))
where t is the parameter of the ray path segment length, defined by the substitution dt = ds/n(r⃗(s)) (3),
T⃗(r⃗(t)) is the optical ray path vector, and D⃗(r⃗(t)) is the vector of refractive index variation.</p>
      <p>
        After substituting variables (3) into equation (2) we get the following equation:
dT⃗(r⃗(t))/dt = D⃗(r⃗(t)), dr⃗(t)/dt = T⃗(r⃗(t)) (4)
which is a system of first-order differential equations that can be solved numerically using the Runge-Kutta
method [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. We suppose that the ray in the gradient medium propagates in portions corresponding to
some step Δt, where i is the step index. For this, additional variables are introduced:
{ A⃗ = Δt D⃗(r⃗ᵢ(t))
{ B⃗ = Δt D⃗(r⃗ᵢ(t) + (Δt/2) T⃗(r⃗ᵢ(t)) + (Δt/8) A⃗) (5)
{ C⃗ = Δt D⃗(r⃗ᵢ(t) + Δt T⃗(r⃗ᵢ(t)) + (Δt/2) B⃗)
      </p>
      <p>Then, with the help of these variables, the new parameters of the ray (origin, direction, and eikonal) for
the next ray path point are approximated:
{ r⃗ᵢ₊₁(t) = r⃗ᵢ(t) + Δt [T⃗(r⃗ᵢ(t)) + (1/6)(A⃗ + 2B⃗)]
{ T⃗(r⃗ᵢ₊₁(t)) = T⃗(r⃗ᵢ(t)) + (1/6)(A⃗ + 4B⃗ + C⃗) (6)
{ s⃗ᵢ₊₁(t) = T⃗(r⃗ᵢ₊₁(t)) / n(r⃗ᵢ₊₁(t))</p>
      <p>These steps are repeated until the next ray point r⃗ᵢ₊₁(t) appears outside the gradient medium
geometry. Then the ray returns to the previous position r⃗ᵢ(t), and iterations between r⃗ᵢ(t) and r⃗ᵢ₊₁(t)
continue until the desired approximation tolerance between the r⃗′(t) point and the scene geometry is
achieved.</p>
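      <p>As an illustration, one Runge-Kutta step of the scheme described above can be checked in isolation. The following standalone C++ sketch uses simplified vector helpers and a hypothetical constant-index medium (not the paper's OptiX code); in a homogeneous medium the refractive index variation vanishes, so the step has to reduce to straight-line propagation.</p>
      <p>
```cpp
#include <cassert>
#include <cmath>

// Simplified stand-ins for the CUDA float3 type used in the listings below
// (illustrative assumptions, not the paper's implementation).
struct Vec3 { double x, y, z; };
struct RayState { Vec3 r; Vec3 T; };  // position r and optical ray T = n * s

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(double s, Vec3 a) { return {s * a.x, s * a.y, s * a.z}; }

// Hypothetical medium: constant refractive index, so grad n = 0 and the
// refractive index variation D = n * grad n is zero everywhere.
static Vec3 refractionVariation(Vec3) { return {0.0, 0.0, 0.0}; }

// One Runge-Kutta step: advance the position r and the optical ray T
// by the parameter step dt using the A, B, C coefficients.
RayState rkStep(RayState s, double dt) {
    Vec3 A = mul(dt, refractionVariation(s.r));
    Vec3 B = mul(dt, refractionVariation(
        add(s.r, add(mul(dt / 2.0, s.T), mul(dt / 8.0, A)))));
    Vec3 C = mul(dt, refractionVariation(
        add(s.r, add(mul(dt, s.T), mul(dt / 2.0, B)))));
    RayState out;
    out.r = add(s.r, mul(dt, add(s.T, mul(1.0 / 6.0, add(A, mul(2.0, B))))));
    out.T = add(s.T, mul(1.0 / 6.0, add(A, add(mul(4.0, B), C))));
    return out;
}
```
      </p>
      <p>For a ray started along the z axis with T = n·(0, 0, 1), one step of length Δt moves the position by Δt·n along z and leaves T unchanged, as expected for a homogeneous medium.</p>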
    </sec>
    <sec id="sec-4">
      <title>2.1. Ray path in the gradient medium</title>
      <p>The calculation of the ray path is performed using numerical methods. In this case, the calculation
is performed by splitting the entire ray path into smaller segments according to the method described
above. So, the algorithm for calculating the ray path consists of the following steps:
1. obtain the initial parameters of the ray origin and direction after hitting the gradient medium;
2. choose a certain parameter t, which is responsible for the length of the ray path segment;
3. calculate the new ray origin and direction at the end of the segment;
4. check the intersection of the calculated segment with the scene objects: if an
intersection was found (that is, if the rectilinear ray segment intersects the scene geometry), then
proceed to step 5; otherwise, the ray stays in the gradient medium, so go back to step 2 with the next
values of the ray origin and direction;
5. find the intersection of the calculated segment with the boundaries of the gradient medium. If
the intersection was found, use the bisection method to find the coordinates of the point where the
ray leaves the gradient medium;
6. leave the gradient medium.</p>
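      <p>The marching and bisection control flow of the steps above can be sketched in a standalone form. For brevity, the sketch below assumes a homogeneous medium (straight segments) bounded by a unit sphere; the function and constant names are illustrative assumptions, not the paper's implementation.</p>
      <p>
```cpp
#include <cassert>
#include <cmath>

// Illustrative 3D vector helpers.
struct V3 { double x, y, z; };
static V3 axpy(double a, V3 d, V3 p) {
    return {p.x + a * d.x, p.y + a * d.y, p.z + a * d.z};
}
static double len2(V3 p) { return p.x * p.x + p.y * p.y + p.z * p.z; }

// The gradient medium occupies a sphere of radius 1 at the origin (assumption).
static const double kRadius = 1.0;
static bool insideMedium(V3 p) { return len2(p) < kRadius * kRadius; }

// March from a point inside the medium until a segment end leaves the sphere
// (steps 3-4), then bisect the last segment (step 5) until the exit point is
// within tol of the boundary. Assumes the ray eventually exits the medium.
V3 traceToBoundary(V3 origin, V3 dir, double step, double tol, int maxIter) {
    V3 p = origin;
    while (true) {
        V3 q = axpy(step, dir, p);                 // step 3: segment end
        if (insideMedium(q)) { p = q; continue; }  // step 4: still inside
        double lo = 0.0, hi = step;                // step 5: bisection
        for (int i = 0; i < maxIter && (hi - lo) > tol; ++i) {
            double mid = 0.5 * (lo + hi);
            if (insideMedium(axpy(mid, dir, p))) lo = mid; else hi = mid;
        }
        return axpy(hi, dir, p);                   // step 6: exit point
    }
}
```
      </p>
      <p>Starting at the sphere center and marching along z with a step of 0.3, the segment from z = 0.9 to z = 1.2 crosses the boundary, and the bisection converges to the exit point near z = 1.</p>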
      <p>Steps two to five are repeated until the ray leaves the gradient medium. After leaving the gradient
medium, the ray is traced further in the standard mode. The steps described above are schematically shown
in Figure 2.</p>
      <p>To implement ray tracing in the gradient medium, the gradient medium program interface should
implement methods that:
• choose the optimal ray travel parameter from a point inside the medium in the specified
direction; if the medium does not have gradient properties, this value should be returned as
infinite and standard mode ray tracing should proceed;
• calculate the medium refractive index at the specified point inside the medium;
• calculate the medium refractive index gradient and Laplacian at the specified point inside the
medium;
• transfer the ray between two points inside the medium and calculate the new ray direction at the
endpoint;
• calculate the optical and geometric path of the ray between two points inside the medium;
• calculate the absorption of the ray when it travels between two points inside the medium.</p>
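      <p>The program interface described above can be sketched as an abstract class. The names and signatures below are illustrative assumptions (only a subset of the listed methods is shown), not the actual OptiX program interface.</p>
      <p>
```cpp
#include <cassert>
#include <cmath>
#include <limits>

struct Vec3 { double x, y, z; };

// A sketch of the gradient medium interface (illustrative subset).
class GradientMedium {
public:
    virtual ~GradientMedium() = default;
    // Optimal ray travel parameter from a point in a direction; an infinite
    // value means "no gradient properties, trace in the standard mode".
    virtual double travelParameter(Vec3 p, Vec3 d) const = 0;
    // Refractive index and its gradient at a point inside the medium.
    virtual double refraction(Vec3 p) const = 0;
    virtual Vec3 refractionGradient(Vec3 p) const = 0;
};

// A trivial non-gradient medium: constant index, zero gradient, and an
// infinite travel parameter, so standard ray tracing proceeds.
class HomogeneousMedium : public GradientMedium {
    double n_;
public:
    explicit HomogeneousMedium(double n) : n_(n) {}
    double travelParameter(Vec3, Vec3) const override {
        return std::numeric_limits<double>::infinity();
    }
    double refraction(Vec3) const override { return n_; }
    Vec3 refractionGradient(Vec3) const override { return {0.0, 0.0, 0.0}; }
};
```
      </p>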
    </sec>
    <sec id="sec-5">
      <title>2.2. Luminance value estimation in the gradient medium</title>
      <p>When a ray propagates in the gradient medium and finds the point of intersection with the gradient
medium boundary, a problem of estimating the luminance value might appear if this point is illuminated
through the gradient medium boundary, for example, if a light source is located inside the gradient
medium. The possible solutions are considered below, and the limitations of the approach used in the
current work are defined.</p>
      <p>When using the ray-tracing method for rendering a scene image, the fastest way to calculate the
illumination of a scene point is by tracing rays from that point to each of the scene lights. So, when a
traced ray hits any of the scene objects, rays are traced from the point of impact to all light sources. If
such a ray hits any other object on the path to the light source, this means that the light source is
shadowed by that object. However, the presence of a gradient medium in the scene creates
problems for this method of lighting calculation. If, when tracing a ray from a point to a light source,
the ray falls into a gradient medium, its path will be distorted, due to which the ray may intersect
another object, causing a shadow at this point. However, there may be a different ray path
between the light source and the target point that does not intersect other objects in the scene. As a
result, the illumination might be calculated incorrectly.</p>
      <p>So, in the case of a gradient medium, the direct illumination changes to the caustic illumination [12],
and the BDF sampling method (the rays are randomly sampled at the gradient medium boundary, and
the search for an intersection with the scene light sources is performed; if the intersections are found,
then the luminance values are taken from the lights) should be used to account for it. However, BDF
sampling is not applicable to point or parallel light sources, which are most commonly used in video
games, which makes this solution fairly useless. The simplest solution to this problem may be to ignore
the gradient medium when calculating the direct illumination. In this case, a standard calculation
of illumination occurs: when a ray hits the surface of an object, rays are traced directly to the light
sources from the point of impact, and when these rays intersect the gradient medium, they
simply continue to propagate in the original direction. Obviously, this method leads to a less realistic
image; however, quite often the gradient medium has a negligible effect on the illumination
in the scene, and the result may be acceptable. So, for the designed algorithm, we used this simplified
method of direct luminance calculation.</p>
      <p>To calculate more realistic lighting, the photon mapping method [13] can be used. The photon
mapping method can account for the direct, indirect, and caustic types of illumination by
adding variable radius photons for the different kinds of illumination [14]. However, this method of
calculating illumination is much more difficult to implement, and it reduces the performance of ray
tracing in a gradient medium, although it allows getting a more realistic rendering model, in contrast to
the first approach. The main problem of the photon mapping approach is the high computational load
it causes, which makes it hardly possible to implement in real-time on modern hardware. As a
result, when designing the algorithm, we chose to use the simplified illumination scheme and did not
consider any solutions for the indirect illumination.</p>
    </sec>
    <sec id="sec-6">
      <title>2.3. Implementation</title>
      <p>When implementing algorithms using modern accelerated ray-tracing technologies, there are
several commonly used APIs: DirectX Raytracing (DXR) [15], Vulkan [16], and
NVIDIA OptiX [17]. The first two are aimed mainly at real-time gaming applications,
while NVIDIA OptiX [18] is aimed at a higher quality of the rendered image. So, we used the OptiX
API for the implementation of the algorithms described above.</p>
      <p>Using the OptiX API, we implemented a special spherical object with a gradient medium in which
the refraction index can be functionally defined depending on the point position inside the sphere.</p>
      <p>The algorithm described above was implemented using the OptiX API, and the parts of the source
code related to the gradient medium implementation are shown below. Listing 1 contains the
implementation of the auxiliary methods used to calculate the optical ray, the refractive index gradient, and
the segment travel distance. The following idea is used to calculate the segment travel distance (delta): as we
know the minimum and maximum values of the refractive index, we know the range that the derivative
of the refractive index should lie in. This allows us to use the derivative of the refractive index in the
direction of the ray transfer to scale the derivative value into the range between the predefined minimum
and maximum distance values, so that the higher the gradient value, the smaller the segment length.</p>
      <sec id="sec-6-1">
        <title>Listing 1</title>
        <p>Approximation of the optical ray, refractive index variation, and delta values
// The constant delta value for gradient approximation
static const float delta = 0.001f;

// Get the medium optical ray
float3 getMediumOpticalRay(float3 position, float3 direction,
                           const Medium&amp; medium)
{
    return medium.getMediumRefraction(position) * direction;
}

// Get the medium refraction variation
float3 getMediumRefractionVariation(float3 position,
                                    const Medium&amp; medium)
{
    return medium.getMediumRefraction(position) *
           medium.getMediumGradient(position);
}

// Choosing the delta value for the segment
float getMediumDelta(float3 position, float3 direction,
                     const Medium&amp; medium)
{
    // Ray deviation from medium gradient
    float dir_der =
        abs(dot(direction, medium.getMediumGradient(position)));
    // Normalization to the medium variation and reverse to 1
    float grad01 = clamp(1.0f - dir_der / medium.amplitude, 0.0f, 1.0f);
    // Interpolate between min and max delta on medium variation
    // in the direction
    float delta = interpolate(medium.min_t, medium.max_t, grad01);
    return delta;
}</p>
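        <p>As a quick standalone check of the step-length heuristic in Listing 1, the fragment below restates getMediumDelta with plain doubles outside of OptiX; the clamping is written out by hand, and the sample numbers in the usage note are assumptions for illustration.</p>
        <p>
```cpp
#include <cassert>
#include <cmath>

// Plain-double restatement of the getMediumDelta heuristic: the derivative of
// the refractive index along the ray direction is normalized by the medium
// variation amplitude and inverted, then used to interpolate between the
// minimal and maximal segment lengths.
double mediumDelta(double dirDotGradient, double amplitude,
                   double minT, double maxT) {
    double dirDer = std::fabs(dirDotGradient);
    double grad01 = 1.0 - dirDer / amplitude;   // large derivative -> near 0
    if (grad01 < 0.0) grad01 = 0.0;             // clamp to [0, 1]
    if (grad01 > 1.0) grad01 = 1.0;
    return minT + grad01 * (maxT - minT);       // linear interpolation
}
```
        </p>
        <p>With amplitude 1 and a step range of [0.01, 0.1], a zero derivative along the ray yields the maximal step 0.1, a full-amplitude derivative yields the minimal step 0.01, and intermediate values interpolate linearly, matching the intent that the higher the gradient, the smaller the segment.</p>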
        <p>Listing 2 contains the implementation of a single iteration of calculating the next ray origin and
direction in a gradient medium.</p>
        <p>RayTransform iterateMediumRayTransform(RayTransform previous_transform,
                                       float delta, const Medium&amp; medium)
{
    const float3 position = previous_transform.position;
    const float3 direction = previous_transform.direction;
    float3 optical_ray = getMediumOpticalRay(position, direction, medium);
    float3 refraction_variation =
        getMediumRefractionVariation(position, medium);
    // Calculation of Runge-Kutta coefficients
    float3 a = delta * refraction_variation;
    float3 b = delta * getMediumRefractionVariation(position +
        delta / 2.0f * optical_ray + delta / 8.0f * a, medium);
    float3 c = delta * getMediumRefractionVariation(position +
        delta * optical_ray + delta / 2.0f * b, medium);
    float3 next_optical_ray = optical_ray + (a + 4 * b + c) / 6.0f;
    float3 next_position =
        position + delta * (optical_ray + (a + 2 * b) / 6.0f);
    float3 next_direction =
        next_optical_ray / medium.getMediumRefraction(next_position);
    previous_transform.position = next_position;
    previous_transform.direction = next_direction;
    return previous_transform;
}</p>
        <p>Listing 3 contains the check of whether the ray hits the gradient medium boundary or another object
inside the gradient medium.</p>
      </sec>
      <sec id="sec-6-2">
        <title>Listing 3</title>
        <p>Check of the ray hitting the gradient medium boundary
// Start and end ray positions in the world coordinate system
float3 start_position_world =
    localToWorldCoordinates(current_ray_transform.position, sphere);
float3 end_position_world =
    localToWorldCoordinates(next_ray_transform.position, sphere);
// Ray direction from start to end position in the world coordinate
// system and ray path length
float3 direction_world = end_position_world - start_position_world;
float max_length = length(direction_world);
direction_world = direction_world / max_length;
// Trace ray from the start position to the end point on the defined length
RadiancePRD new_prd = traceRadianceRay(start_position_world,
    direction_world, 0.0f, max_length, prd.depth, prd.importance,
    FROM_GRADIENT);
// Check if the ray hits any object inside the gradient medium
if (new_prd.hit_type == FROM_GRADIENT)
{</p>
        <p>Listing 4 contains the implementation of the approximation of the point where the ray exits the
gradient medium. The iterations continue until the desired approximation quality is achieved.</p>
      </sec>
      <sec id="sec-6-3">
        <title>Listing 4</title>
        <p>Approximation of the end ray position when leaving the gradient medium
// The "importance" parameter is used to pass the ray path length
float ray_length = new_prd.importance;
float t = ray_length / max_length;
// Continue refining while neither the desired accuracy nor the
// maximal number of iterations has been reached
if (iterations &lt; medium.max_iterations &amp;&amp;
    delta_multiplier &gt; medium.boundary_precision)
{
    delta_multiplier *= t * 0.95f;
}
else
{
    float3 interpolated_position =
        interpolate(start_position_world, end_position_world, t);
    float3 interpolated_direction =
        interpolate(current_ray_transform.direction,
                    next_ray_transform.direction, t);
    new_prd = traceRadianceRay(interpolated_position,
        interpolated_direction, params.scene_epsilon, 1e16f,
        prd.depth, prd.importance, HIT_OBJECT);
    prd.result = new_prd.result;
    prd.hit_type = new_prd.hit_type;
    break;
}</p>
      </sec>
    </sec>
    <sec id="sec-7">
      <title>3. Results</title>
      <p>To test the approach described above, we used a modified scene from the OptiX package. This
scene contains a rectangular surface and two balls above it, which are made of glass and steel. The
sphere with a gradient medium was placed in front of these two balls, partly covering them from the
observer. The testing was performed on a mobile NVIDIA GeForce RTX 2060 GPU. The test scene
is shown in Figure 3.</p>
      <p>Figure 4 shows the graph of the refraction index variation depending on the distance from the center of
the sphere with the gradient medium. In case (a), it has the maximum value in the center of the
sphere and drops to the environment refractive index when approaching the border of the refractive media
area. In case (b), it has the opposite gradient direction. The corresponding rendering results are shown
in Figure 5. The gradient medium area is highlighted with the white circle. The rendering was performed
at 30 fps, which meets the real-time rendering criterion.</p>
      <p>The method was also tested with a procedurally created medium gradient defined by Perlin noise.
The rendering results are shown in Figure 6. The rendering performance was 15 fps; however, the
decrease in speed was caused mainly by the low efficiency of the used implementation of the Perlin
noise generator.</p>
    </sec>
    <sec id="sec-10">
      <title>4. Conclusion</title>
      <p>In the scope of the current research, the gradient medium model was implemented to be rendered in
real-time using the NVIDIA OptiX API. Further research directions are methods for a more
effective choice of the ray travel segment length for faster ray traversal in the gradient medium and the
realistic illumination of the scenes containing the gradient medium in real-time by methods of forward
or backward photon mapping.</p>
    </sec>
    <sec id="sec-8">
      <title>5. Acknowledgements</title>
      <p>The research was partially supported by state financial support of RFBR grant No. 19-01-00435.</p>
    </sec>
    <sec id="sec-9">
      <title>6. References</title>
      <p>[10] S. Marschner, Volumetric Path Tracing, 2015. URL:
https://www.cs.cornell.edu/courses/cs6630/2015fa/notes/10volpath.pdf
[11] P. Shirley, Ray Tracing: The Next Week, 2020. URL:
https://raytracing.github.io/books/RayTracingTheNextWeek.html#volumes
[12] M.A. Shah, J. Konttinen, S. Pattanaik, Caustics Mapping: An Image-Space Technique for
Real-Time Caustics, IEEE Transactions on Visualization and Computer Graphics 13 (2007) 272-280.
doi:10.1109/TVCG.2007.32
[13] H.W. Jensen, P. Christensen, High quality rendering using ray tracing and photon mapping, in:
ACM SIGGRAPH 2007 courses, 2007. doi:10.1145/1281500.1281593
[14] A.D. Zhdanov, D.D. Zhdanov, Progressive backward photon mapping, Programming and
Computer Software 47(3) (2021) 185-193.
[15] Microsoft, DirectX Raytracing (DXR) Functional Spec, 2021. URL:
https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html
[16] G. Sellers, J. Kessenich, Vulkan programming guide: The official guide to learning Vulkan,
Addison-Wesley Professional, 2016.
[17] How to Get Started with OptiX 7, 2021. URL:
https://developer.nvidia.com/blog/how-to-getstarted-with-optix-7/
[18] S.G. Parker, J. Bigler, A. Dietrich, H. Friedrich, J. Hoberock, D. Luebke, D. McAllister, M.
McGuire, K. Morley, A. Robison, M. Stich, OptiX: a general purpose ray tracing engine, ACM
Transactions on Graphics (TOG) 29(4) (2010) 1-13. doi:10.1145/1778765.1778803</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] M. Born, E. Wolf, Foundations of geometrical optics, in: M. Born (Ed.), Principles of optics: electromagnetic theory of propagation, interference and diffraction of light, Cambridge University Press, Cambridge, 1999, pp. 134-135. doi:10.1017/CBO9781139644181</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] A. Zhdanov, D. Zhdanov, V. Sokolov, I.S. Potemin, S. Ershov, V. Galaktionov, Problems of the realistic image synthesis in media with a gradient index of refraction, in: Optical Design and Testing X, International Society for Optics and Photonics, 2020, Vol. 11548, p. 115480W.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] J.C. Butcher, On the implementation of implicit Runge-Kutta methods, BIT Numerical Mathematics 16(3) (1976) 237-240. doi:10.1007/BF01932265</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>D.</given-names>
            <surname>Weiskopf</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Schafhitzel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Ertl</surname>
          </string-name>
          ,
          <article-title>GPU-Based Nonlinear Ray Tracing</article-title>
          ,
          <source>Computer Graphics Forum</source>
          <volume>23</volume>
          (
          <year>2004</year>
          )
          <fpage>625</fpage>
          -
          <lpage>633</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1111/j.1467-8659.2004.00794.x</pub-id>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>O.</given-names>
            <surname>James</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>von Tunzelmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Franklin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.S.</given-names>
            <surname>Thorne</surname>
          </string-name>
          ,
          <article-title>Gravitational lensing by spinning black holes in astrophysics, and in the movie Interstellar</article-title>
          ,
          <source>Classical and Quantum Gravity</source>
          <volume>32</volume>
          (
          <issue>6</issue>
          ) (
          <year>2015</year>
          )
          <elocation-id>065001</elocation-id>
          . doi:
          <pub-id pub-id-type="doi">10.1088/0264-9381/32/6/065001</pub-id>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>O.</given-names>
            <surname>James</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>von Tunzelmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Franklin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.S.</given-names>
            <surname>Thorne</surname>
          </string-name>
          ,
          <article-title>Visualizing Interstellar's wormhole</article-title>
          ,
          <source>American Journal of Physics</source>
          <volume>83</volume>
          (
          <issue>6</issue>
          ) (
          <year>2015</year>
          )
          <fpage>486</fpage>
          -
          <lpage>499</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>E.</given-names>
            <surname>Gröller</surname>
          </string-name>
          ,
          <article-title>Nonlinear ray tracing: Visualizing strange worlds</article-title>
          ,
          <source>The Visual Computer</source>
          <volume>11</volume>
          (
          <year>1995</year>
          )
          <fpage>263</fpage>
          -
          <lpage>274</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1007/BF01901044</pub-id>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>D.</given-names>
            <surname>Weiskopf</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Schafhitzel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Ertl</surname>
          </string-name>
          ,
          <article-title>GPU-Based Nonlinear Ray Tracing</article-title>
          ,
          <source>Computer Graphics Forum</source>
          <volume>23</volume>
          (
          <year>2004</year>
          )
          <fpage>625</fpage>
          -
          <lpage>633</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1111/j.1467-8659.2004.00794.x</pub-id>
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>J.</given-names>
            <surname>Fong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wrenninge</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Kulla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Habel</surname>
          </string-name>
          ,
          <article-title>Production volume rendering</article-title>
          , in:
          <source>ACM SIGGRAPH 2017 Courses (SIGGRAPH '17)</source>
          ,
          <year>2017</year>
          , Article 2, pp.
          <fpage>1</fpage>
          -
          <lpage>79</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1145/3084873.3084907</pub-id>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>