             VisIVO Science Gateway: a Collaborative
           Environment for the Astrophysics Community

Eva Sciacca ∗ , Marilena Bandieramonte ∗‡ , Ugo Becciani ∗ , Alessandro Costa ∗ , Mel Krokos † , Piero Massimino ∗ ,
                      Catia Petta ‡ , Costantino Pistagna ∗ , Simone Riggi ∗ and Fabio Vitello ∗
                                               ∗ INAF-Osservatorio Astrofisico di Catania, Italy
                                                  † University of Portsmouth, United Kingdom
                                      ‡ Dipartimento di Fisica e Astronomia, Università di Catania, Italy

                                                        Email: eva.sciacca@oact.inaf.it

    Abstract—VisIVO Science Gateway is a web-based, workflow-enabled environment wrapped around a WS-PGRADE/gUSE portal, seamlessly integrating large-scale multi-dimensional astrophysical datasets with applications for processing and visualization based on Distributed Computing Infrastructures (DCIs). We present the main tools and services supported, including an application for mobile access to the gateway. We discuss issues in sharing workflows and report our experiences in supporting specialised communities. We present a number of workflows developed recently for visualization and numerical simulations and outline future workflows currently under development. Finally, we summarise our work on the gateway with pointers to future developments.

    Keywords—Science Gateways; Workflow Systems; Collaborative Environments; Astrophysics; Large-Scale Datasets; Visualization; DCIs

                            I. INTRODUCTION

    Visualization can play an important role in the context of large-scale multi-dimensional astrophysical datasets, e.g. in understanding, interpreting and verifying their intrinsic characteristics [1]. Often a number of data exploration tools are employed for visual discovery in order to identify regions of interest within which to apply computationally expensive algorithms (e.g. see [2]). Such processes typically involve distributed solutions for storage and processing. Recently, science gateways have gained popularity as they allow seamless integration of datasets, tools and applications enabled for execution on generic distributed computing infrastructures (DCIs).

    Science gateways provide services to support searching, managing and uploading/downloading (thus allowing sharing) of applications and datasets. They enable user communities to deploy their applications through common graphical user interfaces, thus allowing scientists to focus on the actual applications instead of learning and managing the required infrastructures. The processes supported by gateways are organized as scientific workflows [3] that explicitly specify dependencies among underlying tasks for orchestrating distributed resources appropriately.

    This paper reports on the on-going developments of the VisIVO Science Gateway1 and the VisIVO Mobile application, first presented in [4], focusing on some complex case studies to support specialized astrophysics communities (see Section V), which are managed through a workflow sharing framework (see Section IV). Our gateway is wrapped around WS-PGRADE [5], a highly flexible interface for the grid User Support Environment2 (gUSE), and provides access to the VisIVO Server tools [6] (see Section II), thus enabling execution of complex workflows through a comprehensive collection of modules for processing and visualization of astrophysical datasets.

    A number of customized workflows are configured by default to allow local or remote uploading of datasets, dataset filtering and creation of scientific movies. These workflows are provided with specific user interface portlets to enable intuitive parameter setting for standard users while hiding the complexity of the underlying system and infrastructures. The mobile application employs user accounts from the gateway and offers a handy platform for astrophysical communities to share results and experiences of analysis and exploration of their datasets.

    For displaying 2D or 3D plots, astrophysicists typically deploy software packages such as Gnuplot or SuperMongo, or scripting languages such as Python, Matlab or IDL. VisIt3 and ParaView4 offer a combination of 2D and 3D plotting capabilities, real-time and offline analysis, scripting and graphical control. VisIt has been provided with grid services for scientific collaborative visualization in UNICORE Grids [7]. ParaView has been extended to offer grid services [8], and a plugin has been developed to provide interactive remote visualization for collaborative environments based on video streams [9].

    Nevertheless, scientific visualization can be a fairly complex process involving several steps, e.g. filtering data, choosing a representation and the desired level of interactivity, and customizing the manner in which the data is displayed. None of the aforementioned tools is provided with a science gateway to interface it with workflow services. Within VisIVO Science Gateway and VisIVO Mobile, ready-to-use workflows can be downloaded, parametrized and executed in a controlled environment. The visualization and filtering parameters can be chosen interactively, and the workflow configuration and submission to DCIs is performed without exposing technical details, so that end users can focus on their applications instead of devoting effort to learning and managing the underlying infrastructures.

  1 http://visivo.oact.inaf.it:8080
  2 http://www.guse.hu
  3 https://wci.llnl.gov/codes/visit
  4 http://www.paraview.org
                      II. VISUALIZATION TOOLS

    VisIVO [6] is an integrated suite of tools and services for effective visual discovery within large-scale astrophysical datasets. VisIVO is realised as:

    •   VisIVO Desktop [10], a stand-alone application for interactive visualizations running on standard PCs;
    •   VisIVO Server, a grid-enabled high performance visualization platform, and
    •   VisIVO Library [11], developed specifically to port VisIVO Server to the gLite middleware5.

    Users of each realization can obtain meaningful visualizations rapidly while preserving full and intuitive control of the relevant visualization parameters. This section focuses on VisIVO Server6, which can be installed on any web server with a database repository and contains the following distinct modules: VisIVO Importer, VisIVO Filters and VisIVO Viewer (see Figure 1).

Fig. 1.   VisIVO Server processing pipeline.

    VisIVO Importer converts user-supplied datasets into VisIVO Binary Tables (VBTs). A VBT is a highly efficient data representation realized through a header file containing all necessary metadata and a raw data file storing the actual data values. VisIVO Importer supports conversion from several popular formats, such as ASCII and CSV, VOTables or FITS tables, without imposing any limits on size or dimensionality. VisIVO Filters is a collection of data processing modules to modify a VBT or to create a new VBT from existing VBTs. The filters support a range of operations such as scalar distribution, mathematical operations or selections of regions. VisIVO Viewer is the visualization core component based on the Visualization Toolkit7. It creates 3D images from multi-dimensional datasets, rendering points, volumes and isosurfaces. Moreover, there is support for customized lookup tables and visualizations using a variety of glyphs, such as cubes, spheres or cones. VisIVO Viewer can also be used to produce images in a given sequence of azimuth, elevation and zooming values that can be externally mounted to produce movies.

    To create customized renderings from astrophysical data tables, VisIVO Importer is first utilized to convert user datasets into VBTs. Then, one or more VisIVO Filters can be applied to process these datasets, and finally VisIVO Viewer is invoked to display the resulting renderings. Figure 1 illustrates the typical sequence of steps required within the VisIVO Server processing pipeline.

  5 http://glite.cern.ch
  6 http://sourceforge.net/projects/visivoserver
  7 http://www.vtk.org
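To make the Importer, Filters and Viewer chain of Figure 1 concrete, the short driver below invokes the three VisIVO Server modules in sequence from Python. It is only a sketch: the executable names correspond to the modules described above, but the option names and values are illustrative placeholders and should be checked against the VisIVO Server documentation.

import subprocess

def run(cmd):
    # Run one stage of the pipeline and stop if it fails.
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# 1) VisIVO Importer: convert an ASCII point catalogue into a VisIVO Binary Table (VBT).
#    The option names below are assumed for illustration only.
run(["VisIVOImporter", "--fformat", "ascii", "--out", "galaxies.bin", "galaxies.txt"])

# 2) VisIVO Filters: derive a new VBT from the imported one, e.g. selecting a region of interest.
run(["VisIVOFilters", "--op", "selectregion", "--out", "galaxies_roi.bin", "galaxies.bin"])

# 3) VisIVO Viewer: render the processed table as a 3D point visualization.
run(["VisIVOViewer", "--x", "X", "--y", "Y", "--z", "Z",
     "--azimuth", "30", "--elevation", "15", "--out", "galaxies_roi", "galaxies_roi.bin"])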
        III. VISIVO SCIENCE GATEWAY AND VISIVO MOBILE APPLICATION

    The existing VisIVO Web [12] has been integrated within the WS-PGRADE/gUSE generic gateway [13] to offer new, easily accessible opportunities not only to scientific users, e.g. astrophysical researchers, but also to the wider public, e.g. for high-school education or innovative citizen science activities. This work is supported by the SCI-BUS project8, providing operation and maintenance of the gateway as well as end-user support for training activities. A special focus of the work has been placed on standardization and quality control issues in order to increase the chances of adoption (by other relevant user communities) of the developed technologies and methodologies.

  8 http://www.sci-bus.eu

A. VisIVO Science Gateway Main Services

    The VisIVO Science Gateway is designed as a workflow-enabled grid portal wrapped around WS-PGRADE, providing visualization and data management services to the scientific community by means of an easy-to-use graphical environment for accessing the full functionality of VisIVO Server. Complex workflows can be created and executed on a variety of infrastructures (e.g. clouds, desktop and service grids or supercomputers) to obtain comprehensive exploration and analysis of large-scale astrophysical datasets. The gateway offers role-based authorization modules and supports secure login.

    Currently a number of main roles are implemented for access, as follows: guests, standard and advanced users, and administrators [4]. Standard users can upload and manage their datasets through portlets without any knowledge of the (conveniently hidden) underlying grid infrastructure and middleware. By using interactive widgets, users can construct customized renderings, or store data analysis and visualization results for future reference. Their datasets are managed internally through a relational database preserving their metadata and maintaining data consistency. Figure 2 shows the main portlets of the gateway connecting to the VisIVO Importer, Filters and Viewer services.

Fig. 2.   Main VisIVO Gateway portlets.

    Both remote and local datasets can be uploaded, i.e. datasets residing at a remote URL or locally on a user's PC. For remote files the user must specify the URL and, optionally, a user name and password for authentication. Depending upon the size of the datasets under consideration, remote uploads can take a long time. To resolve this situation, VisIVO Gateway allows an off-line mode by means of a workflow submission, so that users can issue upload commands and then simply close their current session; a follow-up e-mail typically gives notification once the uploading operation is completed. The workflow employed for remote importing is illustrated in Figure 3. It allows generation of significant information for metadata exploration, e.g. statistics on data values, histogram calculation and plotting, or a sample extraction of the uploaded datasets. Such metadata is available through the Properties portlet and some of it can be modified by the user (e.g. renaming VBTs or related fields).

Fig. 3.   Remote VisIVO Importer Workflow.
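The metadata-exploration jobs attached to the remote importer workflow (statistics on data values, histogram calculation and sample extraction) can be pictured with the minimal sketch below. It assumes a plain-text table and a single column for simplicity; in the actual workflow these quantities are derived from the imported VBT and exposed through the Properties portlet.

import numpy as np

def explore(path, column=0, bins=50, sample_size=1000):
    # Load one field of the uploaded table (text format assumed for illustration).
    data = np.loadtxt(path, usecols=[column])
    stats = {
        "count": int(data.size),
        "min": float(data.min()),
        "max": float(data.max()),
        "mean": float(data.mean()),
        "std": float(data.std()),
    }
    # Histogram for the plotting job and a random sample for quick inspection.
    hist, edges = np.histogram(data, bins=bins)
    rng = np.random.default_rng(seed=42)
    sample = rng.choice(data, size=min(sample_size, data.size), replace=False)
    return stats, (hist, edges), sample

stats, histogram, sample = explore("uploaded_table.txt")
print(stats)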
    Once the data file is uploaded, a sequence of simple actions is required to rapidly obtain meaningful visualizations. Typically various VisIVO Filter operations are performed; VisIVO Gateway automatically displays all applicable VisIVO Filter operations, allowing input of the relevant parameters. Finally the VisIVO Viewer is employed for image display. A right click on any processed dataset in the Data Management portlet is used in conjunction with the View button to create user-prescribed VisIVO Viewer views. VisIVO Gateway further allows users to generate scientific movies. These can be useful not only to scientists, to present and communicate their research results, but also to museums and science centres, to introduce complex scientific concepts to general public audiences.

    Users can create a Panoramic Movie by moving a camera along a motion path of 360° in azimuth and ±90° in elevation within the dataset's domain. Customized Movies can be produced from intermediate snapshots specified as camera positions/orientations; the gateway then generates a movie with a camera path containing these specified positions/orientations. Dynamic Movies can be created by interpolating several steps of a time evolution of a cosmological dataset. The user can browse a cosmological time evolution and choose two or more coherent datasets. The designed workflow will then produce the necessary number of intermediate VBTs by calculating particle positions and applying boundary conditions as necessary. This approach can be very useful, e.g. in revealing galaxy formation or observing large-scale structures such as galaxy clusters.
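The interpolation step behind a Dynamic Movie can be sketched as follows: intermediate particle positions are generated between two coherent snapshots, with displacements wrapped across the periodic simulation box so that boundary conditions are respected. The array shapes and box size below are illustrative assumptions; in the gateway this step produces the intermediate VBTs that are then rendered into movie frames.

import numpy as np

def intermediate_positions(pos_a, pos_b, n_frames, box_size):
    # Linearly interpolate (N, 3) particle positions between two snapshots,
    # using the minimum-image convention for the displacement.
    delta = pos_b - pos_a
    delta -= box_size * np.round(delta / box_size)
    for i in range(1, n_frames + 1):
        frac = i / (n_frames + 1)
        yield (pos_a + frac * delta) % box_size

# Example: ten intermediate snapshots between two cosmological outputs in a periodic box.
pos_a = np.random.rand(100000, 3) * 500.0
pos_b = (pos_a + np.random.normal(scale=5.0, size=pos_a.shape)) % 500.0
for frame in intermediate_positions(pos_a, pos_b, n_frames=10, box_size=500.0):
    pass  # each frame would be written as a VBT and rendered by VisIVO Viewer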
    The creation of a movie represents a significant challenge for the underlying computational resources, as often hundreds or thousands of high quality images must be produced. For this reason Parameter Sweep (PS) workflows [14] are employed. This is particularly relevant to the visualization-oriented workflows presented in Section V. As the respective communities typically employ a large number of parameters that have to be varied within user-defined ranges, several hundreds to thousands of workflow executions might be necessary. As an example, a panoramic movie is generated with the workflow shown in Figure 4: it generates four movies with different camera position paths on the generator port (a 0° to 360° azimuth rotation, a 0° to 90° elevation rotation, a 90° to −90° elevation rotation and a −90° to 0° elevation rotation). The generation of these four movies is executed in parallel and the results are finally merged through a collector port, as shown in Fig. 4.

Fig. 4.   Panoramic Movie Workflow.
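The Parameter Sweep pattern behind the panoramic movie workflow can be pictured with the sketch below: the generator port emits one camera-path description per branch, the four branches run in parallel, and a collector merges the partial movies. The render and merge functions are placeholders for the actual VisIVO jobs submitted to the DCI.

from concurrent.futures import ThreadPoolExecutor

# The four camera-path segments described above (degrees).
SEGMENTS = [
    {"name": "azimuth_0_360", "azimuth": (0, 360), "elevation": (0, 0)},
    {"name": "elevation_0_90", "azimuth": (0, 0), "elevation": (0, 90)},
    {"name": "elevation_90_m90", "azimuth": (0, 0), "elevation": (90, -90)},
    {"name": "elevation_m90_0", "azimuth": (0, 0), "elevation": (-90, 0)},
]

def render_segment(segment):
    # Placeholder for one parallel branch: render the frames along one camera path
    # and assemble them into a partial movie.
    return f"{segment['name']}.avi"

def collect(parts):
    # Placeholder for the collector port: concatenate the partial movies in order.
    return "panoramic_movie.avi" if parts else None

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_movies = list(pool.map(render_segment, SEGMENTS))
print(collect(partial_movies))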
B. VisIVO Mobile Application

    The VisIVO Mobile application (see Fig. 5) allows smartphone devices to exploit VisIVO Gateway functionalities to access large-scale astrophysical datasets residing in a server repository for analysis and visual discovery. Through interactive widgets, customized visualizations (images or movies) can be generated and stored on the remote server. The application notifies users when requested visualizations are available for retrieval on their smartphones and allows sharing of data, images and movies via e-mail or by exploiting common social networks.

Fig. 5.   VisIVO Mobile screenshots on an iPad device: navigation through the imported datasets and produced images and scientific movies (upper figure); and dataset remote importing (lower figure).

    The current version of VisIVO Mobile is implemented in Objective-C, optimized for the Apple iPhone, iPod and iPad; in the near future it will be ported to other popular smartphone devices. End users can log in with the same credentials as on the gateway; the application encodes the password with SHA hashing, exploiting the built-in functionalities of the Liferay9 environment, and queries the remote database to verify access credentials. The configuration and submission of workflows residing on the VisIVO Gateway is performed by means of the gUSE Remote API [13]. This API interfaces to the core gUSE services without the WS-PGRADE user interface component. Thus running and managing scientific workflows is realized by command-line solutions consisting of curl10-based access wrapped in shell scripts. The API exposes the usage of gUSE components through a simple web service interface, resulting in wide adaptability by a diverse set of tools and programming languages.

  9 http://www.liferay.com
  10 http://curl.haxx.se/
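The way a thin client such as the mobile application drives a workflow over HTTP can be pictured with the sketch below. The endpoint path, form fields and response handling are hypothetical placeholders and do not reproduce the actual gUSE Remote API contract, which is documented with gUSE; the sketch only illustrates the upload-parameters-submit pattern described above.

import requests

GATEWAY = "https://gateway.example.org/remote-api"  # placeholder base URL

def submit_workflow(workflow_archive, params, password_hash):
    # Hypothetical submission call: upload a workflow archive with its parameters
    # and receive an identifier that can be polled later for status and outputs.
    with open(workflow_archive, "rb") as wf:
        response = requests.post(
            f"{GATEWAY}/submit",              # placeholder endpoint name
            files={"workflow": wf},
            data={"pass": password_hash, **params},
            timeout=60,
        )
    response.raise_for_status()
    return response.text.strip()

job_id = submit_workflow("panoramic_movie.zip", {"azimuth": "0-360"}, "<sha-hash>")
print("submitted:", job_id)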
C. Implementation Details and Computing Infrastructures

    The VisIVO Science Gateway is based on the collaborative and community-oriented application development environment WS-PGRADE/gUSE. It is fully integrated into the Liferay portal framework, which is highly customizable thanks to the adoption of the portlet technology defined in Java Specification Requests 168 and 28611, and is compatible with modern web applications. The implemented portlets are developed with the Java Vaadin web framework12. This open-source framework has been employed to implement server-side Java Servlet based web applications using the full power and flexibility of Java without having to take care of the client side, since it compiles the Java source code to JavaScript which can then run in any browser.

    The current architecture of the VisIVO Science Gateway has a distributed configuration on different machines, enhancing service performance, as shown in Figure 6. The front-end services contain WS-PGRADE and Liferay, and the back-end services include the gUSE components. The database server resides on the back-end machine. Advanced users of the VisIVO community can create, change, invoke and monitor workflows, accessing all of the components of WS-PGRADE/gUSE, while standard users are provided with the easy-to-use specific web-based user interfaces described in Section III-A, which use the gUSE Application Specific Module (ASM) API [15] to reuse the implemented workflows stored in the local repository of gUSE. The VisIVO Mobile application configures and submits workflows residing on the VisIVO Gateway by means of the gUSE Remote API, as described in Section III-B.

Fig. 6.   VisIVO Gateway Architecture.

    The VisIVO Science Gateway currently exploits the Cometa Consortium grid13. This infrastructure is distributed over seven sites in Sicily. All sites have the same hardware and software configuration, allowing high interoperability and realizing a homogeneous environment. The computing infrastructure is based on IBM BladeCenters, each containing up to 14 IBM LS21 blades interconnected with a low-latency InfiniBand 4X network, to provide High Performance Computing (HPC) functionalities on the grid. There are currently about 2000 CPU cores and more than 200 TB of disk storage space available on this HPC e-Infrastructure. As reported in [4], the VisIVO Science Gateway is undergoing testing under the ETICS system [16], based on the Metronome software [17], by 4D Soft14. Web testing has been adopted by 4D Soft mainly because it is platform- and application-independent for testing in different environments and supports different technologies in a uniform way through test libraries. Currently a number of tests suitable for the VisIVO Mobile application is under development.

  11 http://jcp.org/en/jsr
  12 http://www.vaadin.com
  13 http://www.consorzio-cometa.it
  14 http://etics3.4dsoft.hu

                      IV. SHARING WORKFLOWS

    Building large workflows from scratch to address scientific communities can be time-consuming, as it is inherently a multi-disciplinary process. As an example, although astrophysicists might be able to appreciate the benefit to their work of using a workflow, they are less interested in the technical details of developing it; this is a task that is naturally associated with developers (typically computer scientists). Manually monitoring the evolving structure of workflows, e.g. by email or written documentation, can be quite challenging. The plan is therefore not only to educate non-computer-science scientific communities in using workflows, but also to provide them with high-level tools so that they can access the results of these workflows intuitively. Effective collaboration requires ways to facilitate exchange between different groups, in particular enabling sharing and realizing re-use and interoperability. The SHIWA project15 (SHaring Interoperable Workflows for large-scale scientific simulations on Available DCIs) provided solutions to facilitate sharing and exchanging of workflows between workflow systems and DCI resources through the SHIWA Simulation Platform (SSP), consisting of:

    •   SHIWA Repository16: A database where workflows and meta-data about workflows can be stored. The database is a central repository for users to discover and share workflows within and across their communities.
    •   SHIWA Portal17: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on a number of computational grid/cloud platforms.

    Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The portal (via third-party workflow engines) provides support for a number of commonly used academic workflow engines, and it can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows, even when those are written in different languages and require different interpreters for execution. This functionality can enable scientific collaborations to share and offer workflows for reuse and execution. Shared workflows can be executed on-line, without installing any special client environment for downloading workflows.

  15 http://www.shiwa-workflow.eu
  16 http://shiwa-repo.cpc.wmin.ac.uk
  17 http://shiwa-portal2.cpc.wmin.ac.uk/liferay-portal-6.1.0

                   V. SUPPORTING COMMUNITIES

    A number of challenging workflows have been prototyped recently to support highly specialised scientific communities, mainly in astrophysics. This section discusses our experiences with the visualisation-oriented workflows Muon Portal and LasMOG, and the simulation-oriented workflow FRANEC. The former are deployed for detecting nuclear threat materials (see V-A) and investigating large-scale modified gravity models (see V-B) respectively. The latter is exploited for carrying out stellar evolution simulations. These workflows will be supported in ER-flow18 so that they can be stored in the SHIWA workflow repository together with related meta-data, allowing investigation of their interoperability and dissemination across relevant communities through the SHIWA simulation platform.

    Advanced users can exploit such workflows as templates for building new customized workflows to suit particular requirements of scientific communities; e.g. by appropriately modifying constituent building blocks, customized LasMOG workflows can be generated. Standard users can then execute these workflows in an interactive and user-friendly way by means of the supplied portlets. Any user can submit jobs to the underlying DCIs without requiring a priori any specific technical expertise related to the particulars of the DCI configuration.

    We are currently in the planning stages of developing a number of new visualisation-oriented workflows to be deployed for rapid discovery of supernova light curve anomalies19 and validation of models reconstructing the large-scale structure of the universe20,21. Furthermore, two simulation-oriented workflows are under development: the first will be deployed for studying trajectories of interstellar comets passing through the Solar System, and the second will be focused on modelling the dynamical evolution of meteoroid streams. The vision is, once a sufficient number of visualisation-oriented and simulation-oriented workflows has been developed, to analyse their similarities in depth towards developing templates for generating classes of workflows that address the needs of specialized scientific communities. The remainder of this section focuses on the Muon Portal, LasMOG and FRANEC workflows.

  18 http://www.erflow.eu
  19 http://supernovae.in2p3.fr/~guy/salt
  20 http://www.mpa-garching.mpg.de/gadget
  21 https://github.com/cmcbride/bgc_utils
A. Muon Portal

    Muons in the secondary cosmic radiation are deflected when crossing high atomic number materials (such as uranium or other fissile materials). Exploiting this deflection can significantly improve on the success rate of current nuclear threat detection methods based on X-ray scanners [18], especially in terms of the capacity to identify and locate illicit materials inside cargo containers, even considering the possibility of screens designed to mask their existence [19].

    We have developed a visualisation-oriented workflow suitable for the inspection of cargo containers carrying high atomic number materials, by displaying tomographic images [20]. Preliminary results of this workflow have been reported in [4]. The datasets containing the coordinates from the muon tracker planes are first uploaded to our gateway and filtered by using the Point of Closest Approach (POCA) algorithm [21] to create a representation containing the scattering deflection of the cosmic radiation. The result is then visualized using point rendering.

    Further processing is then applied based on user-defined thresholds, followed by conversion into data volumes using the deflection angle field distribution by employing the 3D Cloud-in-Cell (CIC) [22] smoothing algorithm. Finally, a tomography is performed for inspection. Figure 7 shows the most recent development and results of the entire computational process, starting from: a) parameter setting through the supplied portlet, then b) submitting the implemented workflow, and finally c) outputting resulting images obtained using isosurface rendering for the filtered (top image) and raw (bottom image) datasets respectively.

Fig. 7.   Muon Portal processing: portlet interface, workflow and selected results.
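The POCA filtering step can be summarised by the sketch below: each muon is described by its incoming track (entering the container) and outgoing track (leaving it), the point of closest approach is taken as the midpoint of the shortest segment joining the two lines [21], and the scattering angle between the two directions is attached to it. The input values are illustrative.

import numpy as np

def poca(p0, u, q0, v, eps=1e-12):
    # Closest-approach midpoint and scattering angle (degrees) of two 3D lines,
    # each given by a point and a direction of travel.
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    w0 = p0 - q0
    b = np.dot(u, v)
    d, e = np.dot(u, w0), np.dot(v, w0)
    denom = 1.0 - b * b          # ~0 for (nearly) parallel tracks
    if abs(denom) < eps:
        s, t = 0.0, e
    else:
        s = (b * e - d) / denom
        t = (e - b * d) / denom
    closest_in, closest_out = p0 + s * u, q0 + t * v
    angle = np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))
    return 0.5 * (closest_in + closest_out), angle

# One reconstructed muon: entry point/direction and exit point/direction (arbitrary units).
point, theta = poca(np.array([0.0, 0.0, 200.0]), np.array([0.01, -0.02, -1.0]),
                    np.array([1.5, -2.0, -200.0]), np.array([0.03, 0.01, -1.0]))
print(point, theta)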




B. LasMOG

    The acceleration of the Universe is one of the most challenging problems in cosmology. In the framework of general relativity (GR), the acceleration originates from dark energy. However, to explain the current acceleration of the Universe, the required value of dark energy must be incredibly small. Recently, efforts have been made to construct models of modified gravity (i.e. without introducing dark energy) as an alternative to dark energy models [23].

    Observing the large-scale structure of the universe could in principle provide new tests of GR on cosmic scales. This kind of test cannot be done without the help of simulations, as the structure formation process is highly non-linear. Large-scale simulations are thus performed for modified gravity models, e.g. from the Large Simulation for Modified Gravity (LaSMoG) consortium.

Fig. 8.   LasMOG processing: portlet interface, workflow and selected results.

    The workflow shown in Figure 8 implements a customised visualization for aiding the analysis of modified GR simulations, more specifically inspecting datasets to discover anomalies by comparing them appropriately with datasets coming from standard GR models. The main computational steps are summarised as follows:

    •   Two datasets corresponding to snapshots of standard gravity (DS) and modified gravity (DM) model simulations are processed.
    •   Sub-samples of the point distributions with a reduced number of points in the two datasets are generated. Then, for each of these sub-samples, a panoramic movie is created (as shown in the resulting top image of Figure 8).
    •   A point distribute operation is performed on DS and DM to create new volume datasets (VS and VM respectively) using a field distribution algorithm on a regular mesh.
    •   A volume property on the same computational domain is distributed on a regular mesh, producing a density field.
    •   A new volume V∆ is computed where each of its voxels shows the difference of density values between VS and VM. It is then filtered with a lower bound threshold and all the voxels satisfying the filter are saved in a text file for further analysis purposes (as shown in the resulting bottom image of Figure 8; a minimal sketch of this step is given after this list).
    •   Several renderings of V∆ are performed:
          ◦  Volume rendering;
          ◦  Isosurface rendering of the density field to produce panoramic movies using different isovalues (as shown in the resulting bottom image of Figure 8);
          ◦  Ortho-slice rendering, i.e. orthogonal slice planes through the volume dataset.
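A minimal sketch of the V∆ step referenced in the list above, assuming the two density volumes are already available as NumPy arrays: the densities are compared voxel by voxel, the lower-bound threshold is applied, and the surviving voxels are written to a text file for further analysis. The grid size and threshold are illustrative values.

import numpy as np

def density_difference(v_s, v_m, threshold):
    # Return (i, j, k, delta) rows for voxels whose density difference exceeds the threshold.
    delta = np.abs(v_s - v_m)
    idx = np.argwhere(delta > threshold)
    return np.column_stack([idx, delta[tuple(idx.T)]])

grid = 128
v_s = np.random.rand(grid, grid, grid)   # stands in for the standard-gravity volume VS
v_m = np.random.rand(grid, grid, grid)   # stands in for the modified-gravity volume VM
rows = density_difference(v_s, v_m, threshold=0.9)
np.savetxt("vdelta_voxels.txt", rows, header="i j k delta", fmt="%.6g")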


C. FRANEC

    FRANEC is a state-of-the-art [24] numerical code for stellar astrophysics. This code is well suited to computing the evolution of stars on the basis of a number of different physical inputs and parameters. A single run of FRANEC produces one synthetic model (SM). To produce an isochrone, for a given chemical composition, through a FIR (Full Isochrone Run), it is necessary to execute a large number of SMRs (SM runs), varying the initial mass of the stellar models (a sketch of this parameter sweep is given after the module list below). Once these evolutionary tracks and isochrones (and other additional data) are computed, they can be distributed in datasets over different sites.

    The simulations of stellar models produce simulation output files with a set of associated metadata. Such metadata are linked to all parameters concerning the numerical evolutionary code. In this way it is possible to store, easily search and retrieve the data obtained from many sets of stellar simulations, and furthermore to get access to a large amount of homogeneous data such as tracks and isochrones computed using FRANEC. The FRANEC workflow (see Figure 9) has a modular architecture, making it easy to identify reusable modules for building other workflows.

Fig. 9.   FRANEC processing: portlet interface, workflow and selected results.

    The modules can be differentiated on the basis of their functionality:

    1)  The EOS Computation module provides the Equation of State in tabular form. The input values are the metallicity Z and the type of mixture (combination of chemical elements heavier than helium).
    2)  The OPACITY Computation module produces a table of opacity from pre-calculated tables. Given the metallicity value Z and the type of mixture, it obtains a new table of opacity which is interpolated from the pre-calculated ones.
    3)  FRANEC is the core module of the workflow. It produces the models of stellar evolution starting from the output of the two modules EOS and OPACITY and a set of input parameters given by the user to perform the evolution: the mass (in solar units) of the structure, the mass fraction of the initial helium, the mass fraction of the heavy elements abundance, the efficiency of superadiabatic convection, the mass loss, the core convective overshooting during the H-burning phase, the diffusion index and the evolutionary stage index. It produces a set of parameter values varying in relation to time, quantities varying in relation to the radius of the model, the chemical composition of the core (vs. time), surface chemicals (vs. time), and energy resolution flows (vs. time).
    4)  The Output Post-Processing module consists of the following jobs:
          •  TAR produces a compressed archive of the main outputs.
          •  GNUPLOT produces the output plots (e.g. the ones included in Figure 9).
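The Full Isochrone Run maps naturally onto a parameter sweep over the initial mass, with every synthetic-model run sharing the EOS and opacity tables produced by the first two modules. The sketch below illustrates that sweep; the executable name, option format and mass grid are placeholders, not the real FRANEC interface.

import subprocess

def full_isochrone_run(masses, metallicity, helium_fraction, eos_table, opacity_table):
    # One FRANEC synthetic-model run (SMR) per initial mass; each run is an
    # independent job in the workflow and could be dispatched to the DCI.
    outputs = []
    for mass in masses:
        out = f"smr_m{mass:.2f}.dat"
        cmd = ["franec",                                   # placeholder executable name
               f"--mass={mass}", f"--z={metallicity}", f"--y={helium_fraction}",
               f"--eos={eos_table}", f"--opacity={opacity_table}", f"--out={out}"]
        subprocess.run(cmd, check=True)
        outputs.append(out)
    return outputs  # later archived by TAR and plotted by GNUPLOT

tracks = full_isochrone_run(masses=[0.8, 1.0, 1.2, 1.5, 2.0],
                            metallicity=0.02, helium_fraction=0.27,
                            eos_table="eos_z0.02.tab", opacity_table="opac_z0.02.tab")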
                        VI. CONCLUSIONS

    Traditionally, the common practice among astronomers for data exploration was to employ small, individually created and executed applications. This scenario is not applicable to modern large-scale datasets. Modular web applications for data analysis and visual discovery, making effective usage of modern e-infrastructures, can be instrumental in reaching out to astrophysical communities and aiding them in new scientific discoveries.

    A workflow-oriented gateway allows scientists to share their analysis workflows and identify best practices for investigating their datasets. More importantly, they can automate workflows for repeated analysis with changed parameters, which in the past was a manual, slow and very error-prone process. This way scientists can focus on core scientific discoveries rather than wasting time on data analysis or on dealing with inadequate resources.

    VisIVO Gateway provides a web-based portal for setting up, running and evaluating visualizations of large-scale astrophysical datasets exploiting DCI resources. The gateway includes a data repository containing images and movies produced from imported datasets, as well as repositories of fundamental workflows, which can be used as templates for generating new workflows to be distributed by the users of the system.

    We presented several portlets running in a Liferay portal environment, together with a mobile application making the gateway accessible from modern mobile platforms. For a number of specialised astrophysical communities we have discussed workflows and the issues involved in developing them. The modularity achieved by subdividing workflows into a number of core tasks ensures re-usability and provides high flexibility. End users do not need to be aware of set-up options or of the computing infrastructure operating behind the scenes.

    We envisage building a specialized repository of astrophysics workflow core modules to share them among communities using the SHIWA platform. Our vision is for these to be used not only by astrophysical communities but also to be potentially exploited within other scientific contexts. This activity will also be instrumental in future work towards creating an Astro-Gateway Federation, establishing a network of Science Gateways to benefit astrophysical communities sharing tools and services, data, repositories, workflows and computing infrastructures.

                        ACKNOWLEDGMENT

    The research leading to these results has received funding from the European Commission's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 283481 SCI-BUS (SCIentific gateway Based User Support) and the FP7 project under contract no 312579 ER-flow (Building an European Research Community through Interoperable Workflows and Data).

                          REFERENCES

 [1]  A. Hassan and C. Fluke, "Scientific visualization in astronomy: towards the petascale astronomy era," Publications of the Astronomical Society of Australia, vol. 28, no. 2, pp. 150–170, 2011.
 [2]  M. Borkin, S. Offner, E. Lee, H. Arce, and A. Goodman, "Visualization and analysis of synthetic observations of embedded protostellar outflows," in Bulletin of the American Astronomical Society, vol. 43, 2011, p. 25813.
 [3]  A. Belloum, M. Inda, D. Vasunin, V. Korkhov, Z. Zhao, H. Rauwerda, T. Breit, M. Bubak, and L. Hertzberger, "Collaborative e-science experiments and scientific workflows," IEEE Internet Computing, vol. 15, no. 4, pp. 39–47, 2011.
 [4]  E. Sciacca, M. Bandieramonte, U. Becciani, A. Costa, M. Krokos, P. Massimino, C. Petta, C. Pistagna, S. Riggi, and F. Vitello, "VisIVO workflow-oriented science gateway for astrophysical visualization," in 21st Euromicro International Conference on Parallel, Distributed and Network-Based Computing (PDP 2013). IEEE Computer Society Press, 2013.
 [5]  P. Kacsuk, "P-GRADE portal family for grid infrastructures," Concurrency and Computation: Practice and Experience, vol. 23, no. 3, pp. 235–245, 2011.
 [6]  U. Becciani, A. Costa, V. Antonuccio-Delogu, G. Caniglia, M. Comparato, C. Gheller, Z. Jin, M. Krokos, and P. Massimino, "VisIVO: integrated tools and services for large-scale astrophysical visualization," Publications of the Astronomical Society of the Pacific, vol. 122, no. 887, pp. 119–130, 2010.
 [7]  M. Riedel, W. Frings, S. Dominiczak, T. Eickermann, D. Mallmann, P. Gibbon, and T. Dussel, "VISIT/GS: higher level grid services for scientific collaborative online visualization and steering in UNICORE grids," in Parallel and Distributed Computing, 2007 (ISPDC'07), Sixth International Symposium on. IEEE, 2007.
 [8]  G. Song, Y. Zheng, and H. Shen, "ParaView-based collaborative visualization for the grid," Advanced Web and Network Technologies, and Applications, pp. 819–826, 2006.
 [9]  M. Hereld, E. Olson, M. Papka, and T. Uram, "Streaming visualization for collaborative environments," in Journal of Physics: Conference Series, vol. 125, no. 1. IOP Publishing, 2008.
 [10] M. Comparato, U. Becciani, A. Costa, B. Larsson, B. Garilli, C. Gheller, and J. Taylor, "Visualization, exploration, and data analysis of complex astrophysical data," Publications of the Astronomical Society of the Pacific, vol. 119, no. 858, pp. 898–913, 2007.
 [11] U. Becciani, A. Costa, N. Ersotelos, M. Krokos, P. Massimino, C. Petta, and F. Vitello, "VisIVO: a library and integrated tools for large astrophysical dataset exploration," in Astronomical Data Analysis Software and Systems XXI, vol. 461, 2012, p. 505.
 [12] A. Costa, U. Becciani, P. Massimino, M. Krokos, G. Caniglia, C. Gheller, A. Grillo, and F. Vitello, "VisIVOWeb: a WWW environment for large-scale astrophysical visualization," Publications of the Astronomical Society of the Pacific, vol. 123, no. 902, pp. 503–513, 2011.
 [13] P. Kacsuk, Z. Farkas, M. Kozlovszky, G. Hermann, A. Balasko, K. Karoczkai, and I. Marton, "WS-PGRADE/gUSE generic DCI gateway framework for a large variety of user communities," Journal of Grid Computing, vol. 10, no. 4, pp. 601–630, 2012.
 [14] P. Kacsuk, K. Karoczkai, G. Hermann, G. Sipos, and J. Kovacs, "WS-PGRADE: supporting parameter sweep applications in workflows," in Workflows in Support of Large-Scale Science (WORKS 2008), Third Workshop on. IEEE, 2008, pp. 1–10.
 [15] A. Balasko, M. Kozlovszky, A. Schnautigel, K. Karóckai, I. Márton, T. Strodl, and P. Kacsuk, "Converting P-GRADE grid portal into e-science gateways," International Workshop on Science Gateways, pp. 1–6, 2010.
 [16] A. Meglio, M. Bégin, P. Couvares, E. Ronchieri, and E. Takacs, "ETICS: the international software engineering service for the grid," in Journal of Physics: Conference Series, vol. 119. IOP Publishing, 2008, p. 042010.
 [17] A. Pavlo, P. Couvares, R. Gietzel, A. Karp, I. Alderman, M. Livny, and C. Bacon, "The NMI build & test laboratory: continuous integration framework for distributed computing software," in The 20th USENIX Large Installation System Administration Conference (LISA), 2006, pp. 263–273.
 [18] J. Katz, G. Blanpied, K. Borozdin, and C. Morris, "X-radiography of cargo containers," Science and Global Security, vol. 15, no. 1, pp. 49–56, 2007.
 [19] S. Riggi, V. Antonuccio, M. Bandieramonte, U. Becciani, F. Belluomo, M. Belluso, S. Billotta, G. Bonanno, B. Carbone, A. Costa et al., "A large area cosmic ray detector for the inspection of hidden high-Z materials inside containers," in Journal of Physics: Conference Series, vol. 409, no. 1. IOP Publishing, 2013, p. 012046.
 [20] M. Bandieramonte, "Muon tomography: tracks reconstruction and visualization techniques," Nuovo Cimento C - Colloquia and Communications in Physics, to appear.
 [21] D. Sunday, "Distance between lines and segments with their closest point of approach," 2004. [Online]. Available: http://softsurfer.com/Archive/algorithm_0106/algorithm_0106.htm
 [22] R. Hockney and J. Eastwood, Computer Simulation Using Particles. Taylor & Francis, 1992.
 [23] G.-B. Zhao, B. Li, and K. Koyama, "N-body simulations for f(R) gravity using a self-adaptive particle-mesh code," Physical Review D, vol. 83, no. 4, p. 044007, 2011.
 [24] A. Pietrinferni, S. Cassisi, M. Salaris, and F. Castelli, "A large stellar evolution database for population synthesis studies. I. Scaled solar models and isochrones," The Astrophysical Journal, vol. 612, no. 1, p. 168, 2008.