Performance Comparison between Unity and D3.js for Cross-Platform Visualization on Mobile Devices

Lorenz Kromer, Markus Wagner, Kerstin Blumenstein, Alexander Rind, Wolfgang Aigner
St. Poelten University of Applied Sciences, Austria

Copyright © by the paper's authors. Copying permitted for private and academic purposes. In: W. Aigner, G. Schmiedl, K. Blumenstein, M. Zeppelzauer (eds.): Proceedings of the 9th Forum Media Technology 2016, St. Pölten, Austria, 24-11-2016, published at http://ceur-ws.org

Abstract

Modern data visualizations are developed as interactive and intuitive graphic applications. In the development process, programmers basically pursue the same goal: creating an application with great performance. Such applications have to display information in the best possible way in every situation. In this paper, we present a performance comparison on mobile devices between D3.js and Unity based on a Baby Name Explorer example. The results of the performance analysis demonstrated that Unity and D3.js are both well-suited tools for information visualization. While Unity convinced with its performance results according to our test criteria, it currently does not provide a visualization library.

1 Introduction & Related Work

Visualization systems provide interactive, visual representations of data [CMS99] designed to help people understand complex phenomena and augment their decision-making capabilities [Mun14]. Given the interconnectedness of the current age and the increasing volumes of collected data, there is a dire need for such support. While many usage scenarios can be identified in scientific research and business management, systems for personal visualization [HTA+15] and casual information visualization [PSM07] serve exceptionally broad audiences. These visualizations focus less on task-driven activities and more on curiosity and enjoyment while exploring personally relevant data. Showing trends of popular baby names, the Name Voyager [Wat05] is a typical example of a casual visualization.

A main challenge faced by the developers of casual visualization systems is the heterogeneity of devices and platforms they should support. In particular for the casual context, mobile phones and tablets are more suitable than classical desktop computers [BWA15a, BNW+16, HTA+15, LAMR14]. Native systems, e.g., apps for Android or iOS, are only runnable on the platform for which the code is compiled. Cross-platform support therefore requires development on top of different software stacks and maintaining separate code bases. One approach to address this challenge is web-based visualization, i.e., using web technology such as D3.js [BOH11] within the browser. However, a widespread concern is that web-based systems lack performance. For example, Baur stated in a 2013 interview [BSB13] that for big visualization systems such as TouchWave [BLC12], going native cannot be avoided because "in the web it looks like a slide show". Besides the negative effects of interactive latency [LH14], performance overheads negatively affect the battery life of mobile devices. Alternative approaches are cross-compilers such as Unity [uni16], which can deploy a single code base to native systems for multiple platforms. Yet, a limitation of Unity is that it does not include a software library for visualizing data [WBR+16]. These two approaches for cross-platform visualization work very differently during both implementation and runtime. The choice will largely depend on the respective application scenario, but empirical data on their performance is needed to inform such a decision.
While some research has been carried out to compare the performance of different web-based visualization technologies [LAB+08, JJK08, KSC12], no studies have been found which compare the performance of web-based and cross-compiled visualization approaches. Neither could we identify performance results obtained from different target platforms.

Thus, the paper at hand contributes a performance comparison between Unity (cross-compiled to native) and D3.js (web-based) on four mobile devices. For this, we created two implementations of a casual visualization system to explore popular baby names as described in Section 2. Section 3 covers the implementation details and test setup. After the test results in Section 4, we conclude our work in Section 5 and outline future work.

2 Visualization Design

As proof of concept, we started with implementing a simple interactive visualization setup using an open data set of the regional government of Upper Austria on the 50 most frequently used male and female baby names from 2004 to 2013. The dataset includes the variables name (nominal), gender (categorical), year (quantitative) and count (quantitative). All these data are merged into a single table provided as a *.csv file. As visualization concepts, we combined a circle packing chart [HBO10] with grouped bar charts [CM84].

Initially, the circle packing chart shows the first letters of the baby names as bubbles, whose diameters correspond to the number of babies per year. A slider is positioned at the bottom of the screen for selecting the year to display. Tapping a bubble expands it, and the names starting with that letter are shown inside the enlarged bubble (see Figure 1). The color of a name bubble encodes the gender (pink := female, blue := male), and its diameter corresponds to the number of babies with that name in the selected year. During the layout phase, the bubbles are placed using physics-based movement such as gravity, and the biggest bubble is placed at the center of the screen.

The circle packing chart is linked with a grouped bar chart. The bar chart initially shows the number of babies for all names grouped per year, split into female and male names (using the same colors as for the bubbles). When selecting a first-letter bubble, the grouped bar chart shows the number of babies for names starting with the selected letter. When selecting a name bubble (e.g., "Leonie"), the grouped bar chart changes to a single bar chart presenting the number for that name per year.

[Figure 1: Screenshots of the Baby Name Explorer interface implemented with (a) Unity and (b) D3.js, showing the circle packing chart (left) and the corresponding grouped bar chart (right), which represents the frequency per year for male (blue) and female (pink) names.]
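The paper does not include source code, so the following is only a minimal sketch of how the first level of such a circle packing chart could be built in D3.js. It assumes the D3 v4 API, a hypothetical file name babynames.csv, and the column names name, gender, year and count from the dataset description above; the physics-based placement, the year slider, and the linked bar chart of the actual Baby Name Explorer are omitted.

```javascript
// Minimal illustrative sketch (not the authors' code): draw one bubble per
// first letter, sized by the number of babies in an assumed selected year.
var width = 800, height = 600;
var svg = d3.select("body").append("svg")
    .attr("width", width)
    .attr("height", height);

d3.csv("babynames.csv", function (error, rows) {
  if (error) throw error;

  var selectedYear = "2013";   // would normally come from the year slider

  // Aggregate the number of babies per first letter for the selected year.
  var totals = {};
  rows.forEach(function (d) {
    if (d.year === selectedYear) {
      var letter = d.name.charAt(0);
      totals[letter] = (totals[letter] || 0) + (+d.count);
    }
  });
  var perLetter = Object.keys(totals).map(function (letter) {
    return { letter: letter, total: totals[letter] };
  });

  // d3.pack() expects a hierarchy: one artificial root, one child per letter.
  var root = d3.hierarchy({ children: perLetter })
      .sum(function (d) { return d.total || 0; });
  d3.pack().size([width, height]).padding(4)(root);

  // One circle per first letter; the radius encodes the number of babies.
  svg.selectAll("circle")
      .data(root.children)
    .enter().append("circle")
      .attr("cx", function (d) { return d.x; })
      .attr("cy", function (d) { return d.y; })
      .attr("r", function (d) { return d.r; });
});
```

Expanding a tapped letter bubble into name bubbles would, in the same spirit, replace the flat per-letter hierarchy with a two-level one (letters containing names) and re-run the pack layout.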
3 Implementation and Test Setup

To introduce the implementation and test setup, we describe the tools used for the implementation (D3.js and Unity), the four test devices and environments, the performance criteria, as well as the measured values and methods.

3.1 Test Devices and Environments

Since we focus on cross-platform visualization, the test devices cover a range from tablets (Nexus 9 and iPad Air) to smartphones (iPhone 6S+ and Galaxy S6 Edge). Both visualization systems are investigated on the devices shown in Table 1. When selecting the mobile test devices, we deliberately chose devices with larger screen sizes, since the presentation of the tested visualization (see Section 2) on a screen size of 5" or smaller is not optimal.

Table 1: Overview of the dimensions of the test devices.

Device         | Type       | Screen size | Screen resolution | Processor                  | RAM  | Graphics processor
Nexus 9        | Tablet     | 8.9"        | 2048 × 1536 px    | NVIDIA Tegra K1            | 2 GB | NVIDIA GeForce ULP
iPad Air LTE   | Tablet     | 9.7"        | 2048 × 1536 px    | Apple A7                   | 1 GB | PowerVR G6430
iPhone 6S+     | Smartphone | 5.5"        | 1920 × 1080 px    | Apple A9                   | 2 GB | PowerVR GT7600
Galaxy S6 Edge | Smartphone | 5.1"        | 2560 × 1440 px    | Samsung Exynos 7 Octa 7420 | 3 GB | Mali-T760 MP8

The visualization is tested under Android 5.1 (Nexus 9 and Galaxy S6 Edge) and iOS 9 (iPad Air and iPhone 6S+). In addition to the requirements of the devices, the test concept of this paper also examines the dependencies of both visualization versions on external components such as libraries and plug-ins used during the development process.

Unity: The Unity development environment makes it possible to build a single project for multiple platforms. The Unity version of the Baby Name Explorer (Figure 1a) is exported in two versions (Android and iOS). The rich development environment of the game engine Unity includes a sufficient repertoire of physics components and 3D elements. Therefore, we did not have to use external libraries.

D3.js: Since the implementation of the visualization in D3.js (Figure 1b) is web browser based, we used the Google Chrome web browser, which is available on all tested devices (see Table 1), as test environment. Thus, the visualization is presented under the same technological conditions on all devices. For the implementation of the web-based version, we did not need additional JavaScript libraries, because D3.js contains all required functionality.

3.2 Measured Values and Methods

To compare a number of software applications, common metrics and measurement points have to be defined [MFB+07]. The methods used are:

• FPS: For measuring the frames per second (FPS) rates, time logging functions are added around the rendering methods in the code, logging the results via log files or the console (a minimal sketch of such instrumentation is given after this section).

• CPU utilization: To show the difference between the hardware components, the CPU utilization was observed while performing both visualizations in a specific scenario and for five minutes in idle mode. It was ensured that no other processes were running on the device.

• Loading time of raw data: Both versions (Unity and D3.js) contain an explicit function to load the raw data. In order to compare the raw data loading from a CSV file, the elapsed time was measured between the explicit function call and its end.

In relation to the technical implementation, Unity and D3.js differ strongly. Because there are no uniform functions available to detect the previously listed system parameters on both systems, we recorded the system parameters and console logs with OS-specific development tools.

With the aforementioned measured values, both visualization systems were tested in a specific user scenario. In this case, the Baby Name Explorer's usage was simulated over 60 seconds by regular interaction with the respective system. To reduce the effects of the operating system and of other processes beyond user control, this user scenario was repeated five times for each visualization system on every tested device.
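The paper describes the FPS measurement only at this conceptual level and does not show the instrumentation code. As an illustration, the browser-side measurement could look roughly like the sketch below, which approximates "time logging around the rendering methods" by counting animation frames with requestAnimationFrame and performance.now(); the function name and the one-second reporting window are choices made here, not taken from the paper.

```javascript
// Minimal illustrative sketch (not the authors' instrumentation): count the
// frames rendered in the browser and log the FPS rate once per second.
function startFpsLogging() {
  var frames = 0;
  var windowStart = performance.now();

  function onFrame(now) {
    frames += 1;
    if (now - windowStart >= 1000) {
      var fps = frames * 1000 / (now - windowStart);
      console.log("FPS: " + fps.toFixed(1));  // could also be written to a log buffer
      frames = 0;
      windowStart = now;
    }
    requestAnimationFrame(onFrame);
  }
  requestAnimationFrame(onFrame);
}

startFpsLogging();
```

In the Unity version, a comparable counter could be maintained in a script's Update() loop; the per-device results of such measurements are summarized in Section 4.2.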
4 Results

The results of the performance comparison of both versions are separated into the three measured parameters presented before. All measured values of the different test devices were collected in an Excel sheet for preprocessing. Using MS Excel, we calculated the median values to reduce the influence of outliers and exported the result for each parameter as a grouped bar chart.

4.1 CPU Usage Analysis

Based on the performed measurements, Unity generates less CPU usage than D3.js. Calculating the median across all measured devices, Unity takes 22% and D3.js takes 38%. Figure 2 compares the CPU usage between the tested devices in idle mode and while performing both versions.

[Figure 2: CPU usage in % in Unity (green) and D3.js (orange) compared to idle mode (blue) [lower is better].]

During the performance analysis it was very interesting to see that the Nexus 9 tablet got noticeably warmer than the other devices. This effect is mirrored in the device's CPU usage. However, no temperature measurements were carried out to investigate this effect. In general, lower CPU usage is a big benefit from the perspective of smart devices, because lower energy consumption results in longer battery time.

4.2 FPS Analysis

The evaluation of the FPS data shows that Unity reaches a median of 57 FPS, while the D3.js version achieves a median of 51 FPS. Unity can be seen as the winner of this criterion of the performance comparison. The detailed median values are shown in Figure 3.

[Figure 3: FPS rate while performing with Unity (green) and D3.js (orange) [higher is better].]

It is very prominent that the FPS rate of the D3.js version was rather low on the Galaxy S6 Edge, even though the CPU usage on this device rose only slightly. In contrast, the Nexus 9 tablet was the only device which reached a higher FPS rate with D3.js.

4.3 Loading Time Analysis

The result of the CSV data loading time measurement shows that D3.js takes a median of 5.17 ms. In contrast, Unity requires significantly more time for the raw data loading, which results in a median of 15.17 ms. Figure 4 shows the gap between both versions.

[Figure 4: CSV loading times in ms while performing in Unity (green) and D3.js (orange) [lower is better].]

The measured time depends on the internal implementation of the loading methods of the two visualization versions, which is the reason for the considerable differences in the execution times of these functions.
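Section 3.2 defines this loading time as the elapsed time between the explicit load call and its end; the corresponding measurement code is not shown in the paper. For the D3.js version it could, for example, be approximated by wrapping the d3.csv call (D3 v4 API and the file name babynames.csv are assumptions) with performance.now(), which then also includes fetching and parsing the file:

```javascript
// Minimal illustrative sketch (not the authors' code): measure the elapsed
// time of the explicit raw-data loading call in the web-based version.
var loadStart = performance.now();

d3.csv("babynames.csv", function (error, rows) {
  if (error) throw error;
  var loadingTime = performance.now() - loadStart;   // fetch + parse, in ms
  console.log("CSV loading time: " + loadingTime.toFixed(2) + " ms");
  // ... continue with building the visualization from `rows`
});
```

On the Unity side, a System.Diagnostics.Stopwatch started before and stopped after the corresponding C# load call could serve the same purpose.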
5 Conclusion

This study compared two different approaches for implementing cross-platform visualizations: cross-compilation to native code and web technology, i.e., usage within a web browser. For this, the Baby Name Explorer, as an example of a realistic casual visualization design, was implemented in both Unity and D3.js. Our experimental comparison on four devices showed that FPS rates were comparable, D3.js was faster in the initial data loading, and Unity resulted in lower CPU utilization.

In terms of developer experience, Unity's IDE supports C# as well as JavaScript for development. The cross-compilation and deployment of the Baby Name Explorer for all tested platforms worked seamlessly. D3.js code is typically developed for a web environment; due to the variety of web browsers, web-based visualizations need to be tested on a wide selection of browsers before being released. During our experiment, both implementations worked well.

Based on our proof of concept, we demonstrated the benefits of using Unity for information visualization and cross-platform compilation in our field of research. In the next steps, we will focus on synchronization for collaboration and semantic zoom [WBR+16] and on showing the ability to use this framework for visualization for the masses, as called for by Blumenstein et al. [BWA+15b], as an easy-to-use system.

Acknowledgements

This work was supported by the Austrian Science Fund (FWF) via the KAVA-Time and VisOnFire projects (no. P25489 and P27975), the Austrian Ministry for Transport, Innovation and Technology (BMVIT) under the ICT of the Future program via the VALiD project (no. 845598) and under the Austrian Security Research Programme KIRAS via the project Courageous Community (no. 850196), as well as the project seekoi (no. 1154) funded by the Internet Foundation Austria (IPA).

References

[BLC12] Dominikus Baur, Bongshin Lee, and Sheelagh Carpendale. TouchWave: Kinetic multi-touch manipulation for hierarchical stacked graphs. In Proc. 2012 ACM Int. Conf. Interactive Tabletops and Surfaces, ITS, pages 255–264. ACM, 2012.

[BNW+16] Kerstin Blumenstein, Christina Niederer, Markus Wagner, Grischa Schmiedl, Alexander Rind, and Wolfgang Aigner. Evaluating information visualization on mobile devices: Gaps and challenges in the empirical evaluation design space. In Proc. 6th Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization, BELIV, pages 125–132. ACM, 2016.

[BOH11] Michael Bostock, Vadim Ogievetsky, and Jeffrey Heer. D3: Data-Driven Documents. IEEE Trans. Vis. and Comp. Graphics, 17(12):2301–2309, December 2011.

[BSB13] Enrico Bertini, Moritz Stefaner, and Dominikus Baur. Visualization on Mobile & Touch Devices. datastori.es podcast, http://datastori.es/data-stories-25-mobile-touch-vis/, 00:41:49 to 00:46:08, July 2013.

[BWA15a] Kerstin Blumenstein, Markus Wagner, and Wolfgang Aigner. Cross-Platform InfoVis Frameworks for Multiple Users, Screens and Devices: Requirements and Challenges. In Workshop on Data Exploration for Interactive Surfaces DEXIS 2015, pages 7–11, 2015.

[BWA+15b] Kerstin Blumenstein, Markus Wagner, Wolfgang Aigner, Rosa von Suess, Harald Prochaska, Julia Püringer, Matthias Zeppelzauer, and Michael Sedlmair. Interactive Data Visualization for Second Screen Applications: State of the Art and Technical Challenges. In Proc. of the Int. Summer School on Visual Computing, pages 35–48. Fraunhofer Verlag, 2015.

[CM84] William S. Cleveland and Robert McGill. Graphical Perception: Theory, Experimentation, and Application to the Development of Graphical Methods. Journal of the American Statistical Association, 79(387):531–554, 1984.

[CMS99] Stuart K. Card, Jock D. Mackinlay, and Ben Shneiderman. Readings in Information Visualization: Using Vision to Think. Morgan Kaufmann, 1999.

[HBO10] Jeffrey Heer, Michael Bostock, and Vadim Ogievetsky. A tour through the visualization zoo. Comm. of the ACM, 53(6):59, 2010.

[HTA+15] Dandan Huang, Melanie Tory, Bon Adriel Aseniero, Lyn Bartram, Scott Bateman, Sheelagh Carpendale, Anthony Tang, and Robert Woodbury. Personal visualization and personal visual analytics. IEEE Trans. Vis. and Comp. Graphics, 21(3):420–433, March 2015.

[JJK08] Donald W. Johnson and T. J. Jankun-Kelly. A scalability study of web-native information visualization. In Proc. Graphics Interface, GI, pages 163–168, Toronto, 2008. Canadian Information Processing Society.

[KSC12] Daniel E. Kee, Liz Salowitz, and Remco Chang. Comparing interactive web-based visualization rendering techniques. In Poster Proc. IEEE Conf. Information Visualization, InfoVis, 2012.
[LAB+08] Tim Lammarsch, Wolfgang Aigner, Alessio Bertone, Silvia Miksch, Thomas Turic, and Johannes Gärtner. A comparison of programming platforms for interactive visualization in web browser based applications. In Proc. 12th Int. Conf. Information Visualisation, iV, pages 194–199, July 2008.

[LAMR14] Tim Lammarsch, Wolfgang Aigner, Silvia Miksch, and Alexander Rind. Showing important facts to a critical audience by means beyond desktop computing. In Yvonne Jansen, Petra Isenberg, Jason Dykes, Sheelagh Carpendale, and Dan Keefe, editors, Death of the Desktop: Workshop co-located with IEEE VIS 2014, 2014.

[LH14] Zhicheng Liu and Jeffrey Heer. The effects of interactive latency on exploratory visual analysis. IEEE Trans. Vis. and Comp. Graphics, 20(12):2122–2131, December 2014.

[MFB+07] J. D. Meier, Carlos Farre, Prashant Bansode, Scott Barber, and Dennis Rea, editors. Performance Testing Guidance for Web Applications: Patterns & Practices. Microsoft, 2007.

[Mun14] Tamara Munzner. Visualization Analysis and Design. A K Peters Ltd, 2014.

[PSM07] Zachary Pousman, John T. Stasko, and Michael Mateas. Casual Information Visualization: Depictions of Data in Everyday Life. IEEE Trans. Vis. and Comp. Graphics, 13(6):1145–1152, 2007.

[uni16] Unity – Game Engine, 2016. https://unity3d.com/.

[Wat05] Martin Wattenberg. Baby names, visualization, and social data analysis. In Proc. IEEE Symp. Information Visualization, INFOVIS, pages 1–7, October 2005.

[WBR+16] Markus Wagner, Kerstin Blumenstein, Alexander Rind, Markus Seidl, Grischa Schmiedl, Tim Lammarsch, and Wolfgang Aigner. Native cross-platform visualization: A proof of concept based on the Unity3D game engine. In Proc. Int. Conf. Information Visualisation, iV, pages 39–44. IEEE Computer Society Press, 2016.