                  Continuous Publishing of Online
              Programming Assignments with INLOOP
                                             Martin Morgenstern and Birgit Demuth
                                                 Technische Universität Dresden
                                        {martin.morgenstern1,birgit.demuth}@tu-dresden.de


Abstract—We supplement our software engineering lectures and exercises with facultative online programming assignments. Based on our experience with previous systems, we developed INLOOP, a solution tailored to our needs. The core of the system is a test runner component that supports virtually any kind of solution assessment through a generic interface. The system supports automated plagiarism checks and grading, which we currently use to award students bonus points for written exams. In addition, it is directly integrated with a version control system to facilitate peer-reviewed development of assignments by our team of instructors and to enable continuous publishing of assignments. The resulting workflow helped to improve the quality of our assignments and tests. The architecture of INLOOP is characterized by asynchronous solution processing and takes advantage of container technology to achieve scalability, robustness and isolation, while maintaining a small footprint. So far, our approach is a success: we observed an increasing number of students who actively practiced programming, and they clearly performed better in the written exam. The poster presents the architecture of INLOOP and the implemented continuous publishing workflow.

I. MOTIVATION AND GOALS

The motivation of the INLOOP approach is to improve teaching in software engineering by taking advantage of interactive technologies. Our goals can be described from four perspectives: Students should be given an easy-to-use environment to practice their software development skills. They should receive immediate, detailed and user-friendly feedback for submitted solutions. Instructors should be given tools to manage complex programming assignments and handle increasing numbers of students, because at such a scale, individual feedback on solutions cannot be manually provided. Operators are primarily concerned about the robustness and scalability of the system. Researchers and Developers want a system that is easy to comprehend and extend.

II. BACKGROUND

Since 2006, we have been supplementing our software engineering lectures and exercises with facultative online programming assignments. The course focuses on object-oriented modeling and programming using UML and Java. Our system uses public JUnit tests to automatically verify solutions submitted by the students; the task complexity ranges from basic to demanding, where students have to implement a design given by a UML diagram. We award students one exam bonus point for each demanding task solved.

III. FEATURES AND DESIGN

In INLOOP, tasks are organized into categories and have optional publication dates and deadlines, which can be used to restrict the user’s ability to view a task or submit solutions for it. The test runner of INLOOP supports virtually any kind of solution assessment, i.e., any build, test or analysis tool that is available for Linux, in a secure and robust manner. Instructors specify the test environment for a task in a declarative fashion, and the system builds a test image that is used by the test runner. By default, the test output is shown to the user as-is, but pretty-printer modules may transform it into HTML fragments that can be integrated into the user interface. Additionally, the system uses an offline version of the JPlag plagiarism checker [2] to automatically flag suspicious solutions. The plagiarism check is executed as a batch job after a task deadline has passed, in order to be able to mutually compare all submissions, including submissions from previous teaching periods.
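To make the declarative test-environment idea more concrete, the following is a minimal Python sketch using the Docker SDK for Python. The TestEnvironment record, its fields, and the image naming scheme are hypothetical illustrations, not INLOOP's actual task format or build code.

```python
from dataclasses import dataclass

import docker  # Docker SDK for Python


@dataclass
class TestEnvironment:
    """Hypothetical declarative description of a task's test environment."""
    task_name: str
    dockerfile_dir: str      # directory with a Dockerfile (JDK, JUnit, analysis tools, ...)
    timeout_seconds: int = 30


def build_test_image(env: TestEnvironment) -> str:
    """Build the Docker image the test runner will use for this task."""
    client = docker.from_env()
    tag = f"inloop-tasks/{env.task_name}:latest"
    client.images.build(path=env.dockerfile_dir, tag=tag, rm=True)
    return tag
```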
A distinctive feature of INLOOP is its direct integration with the Git version control system: all task artifacts (metadata, description, diagrams, tests, etc.) are retrieved from a remote repository, transformed if necessary, and published. For example, such transformations include building the test image, rendering Markdown task texts, and creating ZIP archives of supplementary material. This process is triggered by a Git push directly from the IDE on the instructor’s computer, a feature we refer to as continuous publishing (Fig. 1). Previously, we had to manually edit and publish tasks in the backend of a web application, which was an error-prone process. Our approach has the following advantages: 1) the process is fully automated, 2) all task artifacts are stored in one place, 3) changes can be tracked, and 4) task artifacts can be developed in collaboration. Another key characteristic is that continuous publishing can be combined with continuous integration tests and a branching model such as Gitflow [3]. As an example of the latter, instructors can maintain a set of tasks in a protected “master” branch and a derived set of tasks in a “develop” branch, for the current and the next semester, respectively, and configure INLOOP to use the “master” branch for publishing.1 The “develop” branch could then be used to work on backward-incompatible changes which would potentially break existing submissions.

1 Although INLOOP does not change the status of existing submissions when a new task revision is published, maintaining a stable task set during a semester is advisable to avoid student confusion.
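The publish step that runs after a push can be pictured roughly as follows. This is a simplified sketch under assumptions: the task.md file name, the attachments directory, and the single-task focus are hypothetical and do not describe INLOOP's actual import pipeline, which additionally rebuilds test images and updates the database.

```python
import shutil
import subprocess
from pathlib import Path

import markdown  # third-party package used here to render the task description


def sync_and_render(repo_dir: Path, task: str, branch: str = "master") -> str:
    """Update the local task repository and render one task description to HTML."""
    # Fast-forward the checkout of the configured publishing branch.
    subprocess.run(["git", "-C", str(repo_dir), "fetch", "origin"], check=True)
    subprocess.run(["git", "-C", str(repo_dir), "checkout", branch], check=True)
    subprocess.run(["git", "-C", str(repo_dir), "pull", "--ff-only"], check=True)

    task_dir = repo_dir / task
    # Render the Markdown task text into an HTML fragment for the web UI.
    html = markdown.markdown((task_dir / "task.md").read_text(encoding="utf-8"))

    # Bundle supplementary material (UML diagrams, code skeletons) as a ZIP download.
    shutil.make_archive(str(task_dir / "attachments"), "zip", str(task_dir / "attachments"))
    return html
```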




Fig. 1. Continuous publishing workflow.
Fig. 2. INLOOP architecture.



INLOOP is designed as a multi-process application (Fig. 2). The core components are the web application and the background workers, both implemented in Python. An event-based job broker and a relational database are used to exchange data between the processes. The web application is a standard Django application, but instead of blocking on long-running test runner jobs, it submits them to a job queue. The jobs are picked up by a configurable number of background workers and are executed inside their own, ephemeral Docker container. Apart from the necessary input and output, the containers are completely isolated from each other, the network, and the host system; in addition, strict resource limits are enforced.
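A worker's container-based isolation can be sketched with the Docker SDK for Python as shown below. The concrete limits, the mount path, and the helper name are assumptions for illustration only, not INLOOP's actual configuration.

```python
import docker  # Docker SDK for Python

client = docker.from_env()


def run_check(image_tag: str, solution_dir: str, timeout: int = 30) -> tuple[int, str]:
    """Execute a task's test command for one solution inside a throwaway container."""
    container = client.containers.run(
        image_tag,
        detach=True,               # run in the background so a timeout can be enforced
        network_disabled=True,     # untrusted code gets no network access
        mem_limit="256m",          # hard memory cap
        pids_limit=64,             # guard against fork bombs
        volumes={solution_dir: {"bind": "/solution", "mode": "ro"}},
    )
    try:
        result = container.wait(timeout=timeout)  # raises if the check runs too long
        logs = container.logs().decode(errors="replace")
        return result["StatusCode"], logs
    finally:
        container.remove(force=True)              # the container is ephemeral
```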
IV. EXPERIENCES

Having bonus points as an incentive seems to be a good motivation for our students. Hence, we observed an increasing number of students who practiced their development skills with INLOOP. In the summer term 2017, 11,074 solutions were submitted, up to 419 per day in the week before the written exam. Students who actively practiced programming online clearly performed better in the written exam. In our setting, however, plagiarism checks remain a necessity: out of 1838 valid solutions, 127 (6.9%) were identified as plagiarism, and some of them were complete copies.

With continuous publishing, we eliminated inconsistencies by generating all published content from a single source repository. The streamlined workflow enabled us to spend more time on actual task development instead of time-consuming and error-prone manual publishing. Furthermore, our chosen branching model made it easy to incorporate urgent bug fixes into the current “stable” set of tasks, while at the same time being able to prepare larger source code transformations for the next semester.
V. RELATED WORK

There are many online programming systems available. We are not aware of one that incorporates a continuous publishing workflow based on version control, but the idea has been used in the past in other publishing systems, e.g., the Open Journal Systems [4]. GitHub Pages [5], however, is an example of a website hosting service with a Git-driven workflow and was the model for our approach. INLOOP’s closest relative is Praktomat [6], a system that supports a broad set of programming languages, but uses specific checkers instead of a generic interface. It offers a flexible grading system which our system does not provide, and its first version also featured mutual reviews by students [7]. However, Praktomat does not use asynchronous processing and requires the intervention of the operator to enforce resource limits and isolation. Marmoset [8] is a related system that uses asynchronous processing and build servers on separate (virtual) machines. An interesting feature is its incentive system to motivate students to start work early by giving them a daily budget of only two test runs.

VI. FUTURE WORK

We plan to integrate static code analysis and style checks into INLOOP and ultimately reject code that violates a predefined coding standard. In the same spirit, the current binary solved/not solved test result could also be enhanced to give grades based on the code quality metrics calculated by such tools. To make it more challenging for students to solve tasks by just satisfying static test data, our unit tests could be complemented with property-based testing. Another promising idea, inspired by Web-CAT [9], is to have programming assignments that focus on testing skills: for example, students are given a specification and a software “black box” and are then asked to find as many bugs as possible by writing a test suite.
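To illustrate the property-based testing idea, the following sketch uses the Hypothesis library for Python; the course tests themselves are JUnit-based, and Java libraries such as jqwik offer the same style. The merge_sorted function and the solution module are hypothetical stand-ins for a student submission.

```python
from hypothesis import given, strategies as st

# Hypothetical student-facing function: merge two sorted lists into one sorted list.
from solution import merge_sorted  # assumed module name, for illustration only


@given(st.lists(st.integers()), st.lists(st.integers()))
def test_merge_is_sorted_and_lossless(xs, ys):
    result = merge_sorted(sorted(xs), sorted(ys))
    # Property 1: the merged result is sorted.
    assert result == sorted(result)
    # Property 2: no elements are lost or invented.
    assert sorted(result) == sorted(xs + ys)
```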
was the model for our approach. INLOOP’s closest relative            [9] S. H. Edwards and M. A. Perez-Quinones, “Web-CAT: Automatically
is the Praktomat [6], a system that supports a broad set of              grading programming assignments,” SIGCSE Bull., vol. 40, no. 3, pp.
programming languages, but uses specific checkers instead of             328–328, Jun. 2008.



