Workshop Description

Much of the research in LAK to date has been "student facing": using data to better understand learners and their needs, or to create interventions that directly support or influence learners. This workshop took the perspective of how Learning Analytics can drive improvements in teaching practices, instructional and curricular design, and academic program delivery. While such improvements do influence student outcomes in the long term, the data gathered and the evidence generated are more instructor and administrator facing. We have seen examples of how Learning Analytics can help build the case for instructional, curricular, or programmatic change, and further how it can be used to foster acceptance of change processes by teachers, administrators, and other stakeholders in the educational enterprise. When successful, these kinds of changes are often associated with educational reform or culture shifts in educational practice.

This workshop offered those in the LAK community an opportunity to share and explore how educational data, its analysis and visualization, and the evidence derived from it can change and improve the context of learning. The main research and practice questions addressed in this workshop were:

• How to provide relevant and actionable information to faculty, teaching assistants, and departmental and college administrators to encourage a greater emphasis on student learning and the use of evidence-based practices, thus fostering a continuous-improvement approach to teaching and learning.

• How to create visualization and data collection tools and approaches that encourage a community of instructors and administrators to engage in making evidence-based decisions to improve student learning, whether at the activity, lesson, course, series, department, college, or university-wide level.

• How to extract information from the multiple modalities used in instructional environments and help capture, represent, and evaluate faculty instructional approaches, student-faculty engagement, student-student interactions, and student-technology interactions.

• How to represent, summarize, and mobilize data from human interactions and student-technology interactions to motivate change and quality improvement. Is there a way to make the data and its representations more useful for promoting sustainable change?

• How to change the evaluation of instructional activities to be more formative, actionable, and multi-dimensional, emphasizing individual and group improvement rather than a one-size-fits-all student survey by which instructors or courses are judged and compared. Ideally, such evaluation systems would go well beyond student satisfaction as the sole measure of good teaching and would include learning outcomes, the utility of those outcomes, applicability to future learning, the match or fit between learner and instructor, and more.

• How analytics can help the individual instructor examine the success of a course that they teach. Including the analysis of individual courses keeps the interest of all instructors, not just those concerned with larger questions of course sequences and curricula. Curricula are, of course, built of individual classes, and campus authorities at all levels need tools for examining the success and impact of individual courses.

Nine submissions were received for the workshop. After review by the program committee, seven of those papers were accepted for presentation at the workshop.